London Daily

Focus on the big picture.
Monday, Jul 21, 2025

Apple to scan users' iPhones for images of child sexual abuse


Child protection groups have applauded the announcement, but some security researchers are concerned the system could be misused, arguing that users have no way of knowing what other scanning Apple performs on their devices or with how many third parties their chats, calls, contacts and location data might be shared.

Apple unveiled plans to scan US iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool, called "neuralMatch," is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud.

If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.

The detection system will only flag images that are already in the centre's database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry.

But researchers say the matching tool — which doesn’t "see" such images, just mathematical "fingerprints" that represent them — could be put to more nefarious purposes.
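In broad terms, such fingerprint matching amounts to a lookup against a database of known hashes. The short Python sketch below is purely illustrative and is not Apple's neuralMatch: it uses an ordinary SHA-256 digest and a made-up list of fingerprints, whereas tools like PhotoDNA and neuralMatch rely on perceptual hashes designed so that resized or re-compressed copies of a known image still match.

# Illustrative sketch only, not Apple's system: compares an image's
# "fingerprint" against a hypothetical database of known hashes.
# A real system would use a perceptual hash rather than SHA-256,
# so that slightly altered copies of a known image still match.
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known abuse images.
KNOWN_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_path: Path) -> str:
    """Return a hex digest standing in for the image's mathematical fingerprint."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def should_flag(image_path: Path) -> bool:
    """Flag for human review only if the fingerprint is already in the database."""
    return fingerprint(image_path) in KNOWN_FINGERPRINTS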

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement.

"Researchers have been able to do this pretty easily," he said of the ability to trick such systems.

Potential for abuse


Other abuses could include government surveillance of dissidents or protesters. "What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,'" Green asked.

"Does Apple say no? I hope they say no, but their technology won’t say no".

Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those to scan user files stored in its iCloud service, which is not as securely encrypted as its on-device data, for child pornography.

Apple has been under government pressure for years to allow for increased surveillance of encrypted data.

Coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.

But a dejected Electronic Frontier Foundation, the online civil liberties pioneer, called Apple's compromise on privacy protections "a shocking about-face for users who have relied on the company’s leadership in privacy and security".

Meanwhile, the computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple's system but said it was far outweighed by the imperative of battling child sexual abuse.

"Is it possible? Of course. But is it something that I’m concerned about? No," said Hany Farid, a researcher at the University of California at Berkeley, who argues that plenty of other programme designed to secure devices from various threats haven't seen "this type of mission creep".

For example, WhatsApp provides users with end-to-end encryption to protect their privacy, but also employs a system for detecting malware and warning users not to click on harmful links.

'Gamechanger'


Apple was one of the first major companies to embrace "end-to-end encryption", in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressured the company for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.

Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.

"Apple’s expanded protection for children is a gamechanger," John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement. "With so many people using Apple products, these new safety measures have lifesaving potential for children".

Julia Cordua, the CEO of Thorn, said that Apple's technology balances “the need for privacy with digital safety for children." Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.

Breaking security


But in a blistering critique, the Washington-based nonprofit Center for Democracy and Technology called on Apple to abandon the changes, which it said effectively destroy the company’s guarantee of "end-to-end encryption".

Scanning of messages for sexually explicit content on phones or computers effectively breaks the security, it said.

The organisation also questioned Apple’s technology for differentiating between dangerous content and something as tame as art or a meme. Such technologies are notoriously error-prone, CDT said in an emailed statement.

Apple denies that the changes amount to a backdoor that degrades its encryption. It says they are carefully considered innovations that do not disturb user privacy but rather strongly protect it.

Separately, Apple said its messaging app will use on-device machine learning to identify and blur sexually explicit photos on children’s phones and can also warn the parents of younger children via text message. It also said that its software would “intervene” when users try to search for topics related to child sexual abuse.

In order to receive warnings about sexually explicit images on their children's devices, parents will have to enroll their child’s phone. Kids over 13 can unenroll, meaning parents of teenagers won’t get notifications.

Apple said neither feature would compromise the security of private communications or notify police.
