Apple to Scan Photos for Child Abuse Images, Raising Privacy Debate



Apple’s new plan to combat CSAM

Apple announced that it will begin scanning users' photos stored in iCloud, using new technology such as NeuralMatch, in a bid to fight CSAM. The company said it plans to add the scanning software to its iPhones, iPads, Mac computers and Apple Watches when the new iOS 15, iPadOS 15, macOS Monterey and watchOS 8 operating systems launch later this year. Information published on its website says the program is designed to "limit the spread of child sexual abuse material" and is part of a new collaboration between the company and child safety experts.

It is important to note that the feature will roll out with the iOS update this fall, and that for the time being only users in the United States will be affected by the new policy.

How will this new feature work?

The Financial Times identified the feature as NeuralMatch. The system relies on a database of roughly 200,000 images from the National Center for Missing & Exploited Children, each with a unique identifier called a hash. These hashes are calculated so that every known image carries its own distinct fingerprint. NeuralMatch proactively alerts a team of human reviewers if it believes illegal imagery has been detected during a scan; once a reviewer confirms the match, Apple would contact law enforcement if the material can be verified.
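
To make the matching idea concrete, here is a minimal, purely illustrative Python sketch. It is not Apple's NeuralHash implementation; it only shows the general pattern of fingerprinting an image on the device and checking the result against a database of fingerprints of known material. The database, function names and use of SHA-256 are assumptions for illustration.

import hashlib

# Hypothetical database of fingerprints of known images; in the real system
# these would be perceptual hashes supplied via NCMEC, not values computed here.
KNOWN_IMAGE_HASHES: set = set()

def image_fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real perceptual hash (unlike SHA-256)
    # still matches after the same image is resized or recompressed.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    # The device never needs to "see" what the known images look like;
    # it only compares fingerprints against the database.
    return image_fingerprint(image_bytes) in KNOWN_IMAGE_HASHES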

The Financial Times said

"According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a 'safety voucher', saying whether it is suspect or not," the paper reported. "Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities."
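
The "safety voucher" description can be pictured as a threshold mechanism: individual vouchers reveal nothing on their own, and review only becomes possible once enough of them mark photos as suspect. The sketch below models that flow with a plain counter; in the reported design the "unlocking" is cryptographic rather than a simple count, and the threshold value used here is an assumption for illustration.

from dataclasses import dataclass, field

MATCH_THRESHOLD = 30  # assumed value for illustration; the article does not state Apple's threshold

@dataclass
class Account:
    # Each entry models a "safety voucher": True means the photo matched the database.
    vouchers: list = field(default_factory=list)

    def upload_photo(self, matched_known_image: bool) -> None:
        self.vouchers.append(matched_known_image)

    def review_unlocked(self) -> bool:
        # Nothing is surfaced for human review until enough photos are marked suspect.
        return sum(self.vouchers) >= MATCH_THRESHOLD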

Apple's expanded protection for children is a game changer

"Apple's expanded protection for children is a game changer," said John Clark, the president and chief executive of the NCMEC. "With so many people using Apple products, these new safety measures have lifesaving potential for children."

Other supporters include Hany Farid, the inventor of PhotoDNA. While admitting the system could be subject to abuse, he is not too concerned, noting that other programs designed to secure devices against various threats haven't experienced "this type of mission creep."

New plan worries privacy advocates – could such a system be tricked?

Similar technology is already in use elsewhere: tech giants such as Google, Facebook and Twitter have worked with the NCMEC and other organisations to eradicate child sexual abuse content on their platforms, and companies like Microsoft apply comparable technology to identify such images in search results and emails.

The difference here is that Apple plans to extend these searches to devices directly: photos will be scanned on a user's device before being uploaded to iCloud.

Some digital rights groups are not impressed with the new plan. The Electronic Frontier Foundation, a pioneer in the field of civil liberties online, stated that Apple's compromise on privacy was "a shocking about-face for users who have relied on the company's leadership in privacy and security."

Cryptography expert Matthew Green warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.

Others believe that the image matching tool, which doesn't even "see" the images, only the mathematical "fingerprints" that represent them, could be used for sinister ends.

Apple’s commitment to privacy

Following the outcry on social media and the concerns raised by groups such as the EFF, Apple has been careful to reiterate its commitment to user privacy; it wants to continue to be seen as the leader in protecting user data.

Apple released a six-page FAQ document titled "Expanded Protections for Children" in a bid to address those concerns and provide more clarity and transparency about the process.

Among other things, the document says the new photo-scanning feature is designed to keep CSAM off iCloud Photos "without providing information to Apple about any photos other than those that match known CSAM images."

Apple says that the feature will not have any “impact to any other on-device data” and that it “does not apply to Messages.” It also stresses that it will refuse any demands from governments looking to expand the feature to include non-CSAM images.

As for accuracy, Apple says that "the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year," with the company conducting a "human review" before sending any report to the National Center for Missing and Exploited Children. Apple concludes that "system errors or attacks will not result in innocent people being reported to NCMEC."
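
The one-in-one-trillion figure rests on requiring multiple independent matches before an account is ever flagged. The rough arithmetic below, using assumed numbers rather than Apple's published figures, shows how quickly the account-level error rate shrinks as the required number of matches grows.

# Back-of-the-envelope arithmetic with assumed, illustrative numbers:
# if each match were wrong with probability 1 in 1,000 and matches were
# independent, requiring several matches compounds the error rate downward.
per_image_false_match = 1e-3   # assumed per-image error rate, for illustration only
required_matches = 10          # assumed threshold, for illustration only

account_false_flag = per_image_false_match ** required_matches
print(f"Chance of {required_matches} false matches in a row: {account_false_flag:.0e}")
# Prints 1e-30, far below the quoted one-in-one-trillion (1e-12) figure.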

"Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups," the company writes. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

“At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe,” Apple said in a statement. “We want to protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material.”

Research/Sources:

Apple, Expanded_Protections_for_Children_Frequently_Asked_Questions.pdf

https://www.theguardian.com/technology/2021/aug/06/apple-plans-to-scan-us-iphones-for-child-sexual-abuse-images

https://www.cnet.com/tech/services-and-software/apple-iphones-photos-and-child-safety-whats-happening-and-should-you-be-concerned/
