Apple details limits on new system that scans for child sexual abuse material

Apple on Friday announced limits to a new system it plans to use to scan for child sexual abuse material, after facing backlash against the system over privacy concerns.

In a 14-page document, the company addresses concerns raised about its child sexual abuse material (CSAM) detection system.

Apple announced at the beginning of August that its system would match photos against known images provided by the National Center for Missing & Exploited Children (NCMEC) to detect CSAM.

“Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations,” the statement of the new policy said. “Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”

Apple said Friday it would take about 30 matching images for the system to flag an account, a threshold the company says ensures that “the possibility of any given account being flagged incorrectly is lower than one in one trillion.”
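
The matching and threshold logic can be illustrated with a short, hypothetical sketch. Apple’s actual system relies on NeuralHash perceptual hashing, a blinded on-device database, and cryptographic safety vouchers, so the plain set lookup below is only a simplified illustration of the threshold idea; all type and property names are invented for this example.

```swift
// Hypothetical sketch of on-device matching with a flagging threshold.
// Apple's real system uses NeuralHash perceptual hashes, a blinded
// (unreadable) hash database, and cryptographic safety vouchers; the
// plain Set lookup here only illustrates the threshold idea.

struct CSAMMatcher {
    /// Hashes of known CSAM images, shipped to the device in unreadable
    /// form in the real system; modeled here as an ordinary Set.
    let knownHashes: Set<String>

    /// Roughly 30 matches are required before an account is flagged.
    let threshold = 30

    /// Counts how many of the user's photo hashes match the database and
    /// reports whether the account crosses the flagging threshold.
    func shouldFlag(photoHashes: [String]) -> Bool {
        let matchCount = photoHashes.filter { knownHashes.contains($0) }.count
        return matchCount >= threshold
    }
}
```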

“If and only if you meet a threshold of something on the order of 30 known child pornographic images matching, only then does Apple know anything about your account and know anything about those images, and at that point, only knows about those images, not about any of your other images,” Apple SVP Craig Federighi told The Wall Street Journal in an interview Friday.

“Once Apple’s iCloud Photos servers decrypt a set of positive match vouchers for an account that exceeded the match threshold, the visual derivatives of the positively matching images are referred for review by Apple,” the document states.
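
As a rough illustration of that server-side gate, the hypothetical sketch below only releases vouchers for review once an account’s match count reaches the threshold; in the real design the gating is enforced cryptographically through threshold secret sharing rather than a simple counter, and all names here are invented.

```swift
import Foundation  // for Data

// Hypothetical sketch of the server-side gate described above: vouchers
// accumulate per account, and nothing is referred for review until the
// match threshold is exceeded. The real design enforces this with
// threshold secret sharing rather than a simple counter.

struct SafetyVoucher {
    /// Stand-in for the encrypted visual derivative carried by a voucher.
    let encryptedVisualDerivative: Data
}

struct AccountMatchState {
    let matchThreshold = 30
    var vouchers: [SafetyVoucher] = []

    /// Records a positive-match voucher. Returns the vouchers eligible for
    /// human review once the threshold is met, or nil while below it.
    mutating func record(_ voucher: SafetyVoucher) -> [SafetyVoucher]? {
        vouchers.append(voucher)
        guard vouchers.count >= matchThreshold else { return nil }
        return vouchers
    }
}
```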

Apple also says the CSAM hash database will only include images that have been flagged by at least two child safety organizations operating in separate sovereign jurisdictions.

“Any perceptual hashes appearing in only one participating child safety organization’s database, or only in databases from multiple agencies in a single sovereign jurisdiction, are discarded by this process, and not included in the encrypted CSAM database that Apple includes in the operating system,” according to the company.
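
That construction rule is essentially a set intersection across jurisdictions. The hedged sketch below keeps only hashes submitted from at least two different sovereign jurisdictions and discards the rest; the types and function names are illustrative, not Apple’s.

```swift
// Hedged sketch of the database-construction rule quoted above: a hash
// survives only if it was submitted by child safety organizations in at
// least two different sovereign jurisdictions.

struct HashSubmission {
    let hash: String
    let organization: String
    let jurisdiction: String
}

/// Builds the shared hash set, discarding any hash that appears only
/// within a single jurisdiction (and therefore any hash submitted by
/// only one organization).
func buildSharedDatabase(from submissions: [HashSubmission]) -> Set<String> {
    var jurisdictionsPerHash: [String: Set<String>] = [:]
    for submission in submissions {
        jurisdictionsPerHash[submission.hash, default: []]
            .insert(submission.jurisdiction)
    }
    return Set(jurisdictionsPerHash.filter { $0.value.count >= 2 }.keys)
}
```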

Friday’s announcement follows an open letter, signed by thousands, arguing that the system raises privacy and surveillance concerns that need to be addressed.

“While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products,” the letter said.
