In an interview with TechCrunch, Apple's Head of Privacy, Erik Neuenschwander, has responded to some of the concerns users have raised about the company's plans for new child safety features that will scan Messages and iCloud Photos libraries.
When asked why Apple is choosing to implement child safety features that scan only for Child Sexual Abuse Material (CSAM), Neuenschwander explained that Apple has "now got the technology that can balance strong child safety and user privacy," giving the company "a new ability to identify accounts which are starting collections of known CSAM."
Neuenschwander was also asked whether, in retrospect, announcing the Communication Safety features in Messages and the CSAM detection system in iCloud Photos together was the right decision.
He was likewise asked whether Apple is trying to demonstrate to governments and agencies around the world that it is possible to scan for illicit content while preserving user privacy.
The interview also touched on whether Apple has created a framework that could be used by law enforcement to scan for other kinds of content in users' libraries, and whether that undermines Apple's commitment to end-to-end encryption.
Neuenschwander was then asked whether Apple could be compelled to comply with laws outside the United States that would require it to add non-CSAM content to the database and check for it on-device, to which he explained that the service has a "number of protections built-in."
Neuenschwander continued that for users who are "not into this illegal behavior," Apple gains "no additional knowledge about any user's cloud library," and the system "leaves privacy completely undisturbed."
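To make the idea of built-in protections concrete, here is a minimal Swift sketch of one such protection, threshold matching: image hashes are compared against a single known-CSAM hash database, and an account can only be flagged once a threshold number of distinct matches accumulates. This is a deliberately simplified model, not Apple's implementation, which relies on NeuralHash, private set intersection, and threshold secret sharing; every type and function name in the snippet is hypothetical.

```swift
import Foundation

// Illustrative sketch only: a simplified threshold-matching model of the
// kind of protection Neuenschwander describes. None of the types or
// functions here are Apple APIs.

struct PhotoHash: Hashable {
    let value: UInt64  // stand-in for a perceptual hash digest
}

struct MatchingPolicy {
    // One global database of known-CSAM hashes, shipped identically
    // to every device as part of the operating system.
    let knownHashes: Set<PhotoHash>

    // An account is considered for review only after this many distinct
    // matches, so isolated matches reveal nothing about a library.
    let threshold: Int

    func shouldFlagAccount(libraryHashes: [PhotoHash]) -> Bool {
        let distinctMatches = Set(libraryHashes).intersection(knownHashes)
        return distinctMatches.count >= threshold
    }
}

// A library with no matching images never approaches the threshold,
// so under this model nothing is learned about it.
let policy = MatchingPolicy(
    knownHashes: [PhotoHash(value: 0xD1), PhotoHash(value: 0xD2)],
    threshold: 30  // illustrative; Apple has said the initial threshold is around 30 images
)
let ordinaryLibrary = (0..<1_000).map { PhotoHash(value: 1_000_000 + UInt64($0)) }
print(policy.shouldFlagAccount(libraryHashes: ordinaryLibrary))  // false
```

The design point the sketch illustrates is the one Neuenschwander emphasizes: below the threshold, the system yields no signal about any individual user's library.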
See TechCrunch's full interview with Neuenschwander for more information.
Article Link: Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns