Apple employees are now joining the chorus of individuals raising concerns over Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, reportedly speaking out internally about how the technology could be used to scan users' photos for other types of content, according to a report from Reuters.
According to Reuters, an unspecified number of Apple employees have taken to internal Slack channels to raise concerns over CSAM detection. Specifically, employees are concerned that governments could force Apple to use the technology for censorship by scanning for content other than CSAM. Some employees are also worried that Apple is damaging its industry-leading reputation for privacy.
Apple employees in roles pertaining to user security are not thought to have been part of the internal protest, according to the report.
Ever since its announcement last week, Apple has been bombarded with criticism over its CSAM detection plans, which are still expected to roll out with iOS 15 and iPadOS 15 this fall. Concerns mainly revolve around how the technology could present a slippery slope, with future implementations expanded by oppressive governments and regimes.
Apple has firmly pushed back against the idea that the on-device technology used for detecting CSAM could be used for any other purpose. In a published FAQ document, the company says it will vehemently refuse any such demand by governments.
An open letter criticizing Apple and calling on the company to immediately halt its plan to deploy CSAM detection has gained more than 7,000 signatures at the time of writing. The head of WhatsApp has also weighed in on the debate.