Apple has today announced that it has delayed the rollout of the Child Safety Features it announced last month, following negative feedback.
The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
Apple confirmed that feedback about the plans from customers, non-profit and advocacy groups, researchers, and others prompted the delay, giving the company time to make improvements.
Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees. Apple has since endeavored to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and various new documents, conducting interviews with company executives, and more.
The suite of Child Safety Features was originally set to debut in the United States with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It is now unclear when Apple plans to roll out the "critically important" features, but the company still appears to be intent on releasing them.
Article Link:
Apple Delays Rollout of Controversial Child Safety Features to Make Improvements