Apple today held a question-and-answer session with reporters regarding its new child safety features, and during the briefing, Apple confirmed that it would be open to expanding the features to third-party apps in the future.
As a refresher, Apple unveiled three new child safety features coming to future versions of iOS 15, iPadOS 15, macOS Monterey, and/or watchOS 8.
Apple's New Child Safety Features
First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac can warn children and their parents when they receive or send sexually explicit photos. When the feature is enabled, Apple said, the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, it will be automatically blurred and the child will be warned.
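For readers curious how such a check could be structured, here is a minimal sketch of an on-device "classify, then blur and warn" flow. It is purely illustrative: the classifier, threshold, and function names are assumptions, not Apple's actual Communication Safety implementation, which has not been published in code form.

```python
# Minimal sketch of an on-device "classify, blur, warn" flow. Purely illustrative:
# the classifier, threshold, and names below are assumptions, not Apple's code.

EXPLICIT_THRESHOLD = 0.9  # hypothetical confidence cutoff


def explicit_score(image_bytes: bytes) -> float:
    """Placeholder for an on-device ML classifier returning a score in [0, 1]."""
    return 0.0  # stub so the sketch runs; a real model would inspect the pixels


def handle_incoming_photo(image_bytes: bytes, parental_alerts_enabled: bool) -> dict:
    """Decide locally whether to blur the photo and warn the child.

    The decision is made entirely on the device, matching Apple's description
    of on-device machine learning in Messages; nothing is sent to a server.
    """
    is_explicit = explicit_score(image_bytes) >= EXPLICIT_THRESHOLD
    return {
        "blur": is_explicit,
        "warn_child": is_explicit,
        "notify_parents": is_explicit and parental_alerts_enabled,
    }


print(handle_incoming_photo(b"\x89PNG...", parental_alerts_enabled=True))
```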
Second, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple confirmed today that the process will only apply to photos being uploaded to iCloud Photos and not videos.
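To make "detecting known CSAM images" concrete, here is a simplified sketch of matching uploads against a database of known image hashes. It is an assumption-laden illustration, not Apple's design: Apple's published approach uses a perceptual hash (NeuralHash), blinded on-device matching, and a threshold of matches before any human review, whereas this toy version simply compares plain hashes and counts matches.

```python
# Simplified illustration only. Apple's actual system uses a perceptual hash
# (NeuralHash) and cryptographic techniques so the device cannot read the database
# and Apple cannot see non-matches; this toy version just counts plain-hash matches.

import hashlib
from typing import Iterable, Set

KNOWN_HASHES: Set[str] = set()   # placeholder for the NCMEC-provided hash database
MATCH_THRESHOLD = 30             # hypothetical value; Apple has said a threshold of
                                 # matches must be crossed before any review happens


def image_hash(image_bytes: bytes) -> str:
    """Stand-in hash. A real system needs a *perceptual* hash so that resized or
    re-encoded copies of the same picture still match; SHA-256 would not."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(uploaded_photos: Iterable[bytes]) -> int:
    """Count how many photos headed for cloud upload match a known hash."""
    return sum(1 for photo in uploaded_photos if image_hash(photo) in KNOWN_HASHES)


def should_flag_account(uploaded_photos: Iterable[bytes]) -> bool:
    """Only flag an account once matches cross the threshold, never on a single hit."""
    return count_matches(uploaded_photos) >= MATCH_THRESHOLD
```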
Third, Apple will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.
Expansion to Third-Party Apps
Apple said that while it does not have anything to share today in terms of an announcement, expanding the child safety features to third parties so that users are even more broadly protected would be a desirable goal. Apple did not provide any specific examples, but one possibility could be the Communication Safety feature being made available to apps like Snapchat, Instagram, or WhatsApp so that sexually explicit photos received by a child are blurred.
Another possibility is that Apple's known CSAM detection system could be expanded to third-party apps that upload photos somewhere other than iCloud Photos.
Apple did not provide a timeframe as to when the child safety features could expand to third parties, noting that it still has to complete testing and deployment of the features, and the company also said it would need to ensure that any potential expansion would not undermine the privacy properties or effectiveness of the features.
Broadly speaking, Apple said that expanding features to third parties has been the company's general approach ever since it introduced support for third-party apps with the launch of the App Store on iPhone OS 2 in 2008.
Article Link:
Apple Open to Expanding New Child Safety Features to Third-Party Apps
Pandora's Box?
I guess this is it for privacy on all fronts.
Apple caving in is a bit surprising, but… someone must have “convinced” Tim Apple that this is the way forward. By any means necessary.
I can see a loooot of company admins starting to get questions from the higher-ups, like “Is our data safe?” What about content on a company network that has ANY Apple product connected? It will surely not be “only” iPhones and iPads (the newest ones share the same capabilities and chipsets as the Mac mini, MacBooks, and iMacs).
How about the odd MacBook with Apple Silicon used in company settings? Suddenly it looks like Apple has created a whole new market: Protection against unknown “data exports” with “unknown content” to “unknown receivers” for “unknown reasons”.
How about iPhones used in South America? In Europe? In Asia? In the Middle East? What if a European citizen transfers in New York on the way to Guatemala? Will contents suddenly be hashed and transferred to US servers? You’re legally spied upon by the NSA, the CIA, and a whole host of US agencies. Airside EU citizens are not protected in any way by US laws. US citizens outside US soil are not protected by US laws either, and certainly not from surveillance by foreign powers, democracies or not. The laws that apply are the laws of the land where you stay. Like it or not.
How do you distinguish hashes of images from hashes of documents or books? The data pushed to Apple’s servers will, of course, be encrypted, and no ordinary person will know what is actually transferred when push comes to shove. You’ll have to trust Apple to ONLY do good, and NEVER cave in to ANY government pressure ANYWHERE. All countries in this world have security laws that require any legal entity to support any legal request to the fullest - and to accept that the existence of any such request is not divulged to affected parties.
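On that first question - distinguishing image hashes from document hashes - the usual technical answer is that they are different kinds of hashes entirely: a cryptographic hash fingerprints raw bytes (any file, and it changes completely if a single byte changes), while image matching needs a perceptual hash, built so that visually similar pictures produce similar hashes. A toy illustration (the tiny "average hash" below is a textbook example, not the hash Apple uses):

```python
# A cryptographic hash fingerprints *bytes*: change one byte and the hash changes
# completely, whether the file is an image, a document, or a book. A perceptual
# image hash is built differently: visually similar pictures give similar hashes.
# The "average hash" below is a textbook toy, not Apple's hash.

import hashlib


def crypto_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


def average_hash(pixels: list[list[int]]) -> int:
    """Toy perceptual hash over an 8x8 grayscale grid (values 0-255):
    each bit says whether a pixel is brighter than the grid's average."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Small distance = visually similar; used instead of exact equality."""
    return bin(a ^ b).count("1")


# One-character change in a document: completely different cryptographic hash.
print(crypto_hash(b"The quick brown fox."))
print(crypto_hash(b"The quick brown fox!"))

# Slightly brightened image: nearly identical perceptual hash.
img = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
brighter = [[min(255, p + 3) for p in row] for row in img]
print(hamming_distance(average_hash(img), average_hash(brighter)))  # 0 here: still "the same picture"
```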
If Apple has access to phone content and the ability to transfer information about any content on the device (or any connected content, e.g. content on your local NAS, or information present on the company network you’re connected to), no modern country needs to create new laws to get access to the information. The twist in China was that China demanded that data of Chinese citizens be stored in China, and not in the US (imagine that data from US citizens were to be stored on Chinese servers - that would probably prompt a quick response from US authorities too, don’t you think?).
No new laws will be required anywhere if Apple entertains even the slightest thought of granting permission to execute ANY scans, and Apple has just signalled “willingness” and “ability” where neither was so blatantly present before.
This is the “new privacy” in modern “double talk”. What governments WILL get access to and know will, of course, be available to others. The NSA could not even keep its own spy tools safe; why would you assume that any government agency can keep things secret? It’s not as if US military or security personnel never engage in a highly profitable “private information distribution enterprise”, covering - maybe hitherto - widely unknown - ahem - “government facts” from time to time (I refrain from using the word intelligence where governments are involved - expediency, yes, but intelligence? No!).
What Apple is suggesting looks like (US-controlled!) NSO Group Pegasus on steroids, and it raises the question: why?
Now, whatever REAL reason Apple has for the new approach, what is to prevent other multinationals from getting inspired? And governments everywhere?
Should any Word document you open on a computer automatically be scanned, and a “hash” automatically sent to Microsoft for further processing and possible action? What would be the basis for the hash? A set of words or phrases? Like “democracy”, “freedom”, or even “Winnie the Pooh”?
Will - for instance - US citizens traveling abroad be subject to “foreign scanning rules” when traveling to, let’s say, Mexico? Using their iPhones in Cancun, Oaxaca, San Miguel de Allende, or Mexico City? How about a European departing from Helsinki, Finland, passing through Hong Kong in transit to Brisbane, Australia? Many quite ordinary books, documents, songs, and videos pose no problem whatsoever in Europe, the US, Canada, etc., but they become “illegal and punishable content” simply by existing on your phone in Hong Kong - even “airside”.
Who is to say that Belarus wouldn’t get “hashes” of “items of interest” on its citizens? That country recently hijacked a Ryanair flight between two European capitals and forced it to land with threats from fighter aircraft because a dissident was spotted on the passenger list! Lukashenko is a ruffian with nearly three decades in power, and few regimes would behave with such obvious stupidity, but that does not prevent any regime from seeking access to your physical person so that you can help the authorities with their enquiries - based on a hash of a word or phrase in ANY language.
Just for fun, two completely innocent examples: in Denmark you can see signs in elevators containing the words “I FART” (which light up when the elevator is moving, meaning “AT SPEED”) and “GODSELEVATOR” (which just means “GOODS ELEVATOR”). Now imagine what religious police in some countries could interpret the latter term to represent. Blasphemy? In some countries that is punishable by death - also for minors. After a looong period of “interrogation”, if you’re extremely lucky, you might be released back into “freedom” without needing emergency hospital treatment.
Do you really think that the world will let Tim Apple control what is scanned, hashed, and transferred? And would there be any guarantee that ONLY hashes were transferred, and not complete sentences or paragraphs containing a “suspect phrase”? Or a hash indicating an image of Martin Luther King on your phone - harmless today, but perhaps not under a future Trump-leaning - or worse - US administration? The cloud never forgets.
All of this just to illustrate that if you open Pandora’s box, you’d better be absolutely certain about what you’re starting. There’s no way back. Only forward. Especially when the “power hungry” smell “opportunities” to influence future limits on the freedom of “the unwashed masses”.
Until now, you had the possibility - or at least an illusion - of privacy. Encryption could go a long way, but most countries in this world have no problem breaking any encryption if you’re in their hands. They just use the good ol’ universal decryption key (you know, the “rubber hose” approach, or the CIA’s “waterboarding” method deemed legal by a former US administration).
What Tim Apple has demonstrated is that the time of privacy has passed. It will no longer exist for any of us. Governments will see to that, and you can do nothing to prevent it. Pandora’s box has been opened, and there is no way back. Not for Tim Apple. Not for Apple. Not for any of us.
Life will become rough in a lot of places on this planet in the foreseeable future.