I have an Android phone on order. And no, I don't trust Google as far as I can throw them, but they don't do on-device scanning yet; Apple is leading the way there. But at least that Android phone is truly different (a Z Flip 3), unlike years of "faster" and "better camera." I was already getting tired of that; this just pushed me WAY over the edge.

Same. I'm going to see what the Pixel 6 looks like before making a decision, but the 5a is a REALLY good phone for a really good price. Or possibly a Galaxy S21, which is now heavily discounted. I expect no Note this year, so that should be the end of the major phone announcements (US-available ones, at least).

And since I went prepaid on Mint ($20/month with more data is WAY better than $60 on AT&T), I have to buy any new phone outright. Not that money is a concern, and it's partly a business expense anyway, but why spend double? I'm getting tired of the premium-phone thing personally; after buying so many $1k+ iPhones, do I really need one anymore?

And no, it is not JUST CSAM. But anyone who was on the fence already with Apple, or doesn't care what OS, may be pushed off it now.
 
If they don’t cancel this I’m seriously going to have to look at alternative products, which saddens me.

Which “alternative products”? Android flavours? If you feel strongly about it, why not do something? You could write an email to Cook to voice your concerns and ask your social media circle to do the same. This might actually help.
 
Which “alternative products”? Android flavours? If you feel strongly about it, why not do something? You could write an email to Cook to voice your concerns and ask your social media circle to do the same. This might actually help.

With respect, he will never read it. There are hundreds of thousands, maybe millions, of people vocally opposing this. Prominent people and groups/researchers/foundations are already VERY vocally asking Apple to stop, with some big names attached. Apple still has not expressed any intent to change course.

Some random guy writing a letter isn't going to change Apple's mind here. It's a waste of time, honestly.
 


Respected university researchers are sounding the alarm bells over the technology behind Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, calling the technology "dangerous."


Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, and Anunay Kulshrestha, a researcher at Princeton University's Center for Information Technology Policy, penned an op-ed for The Washington Post outlining their experience building image detection technology.

The researchers started a project two years ago to identify CSAM in end-to-end encrypted online services. The researchers note that given their field, they "know the value of end-to-end encryption, which protects data from third-party access." That, they say, is why they are horrified by CSAM "proliferating on encrypted platforms."

Mayer and Kulshrestha said they wanted to find a middle ground for the situation: build a system that online platforms could use to find CSAM and protect end-to-end encryption. The researchers note that experts in the field doubted the prospect of such a system, but they did manage to build it and in the process noticed a significant problem.
Since Apple's announcement of the feature, the company has been bombarded with concerns that the system behind detecting CSAM could be used to detect other types of photos at the request of oppressive governments. Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments.

Nonetheless, concerns over the future implications of the technology being used for CSAM detection are widespread. Mayer and Kulshrestha said that their concerns over how governments could use the system to detect content other than CSAM had them "disturbed."
Apple has continued to address user concerns over its plans, publishing additional documents and an FAQ page. Apple continues to believe that its CSAM detection system, which will run on a user's device, aligns with its long-standing privacy values.

Article Link: University Researchers Who Built a CSAM Scanning System Urge Apple to Not Use the 'Dangerous' Technology
It is conceivable if China got their hands on the key.
 
Obviously, Apple will be the gatekeeper here. They can allow or deny government interference in this system.

But isn't that always the case? They already get requests for data, to install back doors, etc., and each time Apple has to allow or (hopefully) deny these requests. So I'm not sure CSAM changes anything. They can say yes or no as before.

As a customer, your only option is to trust or distrust the company. I can't imagine Apple is doing a 180 here on their privacy stance. Otherwise they wouldn't have made a big deal of it all these years.
 
From what I've read here: they don't. The problem with their service was that it used an external server to scan for content. That's not the case in Apple's implementation at all. All communication is completely end-to-end encrypted. Malicious users can still send offensive material to whomever they want, and no one except the receiving user will know about it. However, with the new service, if parents choose to enable it, children's accounts will scan received images after decrypting them but before displaying them and present the minor with a content warning. If the kid is below the age of thirteen, the parents can choose to get a warning that improper material was sent to their child. None of this is enabled by default. No external parties are alerted; neither the service (iMessage) nor its provider (Apple) gets a notification at all. So the E2E messaging is still safe, but children get an optional layer of protection from creeps. Also, older minors can avoid unsolicited dick pics without their parents knowing about it (just in case some moronic parents try to blame their kids just for receiving that kind of harassment; sadly, victim blaming is not unheard of).
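Roughly, the flow being described looks something like this (just a conceptual sketch of the reported behaviour, not Apple's actual code; the type and function names are made up):

```swift
import Foundation

// Hypothetical sketch of the opt-in Messages "communication safety" flow
// described above. Everything happens on the child's device.
enum DisplayDecision { case showNormally, blurWithWarning }

struct FamilySettings {
    var communicationSafetyEnabled: Bool   // off by default
    var childIsUnder13: Bool
    var parentWantsNotifications: Bool
}

// Stand-in for the on-device classifier; nothing is sent off the phone.
func looksSexuallyExplicit(_ imageData: Data) -> Bool { false }

// Placeholder: the notification goes to the parent's device, never to Apple.
func notifyParentLocally() { }

func handleReceivedImage(_ imageData: Data, settings: FamilySettings) -> DisplayDecision {
    guard settings.communicationSafetyEnabled else { return .showNormally }
    guard looksSexuallyExplicit(imageData) else { return .showNormally }

    if settings.childIsUnder13 && settings.parentWantsNotifications {
        notifyParentLocally()
    }
    return .blurWithWarning   // the minor sees a warning instead of the image
}
```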
All good and fair, but you are describing one feature that Apple plans to add to iOS 15. The rage is about a completely different feature, which Apple unfortunately presented at the same time, leaving a lot of people confused.

You being a prime example.

Minor users' iMessage protection ≠ automated CSAM scanning of your photos
 
All it will take to frame someone is one malicious app with some encrypted CSAM hidden in it calling UIImageWriteToSavedPhotosAlbum() to put 30 incriminating images into someone's photo library. If they have iCloud Photos enabled the on-device scanning will then pick those images up and report the device's owner to Apple despite that person being innocent.
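For illustration, the call itself is about as trivial as it gets. Any app the user has granted photo-library add access (the NSPhotoLibraryAddUsageDescription key in its Info.plist) can do something like this sketch:

```swift
import UIKit

// Sketch of the scenario above: saving arbitrary images straight into the
// user's Photos library. If iCloud Photos is on, anything saved here would
// later be swept up by the on-device matching.
func saveToPhotoLibrary(_ images: [UIImage]) {
    for image in images {
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    }
}
```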

Worse still, if Apple is declared a monopoly and forced to open up the walled garden, it will be much easier to get apps like this onto devices from cowboy app stores, since there would be none of the checks Apple does with its own App Store.
 
They clearly don't know how the technology works... oh wait
They clearly don't know that Cloudflare has been offering "fuzzy hash" CSAM scanning for all its customers for nearly 2 years, and it seems neither do you: https://blog.cloudflare.com/the-csam-scanning-tool/
All this is doing is moving the hashing from the encrypted network to the device.
I will also point out once again that [almost?] all services already use hashes pre- and post-upload to verify file integrity and report a successful upload.
I don't know the details of their system, but it sounds like Apple's system will perform the scanning locally only when iCloud Photos is enabled, most likely because the "fuzzy hashing" wouldn't work once the file is encrypted and transferred onto their servers.
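For the file-integrity kind of hashing I mean, here's a quick sketch (using SHA-256 via CryptoKit; Apple's CSAM matching uses a perceptual "NeuralHash", not a cryptographic hash like this, so treat it only as an analogy):

```swift
import CryptoKit
import Foundation

// Cryptographic hashing as used for upload integrity checks: the same bytes
// always produce the same digest, and any change yields a different digest.
func sha256Hex(of data: Data) -> String {
    SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

let original = Data("example photo bytes".utf8)
let received = original                        // what the server got back
print(sha256Hex(of: original) == sha256Hex(of: received))  // true -> upload intact
```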
 
"Jonanath Mayer, an assistant professor of computer science and public affairs at Princeton University" - Is that assistant professor or assistant to the professor?
 
Don't like the way Apple is going; seems creepy, feels like they assume you are guilty until proven innocent.

I am now playing with, and learning, my new Google-free Android phone. Lots to learn, plenty of work. Some good points: set your own, different notification sounds etc. in all apps. Some bad: converting Pages documents to something Android can read.

The built-in app store is interesting; it shows all the permissions an app requires and any tracking the app does.

So far I prefer the iPhone's speakers though, better sound, but that's minor. Will try it for a few weeks; if I'm happy, then no more iPhones and iPads for me. Hopefully one day a Galaxy Z Fold 3 will run this OS.
 
All it will take to frame someone is one malicious app with some encrypted CSAM hidden in it calling UIImageWriteToSavedPhotosAlbum() to put 30 incriminating images into someone's photo library. If they have iCloud Photos enabled the on-device scanning will then pick those images up and report the device's owner to Apple despite that person being innocent.

Worse still, if Apple is declared a monopoly and forced to open up the walled garden, it will be much easier to get apps like this onto devices from cowboy app stores, since there would be none of the checks Apple does with its own App Store.
Let's say that will work (I'm not sure of the intricacies or reality of it). All it would take now is for the malicious app to do those exact same things and then trigger some sort of communication to the authorities that you are in possession of CSAM. Said malicious app could also post messages as you on various forums to get you in trouble with authoritarian regimes. The issue would exist whether there was "fuzzy hashing" or not.
 
With respect, he will never read it. There are hundreds of thousands, maybe millions, of people vocally opposing this. Prominent people and groups/researchers/foundations are already VERY vocally asking Apple to stop, with some big names attached. Apple still has not expressed any intent to change course.

Some random guy writing a letter isn't going to change Apple's mind here. It's a waste of time, honestly.
Well, I guess we all could... tcook@apple.com would be my guess. Of course he most likely will not read it. But perhaps when he hears that his inbox has melted, it may give him pause?
 
"A foreign government could, for example, compel a service to out people sharing disfavored political speech. That's no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials."

Without taking a side in this, none of those things in the above warning could be forced on Apple with what Apple says they're doing. Apple has a list of hashes, and they're checking images uploaded to iCloud to see if they match those hashes. WeChat's content matching would require text content analysis, which is totally different. The India example appears to be the same kind of thing (or maybe a requirement for human pre-screening, which is more different still). And the Russia example is where Russia identified posts or pictures and demanded they be removed, which is absolutely not the same thing.

So... this article seems to be people urging Apple not to proceed with its plans, based on warnings that have little to do with what Apple is actually doing.

By their nature, the hashes of two identical files will match. All an adversary needs to do is compute the hashes of all the images they want automatically flagged and submit them to Apple. Sure, Apple can say "no," but that "no" is different from "that's impossible."
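To make that concrete, here's a toy sketch (SHA-256 standing in for Apple's perceptual NeuralHash, and none of the threshold/safety-voucher machinery): the matcher only ever sees digests, so whoever controls the list controls what gets flagged.

```swift
import CryptoKit
import Foundation

func hexDigest(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Whoever supplies this set decides what gets flagged; the matcher has no
// idea what any digest actually depicts.
let flaggedHashes: Set<String> = [
    hexDigest(Data("known CSAM image bytes".utf8)),       // what the list is meant for
    hexDigest(Data("dissident poster image bytes".utf8))  // what could be slipped in
]

func shouldFlag(_ photo: Data) -> Bool {
    flaggedHashes.contains(hexDigest(photo))
}

print(shouldFlag(Data("dissident poster image bytes".utf8)))  // true
```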

I have a feeling this is Apple's attempt to avoid certain governments really coming after them *and* a way to allow them to encrypt iCloud end to end but still look for illegal content. It's a smart system.

In the reality we live in, it will be abused.
 
A lot of folks pooping the bed again. Just for clarity (not that I'm sure any bed poopers will care), this article isn't talking about Apple's implementation itself, just the overall principle of CSAM scanning, and the researchers themselves said that a couple of major things need to be fixed, both of which Apple documents.

Not saying folks should drink the Kool-Aid, but the article appears sensational until you logically break it down.

Also interesting to note that Apple worked with Stanford on this, not their university, and they are not even referenced in the papers.
 