It's about bashing Apple. No one yells at Google for Gmail scanning for child sex abuse imagery. But Apple is for some reason different.
Reports like this are why Apple is not different.

Two iPhone repair technicians in Sacramento uploaded “10 photos of her in various stages of undress and a sex video” to her Facebook account, resulting in “severe emotional distress” for the young woman, according to the Telegraph’s review of legal records.

 
Not being funny, but I've never visited such a paranoid forum as this. It wouldn't be so bad if people got the basic facts right, but to continue to spread misinformation and believe that Apple employees will be "snooping" on photos - even though it's been said multiple times that the actual process is automated by AI, looking for matches with a database - is ridiculous.

Makes me wonder how they react when they see a speed camera on the highway...
 
Only photos saved on iCloud are scanned... ah... because nobody does that, right? :)
What do they want to achieve? Child molesters have been warned now. They will get rid of those pictures. And what will they do instead? Keep molesting children, because as long as the photos they take are not in the database, they're safe.
 
Not being funny, but I've never visited such a paranoid forum as this. It wouldn't be so bad if people got the basic facts right, but to continue to spread misinformation and believe that Apple employees will be "snooping" on photos - even though it's been said multiple times that the actual process is automated by AI, looking for matches with a database - is ridiculous.
It’s been said. Not proven!

Huge difference.
 
It doesn't matter how they dress it up, this oversteps the mark. There is no difference between this and a keystroke logger built into a keyboard looking for specific words and phrases. This is fundamentally identical to a wiretap and needs to be treated in the same way, with explicit, specific permission sought prior to each and every use of the technology, not mass surveillance just in case there might be criminals.

Innocent until proven guilty is still alive and well in some countries, and it's up to all of us to ensure our home is on that list.

Also worth bearing in mind that processing personal data without my explicit consent for the usage is illegal under GDPR and similar laws. This is not a law enforcement agency doing the scanning and so is not exempt.
 
Of course not. As the old joke goes, “won’t someone think of the children?”

The question is, now that Apple has created the capability to match against a hash database, are we to expect that capability will not be expanded? At this point we only have a promise from Apple that it won't. Well, it would have been better for Apple not to put itself in a position where it had to make that promise in the first place.
Well, the next step is to protect the world from terrorists. Nobody is against fighting terrorism, right? So they will generate hashes of war weapons. However, to avoid false alarms (as firearms are legal in the U.S.), they have to refine it and generate hashes for the faces of known terrorists. 100% safe, believe me!

Next up: drugs! This time they need regional information, so they need the user's global position, requiring constant tracking. But the data is encrypted and anonymized, rest assured!

And then Apple takes care of
  • indecent lingerie and amoral sex practices (applying U.S. standards to the whole world, as the differences would be too big to properly consider in code - after all, Apple has experience with oversimplification)
  • unhealthy ways of living (Apple Watch data could be considered here as well, so there won't be false alarms and insurance companies only raise the fees for people not living healthily enough)
  • people consuming protected animals and plants
  • environmental pollution (you were tracked near the site at the same time someone dumped a huge amount of chemicals into nature - what's your alibi?)
  • people reading forbidden books (with a regionalized approach to properly consider the needs of local dict... errrr, governments)
What a wonderful mix of "1984" and "Brave New World"...
 
It doesn't matter how they dress it up, this oversteps the mark. There is no difference between this and a keystroke logger built into a keyboard looking for specific words and phrases. This is fundamentally identical to a wiretap and needs to be treated in the same way, with explicit, specific permission sought prior to each and every use of the technology, not mass surveillance just in case there might be criminals.

Innocent until proven guilty is still alive and well in some countries, and it's up to all of us to ensure our home is on that list.

Also worth bearing in mind that processing personal data without my explicit consent for the usage is illegal under GDPR and similar laws. This is not a law enforcement agency doing the scanning and so is not exempt.

And yet people trust third-party anti-virus/anti-malware tools and firewalls to do the right thing (you give them permission to access your files and disks). Similarly for your ISP, which may be doing a variety of things (deep packet inspection, traffic shaping, monitoring) without you ever knowing about it.
 
Exactly.

Everybody is hating on Apple when just about everybody else has been doing it too - and possibly for longer. Apple is an easy target, apparently.
Apple is known for being all about securing users' privacy. So tell me, what happened to this campaign? The two no longer align in 2021.

[attached image]
 
EXSUM of their "FAQ": 0.0% new information provided - guess the voice of the "shrieking minority" was deemed insignificant... which we already knew was Apple's stance 😂
 
Well, the next step is to protect the world from terrorists. Nobody is against fighting terrorism, right? So they will generate hashes of war weapons. However, to avoid false alarms (as firearms are legal in the U.S.), they have to refine it and generate hashes for the faces of known terrorists. 100% safe, believe me!

Next up: drugs! This time they need regional information, so they need the user's global position, requiring constant tracking. But the data is encrypted and anonymized, rest assured!

And then Apple takes care of
  • indecent lingerie and amoral sex practices (applying U.S. standards to the whole world, as the differences would be too big to properly consider in code - after all, Apple has experience with oversimplification)
  • unhealthy ways of living (Apple Watch data could be considered here as well, so there won't be false alarms and insurance companies only raise the fees for people not living healthily enough)
  • people consuming protected animals and plants
  • environmental pollution (you were tracked near the site at the same time someone dumped a huge amount of chemicals into nature - what's your alibi?)
  • people reading forbidden books (with a regionalized approach to properly consider the needs of local dict... errrr, governments)
What a wonderful mix of "1984" and "Brave New World"...

What a load of nonsense.

You do realise that this technology is being deployed in the way it is because the actual illegal material is the images themselves?
 
Google and others don't sell their products with a promise of privacy. That's the problem here. Apple does.

The image matching based on hashes is done on-device, but it only happens if you have iCloud Photo Library enabled. So if that bothers you for whatever reason, disable the feature.

Many other cloud storage services are already doing that, in a much less privacy-preserving way.
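
For anyone who wants to picture what "on-device hash matching, gated on an iCloud setting" means in the abstract, here is a minimal Swift sketch. Apple has not published NeuralHash as an API, so every type and function name below is invented for illustration, and real perceptual matching tolerates near-duplicate images rather than requiring exact equality.

```swift
import Foundation

// Illustrative sketch only - none of these names are real Apple APIs.

/// Stand-in for a perceptual hash such as NeuralHash's output.
struct PerceptualHash: Hashable {
    let bytes: [UInt8]
}

/// Stand-in for the known-CSAM hash database shipped to the device.
struct HashDatabase {
    private let known: Set<PerceptualHash>
    init(known: Set<PerceptualHash>) { self.known = known }
    func contains(_ hash: PerceptualHash) -> Bool { known.contains(hash) }
}

/// The gating logic described above: matching only runs at all
/// when the user has iCloud Photo Library enabled.
func shouldFlag(_ imageHash: PerceptualHash,
                against database: HashDatabase,
                iCloudPhotosEnabled: Bool) -> Bool {
    guard iCloudPhotosEnabled else { return false } // feature off: no scan
    return database.contains(imageHash)             // on: match against the list
}
```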
 
Apple is known for being all about securing users' privacy. So tell me, what happened to this campaign? The two no longer align in 2021.

[attached image]

That is literally true in this case. If your iPhone (not a server) finds a photo that matches CSAM, then that photo stays on your phone and doesn't get sent to the cloud.
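
Taken at face value, the flow this poster describes is just a filter applied before upload. Here is a toy Swift sketch of that claim, with invented names throughout; note that Apple's published design actually differs, in that every photo is uploaded along with an encrypted "safety voucher", and matches only become readable server-side past a threshold.

```swift
import Foundation

// Toy model of the behaviour described above - not Apple's published design.

struct Photo {
    let filename: String
    let hash: String   // stand-in for a perceptual hash such as NeuralHash
}

/// Returns only the photos whose hashes have no match in the database;
/// under this (simplified) model, matched photos simply stay local.
func photosEligibleForUpload(_ library: [Photo],
                             knownHashes: Set<String>) -> [Photo] {
    library.filter { !knownHashes.contains($0.hash) }
}

// Example: with "bbb" in the database, only IMG_1.jpg would be uploaded.
let library = [Photo(filename: "IMG_1.jpg", hash: "aaa"),
               Photo(filename: "IMG_2.jpg", hash: "bbb")]
let toUpload = photosEligibleForUpload(library, knownHashes: ["bbb"])
print(toUpload.map(\.filename))   // ["IMG_1.jpg"]
```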
 
I find it really scary how many people on this forum defend Apple over this technology. I fear the day when governments dictate to Apple which images to search for, to make it easier to track down certain people (like pro-democracy activists in China, or gay people entering one of Poland's "LGBT-free zones").

Yes, other companies have been doing this in one way or another for years, but until recently Apple stood as a symbol of privacy for its customers - something I was willing to pay a premium for. Over the weekend I migrated all my data from iCloud to my private server and will never use iCloud again.

The trust I once had in Apple has been irrevocably destroyed.
 