People are so confused about this.

Apple already scan your photos for
- trees
- dogs
- cats
- ice cream cones
- grass
- sky
- the Moon
- etc.

This is just one more scan, and it’s even less invasive because it’s not based on actual AI analysis of your photos but on comparing hashes of them to hashes of known, certified CSAM. Apple’s not looking at your pics, just smelling them next to a turd and applying a super-precise 1-error-in-a-trillion “turd or not” label. It’s not a backdoor, or else looking for dogs and trees was a backdoor by this logic too.
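The hash comparison in question can be sketched like this. This is a toy illustration under my own assumptions, not Apple’s code: the real system uses a perceptual “NeuralHash” (so visually similar images also match) plus private set intersection, whereas a plain cryptographic hash is used here only to show the idea that the check is a lookup against known fingerprints, not an analysis of what the photo depicts.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Toy stand-in for a perceptual hash; SHA-256 matches exact bytes only.
    return hashlib.sha256(image_bytes).hexdigest()

# Pretend database of fingerprints of known, certified images (placeholder bytes).
known_database = {fingerprint(b"<bytes of a known flagged image>")}

def matches_database(image_bytes: bytes) -> bool:
    # Exact lookup: no classifier, no "AI analysis" of the photo's content.
    return fingerprint(image_bytes) in known_database

print(matches_database(b"<bytes of your dog photo>"))         # False
print(matches_database(b"<bytes of a known flagged image>"))  # True
```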

The difference is that this could get you suspended from iCloud and in some cases (is Apple supposed not to report a big pedo “whale”? Really?) reported to the authorities, but only after multiple offences.

This is like mask mandates and vaccines: a collective sacrifice to get this crap off the internet. Of course there’s no silver bullet, and companies experiment with ways to approach the problem.

Either you're playing coy or you're being willfully ignorant. We've already been through this at least twice in the past two decades. How many times has Silicon Valley worked with or for the Five Eyes to expand the Surveillance State? Edward Snowden has warned, and continues to warn us, about the depth and breadth of their efforts.

Today, it's "we're doing it to fight pedos." Tomorrow it can and will be used against a nation's citizens. Don't think what happened in WWII won't happen again in The Woke States of America.
 
I'm sure the Chinese communist government will soon require Apple to scan iCloud servers (based in China) for any photos that contain unapproved images such as Winnie the Pooh or any other non-flattering, anti-communist stuff.

Do you have any insider information, or are you just BSing because of your bias?

Who's to say the US government, or any government, won't abuse this?
 
How many times do we have to go down this road? If another company does it, nobody cares. If Apple does it, the internet is on fire. In this particular case: multiple tech companies already do image scanning for CSAM. Google, Twitter, Microsoft, Facebook, and others use image hashing methods to look for and report known images of child abuse.
 
That's what's concerning with this. Some spam bot maliciously sends out a bunch of these hash-matched photos, Apple flags you and reports you, and then it's up to you to spend your life savings on a lawyer to prove your innocence.

If we’re talking iMessage, you would have to press and hold and choose “Save Photo” in the Messages app so the picture ends up in the Photos app; then, before it’s uploaded to iCloud Photo Library, it would be checked against the hashes.
What happens in Messages stays in Messages, if you don’t actively save the pics.

Now, Messages conversations (including all the pics they contain) are backed up to iCloud, but that backup doesn’t count for this system.

This particular system (hash match -> multiple offences -> human review -> maybe Apple reports you, maybe not) applies to
- iCloud Photo Library

It doesn’t apply to
- iCloud Backup of Messages app data
- iCloud Backup of Photos app data
- My Photo Stream album

Apple has confirmed that if you disable iCloud Photo Library, the system won’t be enabled.
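The flow described above (hash match → multiple offences → human review → maybe a report) could be sketched like so. The threshold value and all names here are my assumptions for illustration, not Apple’s actual parameters or code:

```python
MATCH_THRESHOLD = 30  # assumed number; Apple only said "multiple" matches are required

def account_action(hash_matches: int, human_review_confirms: bool) -> str:
    """Illustrative decision logic for one iCloud Photo Library account."""
    if hash_matches < MATCH_THRESHOLD:
        # Below the threshold nothing happens; per Apple's design, the
        # match vouchers can't even be decrypted yet.
        return "no action"
    if not human_review_confirms:
        # A human reviewer looks first; a false positive ends here.
        return "no action"
    return "report to NCMEC"

print(account_action(5, True))    # "no action"
print(account_action(31, True))   # "report to NCMEC"
```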
 
I'm no Edward Snowden fan but this quote and the one from the EFF are perfect:

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow." The non-profit Electronic Frontier Foundation also criticized Apple's plans, stating that "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

I've read enough to understand this system won't false flag personal photos on a phone etc. I do trust the only people getting busted for child abuse photos will be true criminals. However, the bottom line is Apple has stated, over and over, that backdoors are unacceptable to them because of the potential for abuse. They have even defended not making backdoors when pressed to help the government recover data from known terrorists planning to attack US citizens on our home soil. Now, suddenly, backdoors are acceptable, and despite our universal revulsion for child abusers, this is the top of a very, very slippery slope where the protections so unique to Apple begin to slip away.

Still on Apple.com: "Apple has never created a backdoor or master key to any of our products or services. We have also never allowed any government direct access to Apple servers. And we never will."
 
Their courage has landed them in hot water. I am very surprised they went forward with this. I can imagine they went through some lengthy internal deliberations but somehow decided this was a good idea. This will not be good for Apple in the long run, and even worse for everyone in general.

Glad to see everyone being so vocal about it.
From a marketing standpoint this is truly an epic crisis for Apple. For years Apple has stated the importance of privacy and how they will protect it. They even refused to decrypt the data on the San Bernardino shooter's iPhone because it could lead to the same tech being used to access other devices too. Even if creating a back door to a terrorist's phone sounds tempting, I can see the bigger picture and understand the potential for using it for mass surveillance etc.

Now for some reason Apple has decided to create a back door and scan every single picture on everyone’s device. If Apple really thinks they will catch child abusers or those distributing child porn, they are showing a serious lack of brain activity. Those who collect and distribute that material most likely know they have to keep it private and secure, so Apple only ends up scanning the photos of regular users. Which someone like NSO Group will surely utilise for the benefit of their clients.

All in all, this is marketing suicide for Apple. The idiot who came up with the idea of scanning every single Apple user’s picture library needs to be sent to North Korea so he/she learns what truly happens to those who think different.
 
This is what you get when EVERYONE in the US feels Apple is the best and only solution. I feel bad for those people who bought Apple products just because they wanted to belong, while everyone else in the background was waving red flags about their business practices and monopolistic behavior. Too late now... Sad.
It is a bit rich that up until this, a good chunk of the forum was more than happy to be all but locked into using Apple's products and services if you use an iPhone. Now everyone is all, "but muh security, wtf Apple!"
Why are people so upset? Time and time again people say this is Apple's OS. They own it, not you. You have no rights. You're just licensing their software and if you don't like the way Apple operates, there are other smartphone options or develop your own.

This is what we get when it's a closed system so deal with it.
Exactly. The irony is too much.
 
Turns out 1 in a trillion isn't quite as reassuring as you might think...


So, 1 person will be misidentified in the year 2020.

I stand by my statement: you have a higher chance of winning the lottery.
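For what it's worth, the arithmetic hinges on what the "1 in a trillion" rate applies to. A back-of-envelope sketch with assumed figures (only the 1-in-a-trillion rate comes from Apple; the account and photo counts are rough assumptions):

```python
rate = 1e-12  # Apple's stated false-flag rate

# Reading 1: per *account* per year (Apple's actual claim).
accounts = 1.0e9                       # assumption: ~1 billion iCloud accounts
flags_per_year_accounts = rate * accounts
print(flags_per_year_accounts)         # ~0.001, i.e. one false flag per ~1000 years

# Reading 2: per *photo* (not what Apple claimed, but how it's often read).
photos_per_year = 1.4e12               # assumption: ~1.4 trillion photos taken per year
flags_per_year_photos = rate * photos_per_year
print(flags_per_year_photos)           # ~1.4 false flags per year
```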
 
As many have stated before, this is ripe for abuse.

Why would Tim and Apple spend so much time trying to differentiate the brand using a privacy-first mantra, then out of nowhere launch this?

It makes no sense.
It is quite contrary to their marketing message. Even if the self-proclaimed software gurus on here insist this is not the case, on the business end the perception truly is of a contrary direction. If they were going to do something like this, they should have spent the same amount of time and money brainwashing us into accepting it. As it is, there is such a strong whiplash effect and a sense of betrayal.
 
How many times do we have to go down this road? If another company does it, nobody cares. If Apple does it, the internet is on fire. In this particular case: multiple tech companies already do image scanning for CSAM. Google, Twitter, Microsoft, Facebook, and others use image hashing methods to look for and report known images of child abuse.

This.

This is straight up Apple hate. People are in this thread saying it's about privacy and freedom, but really, it's just another reason to bash Apple.
 
The hilarious reaction from the 'Apple can do no wrong crowd' just screams r/LeopardsAteMyFace.

 
You're being dishonest.

This is not going to affect those who do not have child pornography on their phone.
Well, clearly you don't understand how hashing works. Knowing a hash from the database, you can easily manipulate cat pics to be flagged. Get enough flags and you have cops at your door. It could be your son's photos or your gf's photos…
 
Uh, that's the point.

Why are we still pretending privacy is a thing in 2021?

I mean, your bank processes transactions and probably will report any suspicious transactions to the police. Nobody is freaking out.

Your cellphone carrier probably has more data on you than the government and probably will report to the police if there is any suspicious activity. Nobody is freaking out.

In the future, all the connected cars and connected stuff will track everything you do and everywhere you go. Yet everybody jumps on the smart car, smart watch, smart home bandwagon.

I mean, you are using a phone built by someone, running software written by someone. How do you know your iOS software isn't pinging some IP all the time?

I find it funny how everybody is freaking out when Apple scans for child porn. I mean, come on, whether you like it or not, this was bound to happen.
 
Either you're playing coy or you're being willfully ignorant. We've already been through this at least twice in the past two decades. How many times has Silicon Valley worked with or for the Five Eyes to expand the Surveillance State? Edward Snowden has warned, and continues to warn us, about the depth and breadth of their efforts.

Today, it's "we're doing it to fight pedos." Tomorrow it can and will be used against a nation's citizens. Don't think what happened in WWII won't happen again in The Woke States of America.

Are you saying the Secure Enclave CHIP, which encrypts all your personal files on the iPhone, is useless?

It is the Secure Enclave which removes end-to-end encryption on your iPhone photos, because YOU as a user CHOSE to use iCloud. You have a choice to completely lock down your device, OR you can use iCloud Photos and let Apple scan files for child sexual exploitation. Apple is giving you a choice here. You don't need to turn on iCloud Photos if you don't want to. It is not even turned on by default.

You have to first opt in to iCloud,

THEN

you have to opt in to iCloud Photos (which is where the CSAM matching happens).

There are two levels of opt-in before your files are scanned.

Do people think Apple is FORCING users' files to be scanned? Are we that dumb as iPhone users?
 