It's not about 1 in a trillion. It's about principles. The basic fact is that Apple will be scanning your images (only if you use iCloud, they say...), but then why do it on my phone? Please use your technology on the iCloud servers; you can decrypt all backups and photos anyway. Stay away from my device!
That's a dumb take. Locally, files and photos were already indexed, sliced, diced, and searched for faces, cats, dogs, trees, and sunsets. Wasn't that principle-offending? Wasn't that a slippery slope? Couldn't that be pressured by Xi?

Basically, to be coherent, you should ask Apple to:
- ban any file indexing
- ban any hashing; don't take fingerprints of my stuff for any purpose, because then dictators could pressure you to do more with them!
- ban any metadata
- ban any AI deep analysis that looks for cats and dogs in my pics

Basically, people have no clue what OSes already do locally once they're logged in.

Apple should be commended for doing as much as possible of this stuff locally.
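
For what it's worth, this kind of local analysis is directly exposed to developers. Here's a minimal sketch using Apple's Vision framework (assuming iOS 13+/macOS 10.15+; the file path is made up) of the on-device classification that already tags cats, dogs, and sunsets in your photos:

```swift
// Minimal sketch: on-device image classification with Apple's Vision framework.
// No server is involved; the labels are computed locally.
// Assumes iOS 13+/macOS 10.15+; "photo.jpg" is a hypothetical local file.
import Foundation
import Vision

let url = URL(fileURLWithPath: "photo.jpg")
let request = VNClassifyImageRequest()            // built-in taxonomy: cat, dog, sunset, ...
let handler = VNImageRequestHandler(url: url, options: [:])

do {
    try handler.perform([request])
    let observations = request.results as? [VNClassificationObservation] ?? []
    // Print the top labels the OS derives locally from the photo.
    for obs in observations.prefix(5) where obs.confidence > 0.3 {
        print("\(obs.identifier): \(obs.confidence)")
    }
} catch {
    print("Classification failed: \(error)")
}
```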
 
People do realise that companies such as Google, Adobe, and Facebook already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.

Google has been doing this with Gmail since 2014.

No one bats an eye at that.

But when Apple does it, NOW everyone gets upset.

Why is this?

Yes, and this is why I do not use Google and Facebook any more.

Now I use ProtonMail, DuckDuckGo, Firefox, and so on.

There is a huge movement against Google and FB, including a subreddit dedicated to de-googling your life: DeGoogle.
There is no DeApple yet, but Apple might soon join the list.
 
Google has been doing this with Gmail since 2014.

No one bats an eye at that.

But when Apple does it, NOW everyone gets upset.

Why is this?
It is very simple. Cook tries to come across as caring deeply about users and their experiences with his products and services. His posturing, choice of terminology, and cringeworthy presentations are big turn-offs when the reality comes out. The other companies do not parade their CEOs out to act like they will protect you and do everything in their power to support human rights, etc. Cook's half-truths, zero truths, and phoniness are what have made many of us stop using Apple products and services.
 
Windows Search shouldn't index my files; what if that index gets transmitted out of my system to the internet?

Windows isn't open source, so how do we know?
 
I bet you the Chinese government will be very interested in this technology.

Notice how Apple keeps changing the wording and being very careful with the document. I hope this backfires and Apple catches itself in a massive lawsuit.

Privacy matters. Apple: let us (the consumers) decide if we want you to scan our iPhones.

Apple, you are a TECH company. You are not law enforcement. Don't lose your vision. Stop chasing the $.

Reports like this will be out left and right…



The easiest way to avoid this spying technology:

1. Turn off iCloud Photos and Messages.
2. Do not log in to your iPhone using your iCloud credentials.
3. Possibly use a fake account to continue using your iPhone. Or simply do not log in with your Apple ID at all.

Under that assumption, they could do whatever they want behind closed doors already. Hell, why even publish this stuff about CSAM if they could just do it without telling people, as you say?
That is the issue. When abuse starts, we won't hear about it.
 
Yes, and this is why I do not use Google and Facebook any more.

Now I use ProtonMail, DuckDuckGo, Firefox, and so on.

There is a huge movement against Google and FB, including a subreddit dedicated to de-googling your life: DeGoogle.
There is no DeApple yet, but Apple might soon join the list.

Better yet, DeInternet; it's for the best.
 
This isn't good, because in all likelihood it can be abused, now or in the future.

Nobody would want a camera in their apartment/house that they say only scans for illegal activities, right?

Not that I think we can do much about it. I have pretty much taken the approach that anything stored in the cloud is public information; that way, if I don't want the info to be public, I store the data in-house.
 
People do realise that companies such as Google, Adobe, and Facebook already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.
The way I heard it described on a podcast was like this:

Apple is not doing a visual scan of your photos. They're not looking at the actual contents of your photos.

They are, instead, comparing hashes of *known* CSAM images. These are photos that have already been labeled as child porn.

So there's no danger of Apple flagging a photo of your child in the bathtub or whatever.

With all that said... no one knows what else Apple could do in the future. Perhaps they could start scanning the actual contents of your photos. So I can see why people are freaked out.

But as others have said... all of the big companies are doing similar things. So I dunno.
Exactly my view too. It's a very well-intentioned move, but what stops it from being abused? If it were something that couldn't be abused, I'd be all for it. Scan away.
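
To make the "comparing hashes" part concrete, here is a toy sketch of hash-list matching. It uses an ordinary SHA-256 over the file bytes as a stand-in; Apple's actual system uses a perceptual NeuralHash (which also matches resized or re-encoded copies) and a blinded matching protocol, neither of which is reproduced here, and the hash list below is made up:

```swift
// Toy sketch of matching against a list of known-image fingerprints.
// NOT Apple's system: Apple uses a perceptual NeuralHash plus a blinded
// matching protocol; a plain SHA-256 only matches byte-identical files.
import CryptoKit
import Foundation

// Hypothetical database of hex-encoded fingerprints of known images.
let knownBadHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

func sha256Hex(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

func isFlagged(_ imageData: Data) -> Bool {
    // Only the fingerprint is compared; the pixels are never inspected.
    knownBadHashes.contains(sha256Hex(of: imageData))
}

let photo = Data("holiday snapshot".utf8)
print(isFlagged(photo))  // false: an arbitrary photo matches no listed hash
```

Note the design point the abuse argument hinges on: the matching code never needs to know what the listed hashes depict, so whoever controls the list controls what gets flagged.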
 
"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow."

YUP! Get ready for an Apple employee to snoop around your wife's, girlfriend's, ex's, aunty's, mom and dad's, kids', and friends' photos. An Apple employee will definitely be checking them out… out of curiosity… human nature. People are nosy.

Everything will be exposed, unfortunately. They will know it all and will be collecting the data.


Expecting reports like this in the future…

Well, in such cases it will surely be one of those very rare "false positives" ;D
Apple won't scratch its image with its own surveillance tools.
 
I wonder why you would get downvoted. I have been reading on the Apple-centric sites about people who are going to dump iCloud and go to Google, Microsoft, or another cloud service. Good luck with that, because as you clearly point out, most companies do this; but since it is Apple, let's hop on the bash-Apple train.

Absolutely. Practically any cloud service provider worth their salt is going to be deploying some form of CSAM scanning system. Even WAFs/CDNs such as Cloudflare now support this.

I always think it's healthy to be paranoid, but not too paranoid. Lots of ifs being bandied about. The trouble with the future is that it's very difficult to predict, otherwise there'd be an awful lot more lottery winners.
 
“Apple now scans photos in iOS 15 so it can tell you what type of flower or dog breed is in your picture.”

THIS IS THE COOLEST FEATURE EVER!


“Apple will scan photos in iOS 15 to catch child pornographers…literally the worst people on the planet.”

BUT WHAT ABOUT MY PRIVACY?? APPLE IS THE WORST! Arrrrggghhhgg!!!!!
 
BWAHAHA what a joke!

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands.


Just like they refused to cooperate with China on mass surveillance?! LOL

LOL Apple, an FAQ won't rebuild the trust...
The privacy train has just departed.
They are just a bunch of liars!

What if governments pressure Microsoft into giving up the Windows Search local file index of Windows users?
There's a slippery slope there.
 
So as I said in the other thread regarding "slippery slope", this isn't some nefarious ploy and Apple has no intention of doing anything other than what they have stated.

They are fully aware that if they step out of line with something like this the backlash would be huge.
How would people know if that were the case?
 
But here's the thing, Apple.
One of your selling points is customer privacy. You have now weakened your own argument against governments trying to invade that privacy. You are giving them the opportunity to get their foot in the door.

No more than 3 days ago.
Still less than other companies.
 
Of course not. As the old joke goes, “won’t someone think of the children?”

The question is, now that Apple has created the ability to use the hash, are we to expect that capability will not be expanded? At this point we only have a promise from Apple that it won't. Well, it would have been better for Apple not to put itself in the position of having to promise in the first place.
Yes, I see your point and the potential issue of “where do we stop”.
 
Can Apple guarantee that every single person at NCMEC, and every Apple employee involved in the review process, is a sinless saint?
I'm simply asking because any system can be compromised through the human factor.
 
Why put out the FAQ now instead of last week, or during WWDC? Apple can claim that its process is privacy-proof, but when there's a human involved, whether inside or outside Apple, the system can be compromised, as humans can be coerced.

Thus the key is to not have such a potential backdoor to begin with. This was Apple's own argument during the FBI request: the FBI asked for a specific backdoor, just as Apple is now putting in specific scans for hashes.

The cat is out of the bag.
This is the least human-involved process of all.
The security vouchers are impossible to decrypt unless you cross the threshold of multiple offences.
That's so elegant.
 
Because everyone else does it, that makes it right. 🙄

Well, of course not - but what can you do? Glower at them? Bend them over a knee and spank them silly? Alas, the only way forward is to either keep on trucking or move to another platform - until that too suffers a similar fate.
 
The way I heard it described on a podcast was like this:

Apple is not doing a visual scan of your photos. They're not looking at the actual contents of your photos.

They are, instead, comparing hashes of *known* CSAM images. These are photos that have already been labeled as child porn.

So there's no danger of Apple flagging a photo of your child in the bathtub or whatever.

With all that said... no one knows what else Apple could do in the future. Perhaps they could start scanning the actual contents of your photos. So I can see why people are freaked out.

But as others have said... all of the big companies are doing similar things. So I dunno.
Oversimplified, I think: a flagged image gets checked for a false positive. At that point there is a window for Apple employees to access your photo library for 'review'.
 