So someone who looks at child porn photos stops using iCloud Photos.

Except that they won't. Other tech companies have been scanning at the cloud level for a while now, and they advertise that, and child predators still use those services. Facebook/Instagram reported over 20 million CSAM-flagged images last year alone. The contention that criminals are masterminds who will simply stop using the service doesn't match reality. It's hard to quit the internet completely these days, and if Apple joins Facebook, Google, Microsoft, Snapchat, Twitter, etc., it's just one less avenue for predators.

This is a complicated issue with no easy solution. My hope is that Apple realizes this and keeps tweaking the system to make it better and more secure as it finds holes along the way. I wish none of this were necessary, but in a world where Facebook flags 20 million images in a single year, I'm not sure sitting on their hands is a good course of action for Apple either.
 
We, who consider ourselves absolutely leading on privacy, see what we are doing here as an advancement of the state of the art in privacy, as enabling a more private world

How is looking through other people's photos advancing the state of the art in privacy, or enabling a more private world?

This is very confusing messaging from them. To be honest, it reeks of China and will absolutely be abused once in place.
 
People seem to think that if you are against it, you support it. They know exactly what they are doing.
I said this exact thing the other day. Given the subject matter, it's an unwinnable situation. Keep it and the users get angry. Get rid of the feature and, because of the bolded part above, the media (and probably politicians) paint Apple as the "perv protectors".
 
The last of my PC components arrives today. Later this weekend my i7 Mac Mini will be for sale.

Sorry Apple, you've completely lost all my trust. 1984 was not an instruction manual.
Congrats, so you're heading straight into the real dark garden…

I wouldn't overreact; I'd wait and see whether Apple comes to its senses. Apple is destroying many potential business areas with this madness, and I'm sure this discussion is very heated inside Apple.

I almost feel sorry for Craig, having to publicly dismantle his own credibility and make a fool of himself.
 
This is why some of you have this huge problem with Apple. You believe privacy = secrecy.

I don't think Apple subscribes to this definition of privacy and I certainly don't. Secrecy is just one part of privacy. In many cases complete secrecy of private data would be catastrophic.
 
This has caused more harm than good for Apple.

I've said it before and I'll say it again: Privacy is the sole reason I went back to Apple and bought a MacBook Air and iPhone 12 and no, I don't and never have owned CP.

The fact that the statement even has to be qualified that way means the whole thing is wrapped in "the road to hell is paved with good intentions" and set up so that any dissent is doomed. Not endless screeching from rooftops out of existential threat, just plain and simple dissent.

--



Why are we only learning about this now, right before the iPhone 13 and iOS 15? If they were so proud of and confident in the feature, wouldn't they have announced it during WWDC?

A last-minute slip-in, hoping to brush the backlash under the rug in time?
 
Wow. People really read what they want to read.
1. What back door? You are confusing things.
2. You can opt out; just don't use iCloud.
3. It scans a hash of the photos, not the photos.
4. Why are you so worried? About your consensual porn? The system does not care.
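
To make point 3 concrete: matching compares a compact fingerprint of the image, never the pixels themselves. Here's a minimal sketch using the open-source imagehash library as a stand-in. Apple's actual system uses NeuralHash plus a blinded database and private set intersection, none of which is reproduced here, and the known-hash value below is a made-up placeholder.

# Minimal sketch: hash matching compares fingerprints, not photos.
# imagehash's phash is only a stand-in for Apple's NeuralHash; the real
# system also blinds the database and uses private set intersection.
import imagehash
from PIL import Image

# Hypothetical database of known fingerprints (placeholder value).
KNOWN_HASHES = [imagehash.hex_to_hash("d1c0f0e8a4b2c6d8")]

def matches_known_database(photo_path: str, max_distance: int = 4) -> bool:
    """True if the photo's 64-bit fingerprint is near a known hash."""
    fingerprint = imagehash.phash(Image.open(photo_path))
    return any(fingerprint - known <= max_distance for known in KNOWN_HASHES)

An ordinary photo that isn't near anything in the known database produces no signal at all; only the 64-bit fingerprint is ever compared.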
 
This just gets more stupid:

- Mass surveillance of a billion iPhone users for what – now that every criminal has been warned?

- On the device so that every security researcher knows what happens – no, they don't know if there is more in iCloud

- Since it is on the device it looks like a first step, the second step could be a neural network detecting images

To reiterate myself, after buying a new iPhone every year since 2007, I will not update to iOS 15 and will not buy an iPhone 13 Pro until this is sorted out. The same applies to macOS Monterey.
 
"they can spot that that's happening."

He is still CLUELESS. I've long known he and TC aren't the sharpest tools in the shed, but they just seem hellbent on building tools for the world's totalitarians to use later.

So you can spot what is happening, so what! Are you going to defy the Chinese, US, EU or other governments when they say, "Add a guy in front of a tank" or "Add images for homosexuality" or "Add images of transgenderism" or "Add images of Thomas Paine, or John Hancock" or "Add images of XYZ" ?

They can't even defy an EU Law that could force the iPhone to switch to USB-C (see previous story here).

Once you build it, authoritarians around the world will force Apple to exploit it. At that point it will be too late to say "I told you so" because that will be flagged too.

Once the precedent is established that it is fine to scan ANYTHING on a phone because it is a "bad thing" then at some point it will be a "bad thing" that is required by governments elsewhere.

"Because it's on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software. So if any changes were made that were to expand the scope of this in some way —in a way that we had committed to not doing—there's verifiability, they can spot that that's happening."
 
Now we've learned the size of the threshold: 30 is quite a statement.

I think this makes any false positive (you'd need about 30 of them) and any hash insertion (at least 30, better 60) a practical impossibility, *iff* Apple doesn't commit any blunder in the implementation.

(Microsoft messed up all of their cryptography until well into the 2000s, but Apple being Apple is a different animal.)
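
A quick back-of-the-envelope calculation supports this. NeuralHash's per-image false-match rate isn't public, so the one-in-a-million rate and the 100,000-photo library below are illustrative assumptions only, under a simple model where matches are independent:

# Back-of-envelope check of the 30-match threshold, stdlib only.
# The 1e-6 per-image false-match rate and the 100,000-photo library
# are assumptions for illustration; the real rate is not public.
from math import exp, lgamma, log, log1p

def log_binom_pmf(n: int, k: int, p: float) -> float:
    """log of the binomial pmf, via lgamma to avoid overflow."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log1p(-p))

def prob_at_least(n: int, k: int, p: float) -> float:
    """P(at least k false matches among n photos), independence assumed."""
    total = 0.0
    for i in range(k, n + 1):
        term = exp(log_binom_pmf(n, i, p))
        total += term
        if term < total * 1e-17:  # remaining tail is negligible
            break
    return total

print(prob_at_least(100_000, 30, 1e-6))  # ~4e-63 under these assumptions

Playing with the assumed rate shows how sensitive the guarantee is to the per-image accuracy, which is presumably why the size of the threshold matters so much.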
 
Funny how Apple keeps changing its statement and being so careful with the wording.
 
The gaslighting is a PR strategy, and not a subtle one.

They are attempting to conjoin something no one really cared about with the issue so they can temper the discussion.
They sent Federighi to the WSJ so that the investor community can have a soundbite to repeat to each other that avoids the issue. "People got confused!" (read in a Jim Cramer voice)

It's amazing how badly Apple is handling this; it's a huge mistake. They just keep digging the hole deeper. I think this is peak AAPL, time to sell.
 
Oh Craig.

Instead of just getting spell check to work properly, giving us shareable contacts in the app, and end-to-end encryption everywhere (also product names and annual OS updates tied to the calendar year of release), we get Apple as an extension of state law enforcement: warrantless spyware draining our batteries in the short term and risking expansion into other warrantless spying categories in the future.

So contradictory, so disappointing.
 
I wonder how they can turn this one around.

Also, practically every other service - Google, Dropbox, the social media platforms, etc. - has been scanning like this for a decade.

That raises the question of why Apple's corner of the world needs it. Apple users all use at least one of those services anyway, and the people who really are sickos have now been given advance notice of how to work around this.
 
I'm sure they think this scanning of photos is a good idea, but I don't understand why, especially when they talk about privacy. Just saying it aloud should make it clear:

"We're going to use an automatic algorithm to determine if your pictures are child porn, then we'll look through them and decide if we'll report you to the police."

Because your statement is wrong.

"If you voluntarily use iCloud Photo Library we will use several algorithms in conjunction to determine if you have a copy of a set of known child pornography pictures in your Photo Library known to be in circulation. We will get notified if these algorithms have determined about 30 such matches. We will then manual review only those photos. If we determine it is child pornography we will notify NCMEC, a non-profit private organization, as required by federal law."
 
Like I told someone here: under pressure, Apple will start softening this and covering it up, making users less offended by it. But under the hood, with governments applying pressure, Apple will hardly be able to avoid doing it.
 
Because your statement is wrong.

"If you voluntarily use iCloud Photo Library"
I like how, by their wording, they alienate people and make them feel like fringe groups for using a service that Apple peddles as hard as possible, publicly and with pop-ups in their apps.

It's like, you use the Music app - and you listen to... music? Or you go on the App Store... and you download Apps?

That's kinda weird bro.

Even all their mouthpiece YouTubers stress "if you use it" and "well, they only give you 5GB free, so not many people are using iCloud Photos." Are you frigging kidding? That is so damn disingenuous it pains me. But it also exposes who is just a mouthpiece and can't operate outside the bowling-lane bumpers.

Especially as Apple ramps up its services arm and even offers an all-inclusive "One" subscription that covers Music, ATV+, iCloud, Fitness, Arcade, etc.
 