It depends on your semantics of "scanning".

A hash function is applied to the picture on-device (it is an information-destroying function, so I wouldn't call that a scan). In case of a match, the picture is flagged with a "safety voucher", which presumably contains the encryption keys a dedicated Apple team would need to inspect those photos once the match threshold has been exceeded.

So privacy is reduced only in those cases, and only for those pictures, where a match was found.
The probability of that happening for a non-CSAM picture depends on the "coarseness" of the hash algorithm;
by using a finer hash function, Apple could further reduce this risk.
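
For intuition, here is a toy sketch of that flow in Python. It is emphatically not Apple's real NeuralHash/safety-voucher design; the hash function, the database contents, and the threshold below are all made-up stand-ins:

```python
# Toy sketch of on-device hash matching with a review threshold.
# NOT Apple's actual NeuralHash/PSI protocol: the hash, the database
# contents, and the threshold are invented stand-ins for illustration.
import hashlib

# Stand-in for the known-image hash database shipped to every device.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-1").hexdigest(),
    hashlib.sha256(b"known-bad-2").hexdigest(),
}

THRESHOLD = 2  # toy value; Apple's reported review threshold is ~30 matches

def make_voucher(photo: bytes):
    """Hash the photo on-device; emit a 'voucher' only on a match."""
    digest = hashlib.sha256(photo).hexdigest()  # information-destroying step
    return digest if digest in KNOWN_HASHES else None

def review_unlocked(vouchers: list) -> bool:
    """Human review becomes possible only once enough vouchers accumulate."""
    return len(vouchers) >= THRESHOLD

library = [b"known-bad-1", b"vacation-photo", b"known-bad-2"]
vouchers = [v for v in map(make_voucher, library) if v is not None]
print(review_unlocked(vouchers))  # True: two matches meet the toy threshold
```

The real system uses a perceptual hash rather than SHA-256 (so near-duplicates still match) and keeps the vouchers cryptographically sealed until the threshold is crossed. The threshold also multiplies the error rate down: an innocent account is flagged only if several independent false matches pile up.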

If the mechanism works as designed, then in my view the benefits outweigh the risk of the privacy intrusion.
However, Apple should first prove it can make the mechanism tamper-proof before even thinking of opening it up to third parties.
All you need is for the NSA to sneak an unrelated but "interesting" photo into the database, and then it will be reported.
 
The issue isn't the likelihood of false hash matches; it's the precedent set by scanning people's photos for illegal content.
 
If I had a kid under 13, they would have a dumb phone with MMS disabled. They wouldn't get a smartphone or unsupervised internet access until I thought they were mature enough for it, which would probably be at 13 or 14. This seems more like Apple trying to compensate for bad or lazy parenting.
Most kids have smartphones. You are drawing a parallel with your own childhood.
 
Apple wouldn't decrypt a freakin' terrorist's phone to help the investigation. Yet they are doing this on a massive scale now.
The funny part is they probably cracked the iPhone without Apple's help anyway. Such products exist, at least for the older phones. Hardware security only goes so far when, at the root of things, there's very little entropy in that 6-digit passcode.
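
For scale, here is the back-of-envelope arithmetic; the guess rate is a pure assumption, not a claim about any specific forensic tool:

```python
# How little entropy a 6-digit numeric passcode has.
import math

combinations = 10 ** 6                        # 000000 through 999999
print(f"{math.log2(combinations):.1f} bits")  # ~19.9 bits of entropy

# Hypothetical: if retry limits and delays were bypassed and a tool
# could try 100 passcodes per second, exhausting the space takes:
print(f"{combinations / 100 / 3600:.1f} hours worst case")  # ~2.8 hours
```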
 
In that case, your kid is also breaking the terms of service for Snapchat and many other online services by being under the age of 13. Maybe as a parent you should make sure you are not providing a tool to underage users. Much like providing alcohol to minors.
It’s not, as long as the legal guardian gives consent.
 
Life is not a Mission Impossible movie. That's really all I can say at this point. Some of the "worst-case scenarios" I see being raised seem like they belong more in spy-thriller movies than in a serious discussion forum.
We're already in the unlikely-sounding situation those privacy freaks warned us about years ago: the people who use Linux (and pretend it doesn't suck), avoid the cloud, and follow rms. I'm not saying they were right for the right reasons, but things happened that way. It's no longer speculation that Apple might one day roll out mass surveillance features.
 
Illegal pictures today, but then it's not going to be accurate, as they don't have access to the world's database of child abuse photos? This renders it useless.

Illegal pictures of children today
possibly illegal pictures of something else tomorrow
ANY pictures the day after
Sure, I agree. But that's the slippery-slope argument.
 
What about people with an Apple ID who don't have 2FA turned on and use iCloud Photos, where a bad actor gains access and decides to upload filth to that person's iCloud Photos via the iCloud web interface? Would Apple be notified? If so, that could cause a lot of major problems for that innocent party. I can see something like that happening.
You've just identified the next version of swatting. That's a pretty big what-if scenario, though.

As with everything in the news lately, we need more information than what is currently being released. There are an awful lot of implications doing some heavy lifting for this current issue, with convenient timing.
 
Personally, I really don't think this system will be misused at all, but it does no harm to know how Apple checks that the hash list hasn't been altered, both to protect their customers from ill-intentioned people and, at the very least, to keep their review team from being flooded with false positives if the list were modified by a hacker.

Apple is using the database provided by NCMEC, plus they are the ones compiling and submitting the reports to said organisation, so other organisations would simply not be able to get around Apple to spy on users.

Apple has also said that CSAM detection can't be used to target a specific individual, since the same set of hashes is stored in the OS of every iOS device. If you are worried that the government of another country may try to force Apple to build a new detection system to identify other kinds of content, I don't think there is anything Apple can say which would alleviate your concern.
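
One conceivable way to audit that claim (my assumption, not a description of Apple's documented mechanism) is a digest comparison: if a single database ships to every device, anyone can check their local copy against a published value, and a per-target or tampered list would stand out:

```python
# Sketch of a shipped-database integrity check. The digest value and
# database bytes are hypothetical; this is not Apple's documented design.
import hashlib

# Digest of the one database everyone is supposed to receive,
# as it might be published for auditors.
PUBLISHED_DIGEST = hashlib.sha256(b"the-one-database-everyone-gets").hexdigest()

def local_copy_is_genuine(db_bytes: bytes) -> bool:
    """True only if the local copy matches the published digest."""
    return hashlib.sha256(db_bytes).hexdigest() == PUBLISHED_DIGEST

print(local_copy_is_genuine(b"the-one-database-everyone-gets"))  # True
print(local_copy_is_genuine(b"tampered-per-target-database"))    # False
```

Apple reportedly planned something along these lines, publishing a root hash of the on-device database so users and auditors could verify it.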

It ultimately comes down to consumer trust. Which is why Apple is being upfront and truthful with customers right here, right now. And that is why Apple still has my trust for the moment.

I imagine any entity willing to go to the extent of adding a completely unrelated photo to this database without detection (especially a foreign government) would already possess the means to infiltrate your smartphone via a myriad of other avenues. For example, China already possesses easier and more practical means of doing so via WeChat.

So it still comes down to "technically possible, but highly unlikely" territory.
 
I agree that the scenario I described is a pretty big what-if, but it is certainly within the realm of possibility, given that people do have their accounts taken over (for some period of time).

I know a lot of people like to use iCloud Photos for the convenience and syncing, but in my opinion there is too much risk from wolves, and from wolves acting like lambs.
 
It just feels bad to think Apple will intentionally go through all your files, for any reason. Today X is their justification; tomorrow, what will justify it? And in which nations? Not a good thing for Apple to do to its customer base.
 
If you have access to someone's account, you can probably already do this sort of damage. Send emails to self about plotting something awful, done. In general I'm not concerned about the system, as currently implemented, being a problem. I'm concerned about what it will turn into.
 

I suspect they don’t really know the details of Apple’s actions either, and it makes me wonder what is going on behind the scenes with such privacy advocacy groups.

It’s also ironic that WhatsApp immediately went after Apple, in a move I can only describe as sheer and utter hypocrisy.

But I guess it’s simply been way too long since the last “Apple vs the world” debacle, hasn’t it?
 
WhatsApp, Facebook, Instagram, and Google did not even have the balls to pull what Apple is doing by mass-installing surveillance software on customers' phones. They kept that kind of software on their servers, which they own.
 
My primary issue with this system is that it is installed on my device, which Apple does not own. Apple needs to follow industry standards and keep this type of software on their iCloud Photos servers, just like everybody else. In addition, I can see hackers screwing with Apple by manipulating hashes so that their systems constantly trip.
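
On that last point, here is a toy example of why deliberately colliding inputs are a real flooding concern. The 8-bit "hash" below is absurdly weak on purpose so the brute force is instant, but collisions have been demonstrated against real perceptual hashes too:

```python
# Toy demo: the coarser the hash, the easier it is to manufacture
# collisions and flood reviewers with false positives.
def coarse_hash(data: bytes) -> int:
    """A deliberately weak 8-bit stand-in for a perceptual hash."""
    return sum(data) % 256

target = coarse_hash(b"some-flagged-image")

# Brute-force innocuous inputs that collide with the flagged hash.
collisions, counter = [], 0
while len(collisions) < 5:
    candidate = counter.to_bytes(4, "big")
    if coarse_hash(candidate) == target:
        collisions.append(candidate)
    counter += 1

print(f"found {len(collisions)} colliding inputs, e.g. {collisions[0].hex()}")
```

An attacker wouldn't need to modify the database at all; merely manufacturing inputs that hash like flagged material could be enough to bury the human review step.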

Maybe it’s the standards that need to be changed then. I won’t be surprised if this ends up being another headphone jack scenario where Apple receives all the flak for doing something, then everyone else hops on board once Apple has taken all the heat and made a particular practice both mainstream and socially acceptable.

Besides, I am pretty sure including such a feature is covered under the TOS somewhere, which gives Apple ownership and control of the software on your device.
 
The problem is, Apple owns the software, but they do not own the phone once it is sold. I imagine a judge is not going to be okay with green-lighting software manufacturers installing surveillance software on cellphones.
 

How would WhatsApp, a company which doesn't even make phones, go about doing such a thing? They scan for images on their servers because that is the only thing they control.

Apple controls the hardware and the software, so of course they would be able to accomplish a particular task in a manner that other companies can't.

Each is simply leveraging their individual areas of expertise to tackle an existing problem.
 
If people are so desperate to reach that far for something to live in fear of, they probably should not own a smart device.
 