Apple Employees Internally Raising Concerns Over CSAM Detection Plans

I don’t get the outrage. Apple is scanning known hash values. That means they already have the photo they are scanning for, they are only looking for others who have it. They are also only scanning iCloud photos. So if you’re concerned your photo library has something in it that could be flagged, just disable iCloud photos.
Maybe you should DuckDuckGo 'national security letter'.
 
I don’t get the outrage. Apple is scanning known hash values. That means they already have the photo they are scanning for, they are only looking for others who have it.
Using hash values sounds reassuring, because the way hashes are commonly used means that - apart from a tiny, predictable chance of two images generating the same hash which can be allowed for - only images identical to known CSAM would be detected. Except, when you think about it, that would be almost useless, since re-sizing the image or adding a watermark would completely foil it.
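That fragility is easy to see with an ordinary cryptographic hash. A minimal sketch in Python, using synthetic bytes as a stand-in for a real image file:

```python
import hashlib

# Synthetic stand-in for raw image bytes; a real JPEG behaves the same way.
image = bytes(range(256)) * 16

exact = hashlib.sha256(image).hexdigest()

# Flip a single bit -- far less change than re-sizing or watermarking.
tweaked = bytearray(image)
tweaked[0] ^= 0x01
exact_tweaked = hashlib.sha256(bytes(tweaked)).hexdigest()

# The two digests share nothing: exact hashing only catches identical files.
print(exact == exact_tweaked)  # False
```

One changed bit produces a completely unrelated digest, so any edit at all defeats an exact-match scheme.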

In reality, they're using something called a "NeuralHash" which claims to cope with cropping/re-sizing etc. by looking for "perceptually and semantically similar images". You can argue until the cows come home as to whether that is technically still "hashing" - but it looks much more like AI/ML than the usual uses of hashing.
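To see how a perceptual hash differs in spirit, here is a toy "average hash" over an 8x8 grayscale grid. This is a classic perceptual-hashing idea used only for illustration; it is not Apple's NeuralHash, which is a neural-network model:

```python
def average_hash(pixels):
    """Toy perceptual hash over an 8x8 grayscale grid (64 values, 0-255).
    Each bit records whether a pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A crude "image": dark left half, bright right half.
image = ([10] * 4 + [240] * 4) * 8

# Lightly edit it -- akin to compression noise or a subtle watermark.
edited = list(image)
edited[0] += 5
edited[63] -= 5

print(hamming(average_hash(image), average_hash(edited)))  # 0: still a match
```

Small pixel-level edits leave the hash unchanged (or nearly so, measured in Hamming distance), which is exactly the property exact hashing lacks.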

...and aside from that, this is a line-in-the-sand issue rather than something that is going to have a huge practical impact. We've gone from your images being checked on the server after you entrust them to a cloud service to checking the images on your own phone before uploading and getting your consent for this when you accept the iCloud T&Cs (which is strongly encouraged by the iPhone set-up routine and on which more and more new iOS functionality depends).
 
While I appreciate that protecting our children is important, I have a couple of thoughts on this whole situation.

First, "Innocent until proven guilty". How many of you have taken a picture of your kids playing in the bathtub? Or in their new swimsuits? So, given the exact same picture... one is taken by the parents of their kid in a swimsuit at the pool, the other is taken by a pedo of a kid in a swimsuit by the pool. No AI is going to be able to differentiate between those.

Second, "persecute the many to guarantee prosecution of the few". This is not how our world should be. I should have an expectation that if I take a picture or two of my consenting wife for our own enjoyment, that it's not triggering some "Skin-o-meter" at Apple. I should have a reasonable expectation that the phone that I paid $1200 for isn't "spying" on me.

Third, isn't the CSAM thing looking for existing thumbprints of child porn/exploitation? Who would store those in iCloud? Who would have those in their camera roll?

I think that overall, this is not only a bad idea, but it is going to put additional load on our end-user devices without catching a single person, since people are either taking their own pictures or not storing child porn in iCloud.

It's not up to Apple to be the "thought police". While I think that it was probably originally a noble thought from good people, they are crossing a line when they start persecuting innocent people in hopes of catching someone doing something wrong.

If law enforcement has reason to look at my photos / texts / etc. and Apple reacts by providing them those files, I have no issue with that. For Apple to be proactive in monitoring my actions and notifying the police if their AI thinks they found something, well, Big Brother is watching.

Apple should be reactive, not proactive in this situation.
 
Nice paranoid fantasy.
In reality they have on multiple occasions refused requests by the FBI and other agencies to do things like that, and taken a lot of heat for it. But sure, Apple has been secretly doing the government's bidding this whole time, and decided to publicly announce this particular feature because?
If Apple wanted to secretly spy on people, THEY WOULDN'T HAVE ANNOUNCED IT.
Yeah, because the FBI has little leverage: Apple can refuse and keep selling phones. Other countries have far more leverage in that respect.
 
And from broadly controversial material, it's a short step to topics where some ideas, theses, and proposals are simply disliked by a minority.


Though, this could be used to root out every My Little Pony fan who isn’t a prepubescent girl.

Oh wait, that’s the point: insert a group you want to find, and report them to the authorities.
 
That is correct, but an actual Apple employee (that is, a human being) will be reviewing the pictures once they get flagged. That does not sound like an AI to me.

Sounds like only the threshold is determined by AI. The hash would probably be calculated in a deterministic way by an algorithm, comparing every single image on your phone (e.g. at import/save time) or iCloud photo storage or Mac to the set of hashes in an index of CSAM material. They don't have to add a new category for things besides CSAM; they just have to maintain an index of hashes of anything that someone insists is objectionable, including CSAM.
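The index-plus-threshold flow described above can be sketched as a set-membership check. Everything here (the index contents, the threshold value, the `digest` helper) is made up for illustration; the real system uses a perceptual hash and a private-set-intersection protocol, not plain SHA-256:

```python
import hashlib

def digest(photo_bytes):
    # Stand-in for a perceptual hash; a real system would not use SHA-256.
    return hashlib.sha256(photo_bytes).hexdigest()

# Hypothetical index of flagged-content digests (placeholder data only).
flagged_index = {digest(b"known-bad-image-1"), digest(b"known-bad-image-2")}

MATCH_THRESHOLD = 2  # account is only escalated after this many matches

def review_needed(photo_library):
    """Count library photos whose digest appears in the index."""
    matches = sum(1 for p in photo_library if digest(p) in flagged_index)
    return matches >= MATCH_THRESHOLD

library = [b"vacation.jpg", b"known-bad-image-1", b"cat.jpg"]
print(review_needed(library))  # False: one match is below the threshold
```

The key point is that swapping in a different index requires no new code at all, which is exactly the scope-creep concern.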

In addition to well-intentioned objection to CSAM, I also suspect that Apple lawyers are pushing for this to help mitigate risk of criminal charges or civil liability from distribution of CSAM (e.g. via services such as iCloud, iMessage, etc.)
 
Apple needs to figure out another way to catch pedophiles. Scanning your photos, or visually hashing them, is not the right solution.

As consumers, why are we being punished?
Persecution of the many to ensure prosecution of the few.

Apple doesn't need to figure out a way to catch pedophiles. That is not their job. And they certainly don't need to engage in violating everyone's privacy in doing so. You're right; we, as consumers, are being punished. And this is Big Brother stuff, if ever we've had it.
 
I never thought I’d do it, but this is making me reconsider continuing to use the Apple ecosystem.

I’ve started looking at PC hardware, and quite frankly, what I’m finding is a lot more innovative than the same old iMac/MacBook Pro metal slab rehash we’ve been getting for the past 15 years or so.
 
While I appreciate that protecting our children is important, I have a couple of thoughts on this whole situation.

First, "Innocent until proven guilty". How many of you have taken a picture of your kids playing in the bathtub? Or in their new swimsuits? So, given the exact same picture... one is taken by the parents of their kid in a swimsuit at the pool, the other is taken by a pedo of a kid in a swimsuit by the pool. No AI is going to be able to differentiate between those.

Second, "persecute the many to guarantee prosecution of the few". This is not how our world should be. I should have an expectation that if I take a picture or two of my consenting wife for our own enjoyment, that it's not triggering some "Skin-o-meter" at Apple. I should have a reasonable expectation that the phone that I paid $1200 for isn't "spying" on me.

Third, isn't the CSAM thing looking for existing thumbprints of child porn/exploitation? Who would store those in iCloud? Who would have those in their camera roll?

I think that overall, this is not only a bad idea, but it is going to put additional load on our end-user devices without catching a single person, since people are either taking their own pictures or not storing child porn in iCloud.

It's not up to Apple to be the "thought police". While I think that it was probably originally a noble thought from good people, they are crossing a line when they start persecuting innocent people in hopes of catching someone doing something wrong.

If law enforcement has reason to look at my photos / texts / etc. and Apple reacts by providing them those files, I have no issue with that. For Apple to be proactive in monitoring my actions and notifying the police if their AI thinks they found something, well, Big Brother is watching.

Apple should be reactive, not proactive in this situation.
You really need to read the FAQ document as you clearly do not understand how this works.

and FYI...anyone else reading this that wants to write something about their nude selfies they share, personal sex vids or babies in bathtubs, please read the FAQ before commenting.
 
Agreed. Still within living memory, people were literally sentenced to death or to the Eastern Front for nothing more than making jokes about Hitler or calling Goering fat.
The potential for abuse of this system is maximal, and not just in China, Russia, etc. The rising tide of authoritarianism around the world should make everyone consider the possibilities and attractiveness of this tool. The United States itself just spent four years under a thin-skinned president who considered people either loyalists or traitors based on their support of him personally, and regularly used the law to attack and destroy them personally and professionally, from a procession of his own closest supporters and staff to entire groups of citizens unpopular with the extremists of his base. There is no guarantee anywhere that the next president, or the one three from now, won't be more hostile yet. People need to consider the short- and long-term risks when they implement technologies capable of, and designed for, incriminating people and destroying their lives. The question isn't what this tool will be useful for in the best-case scenario today, but how it will be used by the next guy against you.
 
Persecution of the many to ensure prosecution of the few.

Apple doesn't need to figure out a way to catch pedophiles. That is not their job. And they certainly don't need to engage in violating everyone's privacy in doing so. You're right; we, as consumers, are being punished. And this is Big Brother stuff, if ever we've had it.
Maybe read up on US Law and the requirements of Electronic Service Providers like Apple.
 
You really need to read the FAQ document as you clearly do not understand how this works.

and FYI...anyone else reading this that wants to write something about their nude selfies they share, personal sex vids or babies in bathtubs, please read the FAQ before commenting.
If that selfie somehow ended up on the internet, it could be in the database.
 
Happy to see that also folks at Apple think that this kind of mass surveillance is completely unacceptable.

To me, it violates the presumption of innocence.

Personally (I bought a new iPhone every year since 2007) I won't update to iOS 15 and will delay purchasing a new iPhone until this is sorted out!
 
Nice paranoid fantasy.
In reality they have on multiple occasions refused requests by the FBI and other agencies to do things like that, and taken a lot of heat for it. But sure, Apple has been secretly doing the government's bidding this whole time, and decided to publicly announce this particular feature because?
If Apple wanted to secretly spy on people, THEY WOULDN'T HAVE ANNOUNCED IT.
You have no idea how intertwined these big tech companies are with the surveillance apparatus….

It goes wayyyyy back to AT&T in the USA.

It is definitely a conspiracy, and a real one. Don’t dismiss this as Alex Jones lizard people lunacy.
 
I don’t feel Google is any better here. I think Apple is no better now. I trusted Apple and therefore had all my thousands of photos of my kids' childhood, private photos of holidays, etc. on iCloud.
*Facepalm*

Don't trust the (i)Cloud. Don't use it.
There is no end-to-end encryption.
What about vulnerabilities like directory traversal?
Apple has experience with that:
2009-07-01 idisk.me.com
A directory traversal issue was addressed.

 
You really need to read the FAQ document as you clearly do not understand how this works.

and FYI...anyone else reading this that wants to write something about their nude selfies they share, personal sex vids or babies in bathtubs, please read the FAQ before commenting.
They’re not telling you everything in a FAQ…

There are historical precedents for our concern. Very recent ones, in fact. Those giant NSA data centers are not for keeping track of people living in caves and yurts in Afghanistan. They are for you and me.
 
They’re not telling you everything in a FAQ…

There are historical precedents for our concern. Very recent ones, in fact. Those giant NSA data centers are not for keeping track of people living in caves and yurts in Afghanistan. They are for you and me.
You had better get offline then and back to building your bunker...make sure you have your data-free cage like Gene Hackman built in Enemy Of The State so they can't track you.
 
You really need to read the FAQ document as you clearly do not understand how this works.

and FYI...anyone else reading this that wants to write something about their nude selfies they share, personal sex vids or babies in bathtubs, please read the FAQ before commenting.
Ok, so even though you've established yourself as a troll, I'll bite. What part am I not understanding? (And before you make a comment about me being computer illiterate or whatnot, I'm a systems administrator who has been on the Internet since 1986)
 
Get back in that box you’ve just thought out of. Finally somebody that thinks. Lol
I don’t feel Google is any better here. I think Apple is no better now. I trusted Apple and therefore had all my thousands of photos of my kids' childhood, private photos of holidays, etc. on iCloud. I only moved from Android/PC years and years ago as I felt Apple was more secure. Now they are no different. I’d rather save a fortune on five people's worth of tech every year, store my documents and pics offline so NOBODY can access them unless I decide so, and have no compromise when I need to do really heavy colourist work (even the Mac Pro is slow compared to a really decent PC).
I think this only works if you have iCloud switched on, btw.
So essentially you could just use another backup service for your photos (Dropbox, Google, etc.) and not use the iCloud service.
Is that not good enough for you, or are you worried about the principle of it all?

(Also, remember that whatever backdoors are in the software, these companies are legally barred from telling you about them. Snowden demonstrated that a while back. All of the big companies were signed up.)

The only way to be truly secure is to run an open-source OS (e.g. Linux) where you can view every line of code (or someone else has), and then set up your own private backup services, etc. A bit of effort, but worthwhile I suppose if you feel strongly about this stuff.
 