So how many people were turned in by Photos? (That's why it's not the same -- turning people in isn't why Photos scans.)

And honestly, I'd personally be happy to opt out of *all* analysis of my photos by Apple -- even just local face detection.

Don't care at all about it.
 
I don't know about that. If some random carrier software were doing that, you'd not even hear about it.
You'd be surprised -- their proponents and privacy advocates are just as loud as they are here, and there are more of them. I'd hear about it. Not to mention you can replace the software with an open source version that you can check yourself if you want to!
 
And honestly, I'd personally be happy to opt out of *all* analysis of my photos by Apple -- even just local face detection.
Since it's for my benefit, I don't mind and use facial recognition -- it works so much better than the fingerprint sensor for me. If they start using it for facial recognition of criminals or scanning for that kind of detail, I'll certainly change what I do.
 
Since it's for my benefit, I don't mind and use facial recognition -- it works so much better than the fingerprint sensor for me. If they start using it for facial recognition of criminals or scanning for that kind of detail, I'll certainly change what I do.

Yeah I'm with you on that - I really mean it more as in...

If they insist you are either all in or all out on all analysis of your photos...
Then I'm all out
 
That's not how this works..... that's not how any of this works!

haha... anyway - this argument is super weak and just begging for exploits/issues. It's the typical weak-ass argument used to excuse mass surveillance: "I'm not hiding anything, why do I care if the police randomly pull me over, throw me out of my car, and search it?" This thinking rapidly escalates; it's a VERY slippery slope and hard to turn back from.
It turns out that in several EU countries, a warrant is not needed for a car search and you definitely can be randomly pulled over for that to happen.
 
That’s a term that Wikipedia redirects to the Private Set Intersection page. Apple claims to use PSI to securely exchange hash information between Apple and the user’s iPhone, without either Apple or the user learning the content of a hashed image unless the matching threshold is reached.
I looked a little bit deeper into that:

The "Private set intersection" is used to securely determine the matches between the predefined set of CSAM-hashes and the On-device hashes revealing just the matches (i.e. the intersection set).

The technique that ensures Apple can only access the pictures once a threshold of matches has been reached is "threshold secret sharing".
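To make that threshold property concrete, here's a minimal Shamir-style threshold secret sharing sketch in Python (illustrative only; the threshold, prime, and secret are made-up parameters, and Apple's real construction is more involved). Any `threshold` shares reconstruct the secret; fewer reveal essentially nothing:

```python
# Minimal Shamir threshold secret sharing sketch (illustration only).
import random

PRIME = 2**127 - 1  # Mersenne prime; all arithmetic is in this field

def make_shares(secret: int, threshold: int, num_shares: int):
    """Split `secret` so that any `threshold` shares recover it."""
    # Random polynomial of degree threshold-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 yields the constant term (the secret)."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Think of one share being released per matching photo ("safety voucher"):
shares = make_shares(secret=123456789, threshold=3, num_shares=5)
print(recover(shares[:3]))  # 3 shares: prints 123456789
print(recover(shares[:2]))  # 2 shares: almost certainly NOT the secret
```

Below the threshold, the shares are statistically independent of the secret, which is what lets Apple claim it cannot decrypt anything until enough matches accumulate.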

For me this shows that Apple was really careful in setting that up - this also makes abuse by any malevolent organization even more difficult.
 
Imagine a Snowden-type case, where an image being traced back to a phone containing it could compromise Wikileaks and a whistleblower operation. This has nothing to do with kids; the image would be maliciously inserted into the database by state-sponsored hackers.
It is not that easy - even if we disregard the difficulty of sneaking in the "fake hashes", such an organization would need to insert at least n pictures into the database (if n is the threshold value set by Apple).
The reason is that the "cryptographic safety vouchers" only become readable after at least n matches occur against the local photo library.
In addition, the organization would have to infiltrate Apple to have their own people do the human validation, in order to track the individuals who have all n photos in their library.

(They could also hack Apple - but then they would not need to bother with the CSAM mechanism. Given that iCloud photos aren't end-to-end encrypted, they could just scan iCloud for the content they need. Or they could hack the devices directly.)
 
Apple is an increasingly neoliberal, postmodernist company. If you give weapons to political extremists, hell will break loose. Things may happen within Apple that twist common definitions to suit a political agenda (e.g., you can’t ban free speech -> so introduce “hate speech”, etc.)
 
Most kids have smartphones. You are drawing a parallel with your own childhood.
No, this is how I would raise my kid: no unsupervised internet access until I trust them. I'm not going to trust some corporate entity to do my job. Plenty of my coworkers have kids without smartphones; there are entire plans and phones dedicated specifically to kids/teens for this purpose, with just enough for calling trusted contacts and/or texting.
 
I've been measuring my own reaction to all of this. I do not believe this is the end of the world, because I think Apple has done as well as it can to implement this while still maintaining your privacy. Apple is NOT "scanning your photos." On the Messages feature, the iPhone does scan images sent to and from an affected account (child under 13, part of a Family Plan, feature opted in), but the scan happens on the phone itself, not on Apple servers. And even the alert that goes to the parent, if it goes via iMessage, is encrypted, and Apple itself couldn't see its content to know it's an alert.

The iCloud Photo Library thing is, I think, a creative way to do something lawmakers and many users (I have no idea what percentage, YMMV what "many" means) are asking for: help in the fight against child pornography. It doesn't scan your photos for content; everything happens on the phone, not on a server.

It actually reminds me a lot of the contact-tracing infrastructure Apple and Google put together: neither company would be able to figure out anything about you in this context; it's all kept on-phone and anonymous.

The weak link is what some are reacting to: the potential ability to point this feature at a database of photos that isn't the CSAM database, so that governments or malicious actors (I'll let you determine for yourself if those are the same thing) can leverage it to look for other content. But that still raises the question of how difficult that would be. How abstracted is the list the iPhone is checking against? Is it coded directly into the phone, hard-linked to the CSAM database? Is it accepting a generic "list" sent by Apple? Would it require an iOS update? I don't know.

But what does bother me is this: this technological mechanism to, let's say, "evaluate" (not "scan") your photos didn't exist before; now it does. Apple says its answer to governments that want to use it differently is a flat "no," and I believe that is their intent. But prevention of that has moved from a matter of TECHNOLOGY ("you can't do that because the system as designed doesn't allow it") to one of POLICY ("you can't do that because we say so"). That is a step in the wrong direction. How big a step? That's beyond my experience to evaluate.


EDIT:

Though this post is now an eternity old in the time span of these things, it's still a pretty complete expression of my feelings about this, and I wanted to update it to reflect further conversations I've had on this thread.

(Disclosure: If you find it valuable to consider how many likes or dislikes a post gets, the first four reactions to this post came before this edit)

Someone pointed out to me that no matter how you conceive it, your photos ARE being scanned...because the phone has to do so to generate the fingerprint hash in the first place. This is a fair point, but I would also point out two things:

(1) Whatever the process is for determining a new photo's fingerprint, that process certainly outputs the hash. But I do NOT know whether that same process also outputs further AI evaluation of the photo's content. It's possible that the act of creating the hash need not also examine the photo, "scanning" it in the way I think most people mean (see the toy sketch at the end of this post). I do not know; perhaps someone else here can enlighten me.

(2) The iPhone is already scanning your photos for content, and has been for years, whether or not you use iCloud Photo Library. If I go to the photo library on my phone and search for "tree" I get photos in my library that contain trees. It's not a very extensive list: my library contains over 15,000 photos and searching for "tree" got me 182 results. As I take many photos of birds, I expect I have many more than 182 photos with trees in them. So clearly it's not a very comprehensive scan, but that's a different topic. The same search on my Mac library (exactly the same library) yields a different number of results, so I infer that this search, too, is done on the phone and doesn't leave it.

"People" is another example. Once I've identified a face, then Photos will suggest other photos with the same face. It's taking a scan of a photo and then applying the results of that scan to other photos! Again, searching for the same "Person" on my Mac yields a different number of results, because I think that data remains on the phone and is not sent to Apple.

There might be those that question whether this distinction even matters: Apple makes the phone, so Apple is scanning the photos. But I do think that this is a real distinction. If data isn't sent to Apple's servers, then Apple doesn't have the data.
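On point (1) above, here's a toy perceptual "fingerprint" in Python, just to show the principle. This is a simple average hash, not Apple's NeuralHash (which uses a neural network), and the file names are hypothetical:

```python
# Toy perceptual hash ("average hash") - a stand-in to illustrate the idea
# of a compact image fingerprint; NOT Apple's NeuralHash algorithm.
from PIL import Image  # requires the Pillow package

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size grayscale; each bit = pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming_distance(h1: int, h2: int) -> int:
    """Count differing bits; visually similar images give nearby hashes."""
    return bin(h1 ^ h2).count("1")

# Hypothetical usage:
# h_a = average_hash("photo_original.jpg")
# h_b = average_hash("photo_resized.jpg")
# print(hamming_distance(h_a, h_b))  # small for near-identical images
```

Notice that nothing in this toy version classifies what the photo depicts; it only condenses the pixels into a fingerprint. Whether NeuralHash's pipeline also produces content-level evaluation is exactly the open question in point (1).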
I was immediately alarmed when I heard the announcement Apple made; it struck me as a serious red flag. Putting it under “save the children” opens a door that cannot be closed. Who is going to contest anything that might assist suffering children?

What I’ve seen Apple do in the past year and a half is also alarming: placing everything in their App Store under recurring fees, and much of it done unethically in my experience. I complained, and later found Apple is actually encouraging subscriptions and recurring fees instead of the initial $0.99 one-time fee that had been standard. That’s a rarity now. Everybody wants your credit card information so they can keep charging it, and you have little control or recourse when mistakes are made. Credit card companies say you must contact the vendor. Good luck with that. All one needs to do is read the reviews on most apps now. You cannot rely on the initial statement, “free”.

I am also reading that scans will be done on your iPhone even when it is disconnected from the internet. That is absolutely wrong, and it opens the door to exploiting everyone’s data on every level that could be thought of by nefarious people, groups, and governments. And Apple’s Tim Cook will mildly say “our intentions were good” after the fact. As has been said, public memory is short. Once this decision is made, things will go back to “normal”. This seems to be an incredible step in the wrong direction, a precedent that could be used at every level of our country’s way of doing things.
 