I’m not okay with warrantless searches of people who are not even suspected of having committed a crime.

PLOT TWIST: They already do this today!! You’ve been consenting for practically as long as you’ve had an iPhone.

Provide a citation

From Apple’s own support page…and Photos has been doing this for about 5 years now (just better now, obviously, as the tech has improved).

Moments: Search for an event, like a concert you attended or a trip you took. Photos uses the time and location of your photos along with online event listings to find matching photos.
People: Find photos in your library of a specific person or a group of people. Just keep names and faces organized in your People album.
Places: See your photos and videos on a map in the Places section. Or type a location name in the Search bar to see photos and videos from that place.
Categories: Photos recognizes scenes, objects, and types of locations. Search for a term like "lake" and select a result to see photos that match.
The Search tab also suggests moments, people, places, categories, and groups for you to search. Tap a suggested search, such as One Year Ago or Animals, to explore your photos.
When you search your photos, the face recognition and scene and object detection are done completely on your device. Learn more about photos and your privacy.


The last part is basically what they are doing now as well. They have on-device parameters that recognize objects in your photos. No different than the on-device database hashes they will be adding.

The difference? The older data points just help when you are searching for the stuff listed above. The new one specifically matches photos against the database hashes and then flags them when they are uploaded to iCloud. Have enough of them in your library and Apple checks to verify. If you uploaded known child porn to your iCloud account, the authorities are contacted.
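To make the mechanics concrete, here is a minimal sketch of that flow in Python. Everything here is a hypothetical illustration (the hash strings, the `MATCH_THRESHOLD` of 30, the function names); the real system uses NeuralHash plus cryptographic protocols rather than plain set lookups, and Apple did not publish its threshold in the initial summary.

```python
# Toy sketch of "match on device, flag at upload, escalate past a threshold".
# All names and numbers are hypothetical -- this is NOT Apple's code.

MATCH_THRESHOLD = 30  # hypothetical; Apple did not publish the real number

KNOWN_HASHES = {"a1b2c3", "d4e5f6"}  # stand-in for the CSAM hash database in iOS

def flag_on_upload(account: dict, photo_hash: str) -> None:
    """Runs only when a photo is uploaded to iCloud: mark it if its
    perceptual hash matches the on-device database."""
    if photo_hash in KNOWN_HASHES:
        account["flagged"] = account.get("flagged", 0) + 1

def needs_review(account: dict) -> bool:
    """Apple verifies, and contacts the authorities only if matches are
    confirmed, once an account passes the threshold."""
    return account.get("flagged", 0) >= MATCH_THRESHOLD

acct = {}
flag_on_upload(acct, "zzz999")   # no match: nothing recorded
flag_on_upload(acct, "a1b2c3")   # match: count becomes 1, far below the threshold
print(needs_review(acct))        # False
```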


So, again, why wasn’t anyone freaking out about this existing (for 5 years now) “back door” way of scanning images on your phone? What would have stopped them previously from adding the ability to recognize “trump” or “pink triangles” or anything else they could think of? Why do all of these lame conspiracy theories have more validation now?

And still, why would the government even NEED this?? There are easier ways to scan your phone, and especially iCloud, by hacking into them, rather than relying on hard coded data being added to the phone by Apple (not the government…Apple does it!).



EDIT: When you click on the privacy link:

Photos lets you choose who has the full picture.

The Photos app uses machine learning to organize photos right on your device. So you don’t need to share them with Apple or anyone else.

More about Photos
Your photo and video albums are full of precious moments, friends, and your favorite things. Apple devices are designed to give you control over those memories.
Photos is also designed so that the face recognition and scene and object detection — which power features like For You, Memories, Sharing Suggestions, and the People album — happen on device instead of in the cloud. In fact, the A13 and A14 Bionic chips perform over 100 billion operations per photo to recognize faces and places without ever leaving your device. And when apps request access to your photos, you can share just the images you want — not your entire library.

Oh my.

If you really equate any of those examples to warrantless searches and surveillance, then perhaps we should cordially part ways.

Your examples of searches do not phone home to the government about our activity. They are not put in place to scrutinize us and search for illegal behavior. And most importantly, they are not used as tools to build a criminal case against users.

The differences should be self-evident. But if you still can’t see it, then consider the following:

Let’s say you want to put new carpeting in your living room. A flooring installer visits your home and knocks on the door. She asks for permission to enter your home and surveil your floors so that new carpeting can be ordered and installed. You consent to having her do this.

Later, a police officer approaches your door. He asks for permission to search your home for evidence of a crime. He isn’t accusing you of the crime. He doesn’t even suspect you of the crime. Nevertheless, he wants to search your home. You know that you are innocent. He has no warrant, and there is no probable cause for the search, so you decline to let him enter.

Now, imagine the police officer says: “Well, you just let the flooring installer into your house, so you've given up the option to deny this search.”

In what looney-tunes world would that make sense? Allowing a flooring installer to enter your home does not give the police officer justification to enter. Just as using Apple’s "Moments" feature should not preclude us from denouncing warrantless searches and surveillance. The former is obviously a feature that I use at my pleasure and for my benefit. The latter is a tool of surveillance used against me.
 
Speculating about whether Apple is going to roll out end-to-end encryption for photos is just like speculating that they’ll use NeuralHash to do harm.

As far as I’m aware, Apple is still maintaining their strong privacy push. They DON’T want to know what we have on our phones, BUT if you’re uploading to iCloud, then yes, you’re subject to being searched for child abuse imagery, just as you always have been.
 
I’m convinced that you and others just choose not to read the facts about this tech, which has in reality been on your phone in one form or another for 5 years now.

I'm sorry, but "this tech" has never been on the iPhone before. This tech is not a search feature that the user utilizes. It is a surveillance tool, and the fruits of it can and will be used against you in a court of law.

If we can't agree on basic facts, then I doubt we'll be able to have a productive discussion on an internet forum.
 
Are iCloud photos able to be decrypted if iCloud Backup is set to OFF? The way I've always understood things like Photos in iCloud and Messages in iCloud is that if your device has backups set to OFF while the other two mentioned are ON, then there really isn't a way to decrypt those, since they aren't accompanied by an iCloud backup that can be decrypted.
iCloud Photos is separate from iCloud device backup. I don’t think the device backup includes the photos and other components that are stored separately in iCloud. iCloud photos are encrypted, but Apple has a key. I’m not sure about iCloud device backup being encrypted, but if it is, I’m sure Apple also has a key for it. iCloud messages are encrypted end to end, so Apple does not have a key for them, per my understanding.
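As a conceptual illustration of the key-custody distinction being described (a toy using the third-party `cryptography` package; it says nothing about Apple's actual key management): whoever holds the key can decrypt, so "encrypted but the provider has a key" and "end-to-end encrypted" are very different guarantees.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# "Encrypted, but the provider holds a key" (the claim above about iCloud Photos):
provider_key = Fernet.generate_key()            # lives on the provider's servers
photo_ct = Fernet(provider_key).encrypt(b"photo bytes")
assert Fernet(provider_key).decrypt(photo_ct) == b"photo bytes"  # provider can read it

# "End-to-end encrypted" (the claim above about Messages in iCloud):
device_key = Fernet.generate_key()              # never leaves the user's devices
msg_ct = Fernet(device_key).encrypt(b"message bytes")
# Without device_key, the provider holds only opaque ciphertext.
```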
 
Sorry, but I feel you are a bit off kilter on this and keep drifting from the point. Maybe you missed it.
Client-side searching has been around for a long time. I seriously doubt Apple is doing this today. They are reporting far too few CSAM violations for that. All they apparently do today is some Mail scans and other checks when issued a subpoena. The function to take a face, a leaf, a flower and do a “Google” search is far different from this. I suspect you are quite well aware of this.

It isn’t the solution itself, rather the question of why do it client side. Despite the potential for misuse, there are easier methods to accomplish this. After all I have reviewed and learned, that single piece stands out. It stands out to a lot of others also.

They have been doing that for 2 years…another article linked in this thread shows that.
 
If the "tech was already on the phone", why is Apple even talking about this, right?

They'd just be "doing it."

I think you're going in circles with that other user, honestly.
He/She is off on basic facts here.

They are talking about it because, after the tech does what it has been doing with other hashed photos for 5 years, they are letting you know that they are taking the extra step to THEN flag accounts with multiple illegal images and then report them.

The new part is NOT the “scanning” that is happening on the device. But that is what everyone seems to be getting upset about.

The new part is what happens after you upload illegal photos to iCloud. Those images and those images only are marked.
 
Can you point that link out? It would be appreciated.
Client side is not listed in any document I read, including Apple’s. At least not that I am aware of.

I think it’s in this thread somewhere. It’s a link to an article where they found that Apple has been scanning iCloud for these images since 2019.

This whole move is to refine that search and actually provide MORE privacy.
 
I think it’s in this thread somewhere. It’s a link to an article where they found that Apple has been scanning iCloud for these images since 2019.

The only scanning Apple is doing is at the behest of a subpoena.
Out of 21.4 million items filed regarding CSAM, Apple only reported 265. They have by far the lowest number of any. While Apple won’t confirm whether they are doing iCloud scanning, the numbers don’t support this activity.
 
They have been doing that for 2 years…another article linked in this thread shows that.

You've completely lost the plot. And you are being called out for it.

You can't link a credible source, because you aren't making credible claims.

We are talking about on-device scanning, which has been established many times already. And without a warrant.
 
You've completely lost the plot. And you are being called out for it.

You can't link a credible source, because you aren't making credible claims.

We are talking about on-device scanning, which has been established many times already. And without a warrant.

I’m only repeating what someone else posted and linked in this thread (and I read the article), which showed Apple has indeed been scanning iCloud for images for the past two years.

They have been doing on device scanning in one form or another for 5 years.

I copied what they scan for now right off their own website.

My question to you and others: what is the difference between the on-device scanning for dogs and cats that is hard coded into iOS (along with a slew of other “images”) versus the hard coded images from the child porn database?

The answer of course is nothing…other than the subject matter of the hashed images.

If your main concern is them scanning your personal images on your device, why are you upset now and not when they were doing it before?

Why couldn’t some government upload other images into the hashed dog and cat database they already have hard coded in, just as easily as into the one they will be adding in iOS 15?

What…is…the…difference????
 
What…is…the…difference????
There isn't one; the fake outrage is hilarious - people who have no idea what's been happening on their devices for years and haven't bothered to read the EULAs and Privacy Policy.

Edit: They also haven't bothered to educate themselves properly on how the CSAM tech works.
 
I’m only repeating what someone else posted and linked in this thread (and I read the article), which showed Apple has indeed been scanning iCloud for images for the past two years.

They have been doing on device scanning in one form or another for 5 years.

I copied what they scan for now right off their own website.

My question to you and others: what is the difference between the on-device scanning for dogs and cats that is hard coded into iOS (along with a slew of other “images”) versus the hard coded images from the child porn database?

The answer of course is nothing…other than the subject matter of the hashed images.

If your main concern is them scanning your personal images on your device, why are you upset now and not when they were doing it before?

Why couldn’t some government upload other images into the hashed dog and cat database they already have hard coded in, just as easily as into the one they will be adding in iOS 15?

What…is…the…difference????

I laid out the reasoning for you just a few posts above. Conveniently, you have neither acknowledged nor refuted my post.

They also haven't bothered to educate themselves properly on how the CSAM tech works.

Oh, I understand it quite well. But frankly, how the tech works is irrelevant.

It doesn’t matter if the search is done using hash-matching algorithms, bloodhounds, or black magic. A search is a search. And a search should only be conducted on my personal device if there is a warrant.

Innocent citizens, who are not suspected of committing a crime, should not be subjected to mass surveillance. The technical implementations of the search/surveillance do not matter.

This is all the more frustrating, because I really don't care if Apple searches my photos. I don't have anything to hide, so I'd be happy to consent to a search in iCloud, in exchange for accessing the iCloud services. However, I must object to this implementation of on-device scanning in principle. Once surveillance technology is built into our personal devices, the devices are no longer personal.

And no, using the "moments" feature is not the same thing. I consent to using that image analysis, and it is done for my benefit. It is not a surveillance feature that phones home to report citizens for criminal behavior.
 
And no, using the "moments" feature is not the same thing. I consent to using that image analysis, and it is done for my benefit. It is not a surveillance feature that phones home to report citizens for criminal behavior.

So you are worried about the illegal content “search” versus the Spotlight “search”?

The issue with your prior example regarding the flooring installer and the police officer is that they are two different people.

If Apple is the police in this example, you’ve already invited them in. If I invite the police into my house to check out some furniture I have for sale and they happen to notice child pornography sitting on my table, they don’t ignore that simply because they were there looking at furniture.

I invited them in…I have something illegal in my house…they can arrest me.

The issue here is believing that your phone is indeed comparable to your house simply because you own it. It’s more comparable to your house if you live in a neighborhood with a strict HOA. Hah.

Or keep the flooring installer in your example…she sees something illegal in your house and then reports it to the police. Either way, you invited someone in.

We’ll see what happens when it is implemented, as I don’t think anyone has a leg to stand on to stop it from happening. No law enforcement agency or government is involved in this hard coding, nor did any ask Apple to do this.
 
Looks like some Apple employees aren't comfortable with this idea either. I guess they have faux outrage, and don't know how the CSAM scanner 'really works'.

 
Or keep the flooring installer in your example…she sees something illegal in your house and then reports it to the police. Either way, you invited someone in.
But the whole point of @briko was that the flooring installer was invited in for your benefit! In other words, what service does the hash algorithm provide to you as a customer on your device? Moments, face scanning, etc. in the Photos app do; this new tech does not. And crucially, it was never meant to.


My question to you and others, what is the difference between on device scanning for dogs and cats that are hard coded into iOS (along with a slew of other “images”) versus hard coded images from the child porn database?
One is there to serve me, one exists to check that I don’t do anything wrong… where the definition of “wrong” is both time- and place-dependent.

A couple of questions in return: why do you think they feel the need to do this on device? And why, beyond “I have nothing to hide”, do you personally not care about this? Serious questions, I’m not trying to catch you out.
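In code terms, the distinction being drawn here is roughly the following (a toy sketch with hypothetical names, not Apple's implementation): both checks run on-device over the same photos, but one produces generic labels for your own searches, while the other asks whether a photo is one of a specific set of externally chosen images, with matches surfaced to Apple at upload.

```python
# Toy contrast of the two on-device checks being debated -- not Apple's code.

def classify_scene(photo) -> str:
    """Photos search / Moments: an ML model assigns generic labels so that
    *you* can search your own library. Nothing leaves the device."""
    return "cat"  # placeholder for an on-device classifier's output

CSAM_HASHES = {"a1b2c3"}  # specific known images chosen by an external party

def matches_known_image(photo_hash: str) -> bool:
    """CSAM detection: not 'what kind of thing is this?' but 'is this one of
    these exact images?' -- and a match is surfaced to Apple at upload."""
    return photo_hash in CSAM_HASHES
```

Both loops run over the same photos on the same device; the disagreement in this thread is about where the results go and whose interests they serve.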
 
So you are worried about the illegal content “search” versus the Spotlight “search”?

The issue with your prior example regarding the flooring installer and the police officer is that they are two different people.

If Apple is the police in this example, you’ve already invited them in. If I invite the police into my house to check out some furniture I have for sale and they happen to notice child pornography sitting on my table, they don’t ignore that simply because they were there looking at furniture.

I invited them in…I have something illegal in my house…they can arrest me.

The issue here is believing that your phone is indeed comparable to your house simply because you own it. It’s more comparable to your house if you live in a neighborhood with a strict HOA. Hah.

Or keep the flooring installer in your example…she sees something illegal in your house and then reports it to the police. Either way, you invited someone in.

We’ll see what happens when it is implemented, as I don’t think anyone has a leg to stand on to stop it from happening. No law enforcement agency or government is involved in this hard coding, nor did any ask Apple to do this.

So here I am, advocating for the presumption of innocence, and the first thing you do when reading my analogy is presume that the flooring installer would find something illegal in my house and report me to the police...

On top of that, now you are changing your argument. The whole reason you brought up "moments" and "Spotlight" was that if we are comfortable with those searches, then we should be comfortable with this CSAM scanning implementation as well.

My point still stands: you can consent to one without approving of the other. Especially when the former is a feature that benefits users, and the latter is designed to build a criminal case against users. You asked how they are different... and that is how they are different.

We could go down the rabbit hole of tweaking analogies all night if that interests you. For example: even people living in neighborhoods with a strict HOA have privacy rights...
 
But the whole point of @briko was that the flooring installer was invited in for your benefit! In other words, what service does the hash algorithm provide to you as a customer on your device? Moments, face scanning, etc. in the Photos app do; this new tech does not. And crucially, it was never meant to.



One is there to serve me, one exists to check that I don’t do anything wrong… where the definition of “wrong” is both time- and place-dependent.

A couple of questions in return: why do you think they feel the need to do this on device? And why, beyond “I have nothing to hide”, do you personally not care about this? Serious questions, I’m not trying to catch you out.

They are doing it on device because they already do it today…they are simply adding another data point to check. The other reason is security…and I would use that word more than privacy…but they seem to be throwing both out there at the same time.

People break into others’ iCloud accounts all of the time and, if they want to, can sift through tens of thousands of pictures (think of all of the celebrity nudes that have been released this way). But that is not very efficient and requires one to “look” at every pic.

By hard coding this on the phone, Apple will only be looking at new photos saved to the phone and marking those that are a match to the database images. But nothing happens unless those marked pics are uploaded via iCloud.

No one can see the tagging other than Apple…they never scan your photos off your device…no one would know anything as long as you keep your photos on your device and don’t share across devices…the chances of a false positive are so minute they’re not even worth worrying about…Apple controls the filtering, and it is hard coded into iOS, whereas anything happening in the cloud is easier to hack (still incredibly unlikely).
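To put rough numbers on "so minute": a back-of-envelope binomial tail, with purely hypothetical parameters (Apple's technical summary claims less than a one-in-one-trillion chance per year of incorrectly flagging a given account, but publishes no per-image false-match rate, so every number below is an assumption):

```python
from math import comb

p = 1e-6    # assumed per-image false-match rate (hypothetical)
n = 10_000  # photos an account uploads in a year (hypothetical)
t = 30      # assumed review threshold (hypothetical)

# P(at least t false matches among n photos). Terms far past t underflow to
# zero in float arithmetic, so summing a window beyond t is sufficient.
tail = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(t, t + 40))
print(f"P(innocent account crosses threshold) ~ {tail:.3e}")  # astronomically small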

I understand the disagreement about on device versus iCloud, but the way they get the data is irrelevant, since it only matters once it is in iCloud. It’s not about presumption of innocence, since there is a set data point being used. And the idea that this somehow adds a system that is easier to abuse as some sort of back door is ridiculous, since that door/system has been there for years.

I’m sure there will be lawsuits and we’ll see how it plays out, but I’m pretty sure Apple will prevail in this case.
 
I’ve had enough. I have reverted to iOS 14 and macOS 11 from the dev betas, and I will stay there until the hardware stops working. Also, I have stopped working on a macOS dev project and will just focus on command line stuff to get my work done. When my Apple hardware quits working I’ll move to a flip phone and Linux. You all have fun arguing.
 
What's to stop a government from searching your photos for guns, drugs or political memes instead of cats? That technology IS actually built into the iPhone right now, without iOS 15.

That tech *could* be used in other ways, but they don't use it that way, and we trust that they won't. Same exact thing here. We trust that Apple won't abuse this function.
 
I don’t think most people understand that as the meaning of “scan” in this context. Many people in this thread, with all the information at their fingertips, are still repeating disinformation about the process.
...but others are making a disingenuous distinction between "scanning" and "hashing" as if the former is scary but the latter is "nothing to see here, move along".

That's true even before you go and read up on Apple's "NeuralHash" and how it claims to cope with cropping/resizing, different image quality, etc.:

The main purpose of the hash is to ensure that identical and visually similar images result in the same hash,
(https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf)
That document is still full of doubletalk - like conflating "identical" and "visually similar" - but makes it pretty clear that the process is more like the ML/AI techniques (that some are trying to use as the definition of "scanning") than the common uses of hashes (for verifying that two files are literally identical) that most people are likely to encounter.
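The difference is easy to demonstrate with a toy "average hash", a far simpler cousin of the perceptual-hash family NeuralHash belongs to (illustration only, not Apple's algorithm): a small uniform brightness change leaves the perceptual hash untouched, while a cryptographic hash of the same bytes changes completely.

```python
import hashlib

def average_hash(pixels: list[list[int]]) -> str:
    """Toy perceptual hash: one bit per pixel, set if the pixel is brighter
    than the image mean. Real perceptual hashes are far more robust."""
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if v > mean else "0" for v in flat)

def crypto_hash(pixels: list[list[int]]) -> str:
    """Cryptographic hash of the raw bytes: any change flips it completely."""
    return hashlib.sha256(bytes(v for row in pixels for v in row)).hexdigest()

image    = [[10, 200], [30, 240]]                     # a tiny 2x2 "photo"
brighter = [[v + 10 for v in row] for row in image]   # same photo, brightened

print(average_hash(image) == average_hash(brighter))  # True: perceptual match
print(crypto_hash(image) == crypto_hash(brighter))    # False: bytes differ
```

NeuralHash aims at the same invariance property, but learned and much more robust to cropping, rescaling, and re-encoding, which is exactly why it behaves more like the ML techniques than like a file checksum.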
 
...but others are making a disingenuous distinction between "scanning" and "hashing" as if the former is scary but the latter is "nothing to see here, move along".

That's true even before you go and read up on Apple's "NeuralHash" and how it claims to cope with cropping/resizing, different image quality, etc.:


That document is still full of doubletalk - like conflating "identical" and "visually similar" - but makes it pretty clear that the process is more like the ML/AI techniques (that some are trying to use as the definition of "scanning") than the common uses of hashes (for verifying that two files are literally identical) that most people are likely to encounter.
You are reading too much into those words...it is quite clear that they are talking about the SAME IMAGE, just slightly modified by cropping, color levels, etc.

They are NOT talking about your private sex pic being so close in size, position, color, etc. to one of the online images. And as they clearly state, it takes more than one image like that to be "similar" PER ACCOUNT. The odds of that happening are so astronomically low, it is not even worth worrying about.
 
You are reading too much into those words...it is quite clear that they are talking about the SAME IMAGE, just slightly modified by cropping, color levels, etc.

They are NOT talking about your private sex pic being so close in size, position, color, etc. to one of the online images. And as they clearly state, it takes more than one image like that to be "similar" PER ACCOUNT. The odds of that happening are so astronomically low, it is not even worth worrying about.
Exactly. If a photo is slightly cropped, it should still result in a match. I don't know about you, but I have nothing to be afraid of.

Also, if someone were to hack into my iCloud account and plant illegal photos there, then I wouldn't be safe either way.
 