That is true. I would look into seeing if creating a shortcut is possible to make it automated for you.

Good idea. I'd have to update that folder every once in a while though.
I’m not okay with warrantless searching on people that are not presumed to have committed a crime.
PLOT TWIST: They already do this today!! You’ve been consenting for practically as long as you’ve had an iPhone.
Provide a citation
From Apple’s own support page…and it has been doing this for about 5 years now (just better now obviously as tech has improved).
Moments: Search for an event, like a concert you attended or a trip you took. Photos uses the time and location of your photos along with online event listings to find matching photos.
People: Find photos in your library of a specific person or a group of people. Just keep names and faces organized in your People album.
Places: See your photos and videos on a map in the Places section. Or type a location name in the Search bar to see photos and videos from that place.
Categories: Photos recognizes scenes, objects, and types of locations. Search for a term like "lake" and select a result to see photos that match.
The Search tab also suggests moments, people, places, categories, and groups for you to search. Tap a suggested search, such as One Year Ago or Animals, to explore your photos.
When you search your photos, the face recognition, and scene and object detection are done completely on your device. Learn more about photos and your privacy.
The last part is basically what they are doing now as well. They have on-device parameters that recognize objects in your photos. No different than the on-device database hashes they will be adding.
The difference? The older data points just help when you are searching for the stuff listed above. The new one specifically matches photos against the database hashes and then flags them when uploaded to iCloud. Have enough of them in your library and Apple checks to verify. If you uploaded known child porn to your iCloud account, the authorities are contacted.
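The flag-on-threshold behavior described above can be sketched in a few lines. This is a hypothetical illustration only, not Apple's implementation — the hash values and the threshold value are made up:

```python
# Hypothetical sketch of threshold-based hash matching, NOT Apple's actual code.
# Photo "hashes" are compared against a known set, and an account is only
# flagged once the number of matches crosses a threshold.
KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in for the CSAM hash database
MATCH_THRESHOLD = 2  # Apple has said a threshold applies; the value here is assumed

def matches(photo_hashes):
    """Count how many uploaded photo hashes appear in the known set."""
    return sum(1 for h in photo_hashes if h in KNOWN_HASHES)

def account_flagged(photo_hashes):
    """Only flag the account once the match count crosses the threshold."""
    return matches(photo_hashes) >= MATCH_THRESHOLD

print(account_flagged(["a1b2", "zzzz"]))          # one match -> False
print(account_flagged(["a1b2", "c3d4", "zzzz"]))  # two matches -> True
```

The point of the threshold is that a single accidental match is not enough to trigger anything; only repeated matches per account do.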
So, again, why wasn’t anyone freaking out about this existing (for 5 years now) “back door” way of scanning images on your phone? What would have stopped them previously from adding the ability to recognize “trump” or “pink triangles” or anything else they could think of? Why do all of these lame conspiracy theories have more validation now?
And still, why would the government even NEED this?? There are easier ways to scan your phone, and especially iCloud, by hacking into them, instead of involving hard-coded data being added to a phone by Apple (not the government…Apple does it!)
EDIT: When you click on the privacy link:
Photos lets you choose who has the full picture.
The Photos app uses machine learning to organize photos right on your device. So you don’t need to share them with Apple or anyone else.
More about Photos
Your photo and video albums are full of precious moments, friends, and your favorite things. Apple devices are designed to give you control over those memories.
Photos is also designed so that the face recognition and scene and object detection — which power features like For You, Memories, Sharing Suggestions, and the People album — happen on device instead of in the cloud. In fact, the A13 and A14 Bionic chips perform over 100 billion operations per photo to recognize faces and places without ever leaving your device. And when apps request access to your photos, you can share just the images you want — not your entire library.
I’m convinced that you and others just choose not to read the facts about this tech and that it has in reality been on your phone in one form or another for 5 years now.
I'm sorry, but "this tech" has never been on the iPhone before.
iCloud Photos is separate from iCloud device backup. I don't think the device backup includes the photos and other components that are separately stored in iCloud. iCloud photos are encrypted but Apple has a key. I'm not sure about iCloud device backup being encrypted, but if it is, I'm sure Apple also has a key for it. iCloud messages are encrypted end to end, so Apple does not have a key for them per my understanding.

Are iCloud photos able to be decrypted if iCloud Backup is set to OFF? The way I've always understood things like Photos in iCloud and Messages in iCloud is that if your device is set to having backups OFF while the other two mentioned are ON, then there really isn't a way to decrypt those, seeing as they aren't accompanied by an iCloud backup that can be decrypted.
Sorry, but I feel you are a bit off kilter on this and keep drifting from the point. Maybe you missed it.
Client-side searching has been around for a long time. I seriously doubt Apple is doing this today. They are reporting far too few CSAM violations for that. All they apparently do today is some Mail scans and other checks when issued a subpoena. The function to take a face, a leaf, a flower and do a "Google" search is far different from this. I suspect you are quite well aware of this.
It isn’t the solution itself, rather: why do it client side? Despite the potential for misuse, there are easier methods to accomplish this. After all I have reviewed and learned, that single piece stands out. It stands out to a lot of others also.
They have been doing that for 2 years…another article listed here that shows that.
If the "tech was already on the phone", why is Apple even talking about this, right?
They'd just be "doing it."
I think you're going in circles with that other user honestly.
He/She is off on basic facts here
Can you point that link out? It would be appreciated.
Client-side is not listed in any document I read, including Apple’s. At least not that I am aware of.
I think it’s in this thread somewhere. Link to an article where they found that Apple has been scanning iCloud for the images since 2019.
You've completely lost the plot. And you are being called out for it.
You can't link a credible source, because you aren't making credible claims.
We are talking about on-device scanning, which has been established many times already. And without a warrant.
What…is…the….difference????

There isn't one; the fake outrage is hilarious - people who have no idea what's been happening on their device for years and haven't bothered to read the EULAs and Privacy Policy.
I’m only repeating what someone else posted (and I read the article) linked in this thread that showed Apple was indeed scanning iCloud for the past two years for images.
They have been doing on device scanning in one form or another for 5 years.
I copied what they scan for now right off their own website.
My question to you and others, what is the difference between on device scanning for dogs and cats that are hard coded into iOS (along with a slew of other “images”) versus hard coded images from the child porn database?
The answer of course is nothing…other than the subject matter of the hashed images.
If your main concern is them scanning your personal images on your device, why are you upset now and not when they were doing it before?
Why couldn’t some government upload other images into the hashed dog and cat database they have hard-coded in, just as easily as into the ones they will be adding in iOS 15?
What…is…the….difference????
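For what it's worth, the structural difference being argued over here can be sketched in code. Everything below is hypothetical (the names, labels, and hashes are made up), but it shows the two mechanisms side by side: on-device ML search produces labels for the user's own queries, while hash matching tests photos against a fixed, opaque list:

```python
# Hypothetical contrast between the two mechanisms the thread compares.
# All names and data are made up for illustration.

def scene_classifier(photo):
    """On-device ML search: answers 'what is in this photo?' for the user."""
    labels = {"photo_of_rex": ["dog"], "photo_of_lake": ["lake"]}  # stand-in for a neural net
    return labels.get(photo, [])

CSAM_HASH_SET = {"deadbeef"}  # opaque hashes; the device can't see what they depict

def csam_match(photo_hash):
    """Hash matching: answers 'is this photo on a fixed list?'"""
    return photo_hash in CSAM_HASH_SET

print(scene_classifier("photo_of_rex"))  # ['dog'] -- result stays on device
print(csam_match("deadbeef"))            # True -- a match can trigger reporting
```

One mechanism is open-ended and serves the user's searches; the other is a membership test against a list the user cannot inspect. Whether that distinction matters is exactly what the thread is debating.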
They also haven't bothered to educate themselves properly on how the CSAM tech works.
And no, using the "moments" feature is not the same thing. I consent to that image analysis, and it is done for my benefit. It is not a surveillance feature that phones home to report citizens to the authorities for criminal behavior.
So you are worried about the illegal content “search” versus the Spotlight “search?”
The issue with your prior example regarding the carpet cleaner and the police officer is that they are two different people.
If Apple is the police in this example, you’ve already invited them in. If I invite the police in my house to check out some furniture I have for sale and they happen to notice child pornography sitting on my table, they don’t ignore that simply because they were there looking at furniture.
I invited them in…I have something illegal in my house…they can arrest me.
The issue here is believing that your phone is indeed comparable to your house simply because you own it. It’s more comparable to your house if you live in a neighborhood with a strict HOA. Hah.
Or keep the carpet cleaner in your example…they see something illegal in your house and then report it to the police. Either way, you invited someone in.
We’ll see what happens when implemented, as I don’t think anyone has a leg to stand on to stop it from happening. No law enforcement or government is involved in this hard coding or even asking Apple to do this.
But the whole point of @briko was that the carpenter was invited in to your benefit! In other words, what service does the hash algorithm do you as a customer on your device? Moments, face scanning, etc in the photos app does, yet this new tech does not. And crucially, it was never meant to be.
One is there to serve me, one exists to check that I don’t do anything wrong… where the definition of “wrong” is both time- and place-dependent.
A couple of questions in return: why do you think they feel the need to do this on device? And why, beyond “I have nothing to hide”, do you personally not care about this? Serious questions, I’m not trying to catch you out.
...but others are making a disingenuous distinction between "scanning" and "hashing" as if the former is scary but the latter is "nothing to see here, move along".

I don’t think most people understand that as the meaning of scan in this context. Many people in this thread, with all the information at their fingertips, are still repeating disinformation about the process.
That document is still full of doubletalk - like conflating "identical" and "visually similar" - but makes it pretty clear that the process is more like the ML/AI techniques (that some are trying to use as the definition of "scanning") than the common uses of hashes (for verifying that two files are literally identical) that most people are likely to encounter.

"The main purpose of the hash is to ensure that identical and visually similar images result in the same hash,"
(https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf)
You are reading too much into those words...it is quite clear that they are talking about the SAME IMAGE, just slightly modified by cropping, color levels, etc....but others are making a disingenuous distinction between "scanning" and "hashing" as if the former is scary but the latter is "nothing to see here, move along".
That's true even before you go and read up on Apple "NeuralHash" and how it claims to cope with cropping/resizing, different image quality etc:
That document is still full of doubletalk - like conflating "identical" and "visually similar" - but makes it pretty clear that the process is more like the ML/AI techniques (that some are trying to use as the definition of "scanning") than the common uses of hashes (for verifying that two files are literally identical) that most people are likely to encounter.
You are reading too much into those words...it is quite clear that they are talking about the SAME IMAGE, just slightly modified by cropping, color levels, etc.

Exactly. If a photo is slightly cropped, it should still result in a match. I don't know about you, but I have nothing to be afraid of.
They are NOT talking about your private sex pic being so close in size, position, color, etc. to one of the online images. And as they clearly state, it takes more than one image like that to be "similar" PER ACCOUNT. The odds of that happening are so astronomically low, it is not even worth worrying about.
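The "identical and visually similar images result in the same hash" idea can be illustrated with a toy perceptual hash. This is a deliberate oversimplification (an "average hash" over a handful of pixels), far simpler than NeuralHash, but it shows why a perceptual hash survives small edits while a cryptographic hash does not:

```python
import hashlib

# Toy perceptual hash ("average hash") versus a cryptographic hash.
# A simplification for illustration, NOT Apple's NeuralHash.

def average_hash(pixels):
    """1 bit per pixel: is the pixel brighter than the image's mean?"""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Number of bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

original = [10, 200, 30, 220, 15, 210, 25, 230]  # tiny grayscale "image"
brighter = [p + 5 for p in original]             # same image, slightly brightened

# The perceptual hashes stay identical after the small edit...
print(hamming(average_hash(original), average_hash(brighter)))  # 0

# ...while cryptographic hashes change completely on any change at all.
print(hashlib.sha256(bytes(original)).hexdigest() ==
      hashlib.sha256(bytes(brighter)).hexdigest())  # False
```

That robustness to cropping, resizing, and color shifts is the whole point of a perceptual hash, and it is also why "hash" in this context is closer to the ML techniques discussed above than to a file checksum.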