Worth noting that your iPhone has been scanning your photos for content for years now: if I do a search of my photo library for "dog", I get back a list of photos with dogs in them. When I identify a face in a photo with a person's name, I get offered a list of other photos that might be that same person. But I'm pretty sure these results are not sent to Apple: if I do the same searches on my iMac, I get different results. I do use iCloud Photo Library, so there is a mechanism there that could be used to sync the results, but I don't think they are synced; I think it's all confined to the device.
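For the curious, this kind of on-device classification is exposed through Apple's Vision framework. Here's a minimal sketch (my own illustration against a recent SDK, not Apple's actual Photos code) of labelling an image entirely locally:

```swift
import CoreGraphics
import Vision

// Vision's built-in classifier labels an image ("dog", "beach", ...)
// entirely on-device; no image data leaves the phone.
func labels(for image: CGImage) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    // Keep only reasonably confident labels.
    return (request.results ?? [])
        .filter { $0.confidence > 0.5 }
        .map { $0.identifier }
}
```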
 
can you give an example of Apple privacy features that didn’t stay private?
I can give you plenty of incidents where Apple has been fined for deliberately misleading customers, including deliberately slowing down iPhones. Thus far privacy, although often quoted, has had few actual features (Safari's ability to manage website data being one), so asking for a privacy feature that only just surfaced and didn't stay private is a bit academic. But I have no doubt that should they implement this SURVEILLANCE there will be quite a few cases, no doubt with the flames fanned by Epic, Facebook, Elon etc., giving them a wonderful comeback from a situation that need not be. Apple was, however, fined for breaching privacy by collecting location data that was meant to be private, among a multitude of other fines. And in a court of law you don't have to cite an incident where a specific similar event resulted in illegal activity; demonstrating by precedent that an organisation or company has broken the rules before can render it potentially unreliable in the court's eyes.

Also plenty of fines
 
Basically, Apple is saying "Trust us, we're the good guys. We know what's good for you".

Sadly, most consumers are either ignorant of the surveillance of their information/activities by corporations, or they willingly accept it because the conveniences provided by these corporations are just too important to their daily lives. Either way, as long as consumers keep paying, corporations like Apple will do as they please.
 
They've got us hooked on the ecosystem and now starts the about-turn. They'll be launching an opt-in ad service next!
 
I agree with you 100%. However, Apple is not obligated to scan our iPhones. There has to be an alternative way to fight against child pornography.

Privacy is being exposed to its fullest. Especially now that a third party will be involved. Imagine if the information/photos get leaked by a third party? Who's held responsible for that?

How can they be leaked?

You receive a photo in a messenger client. The app calls an API and the photo is analysed. The API returns a value indicating if it contains too much nudity. The app shows a warning.

I can't really see where a leak could occur which couldn't also occur without such a system.
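To make that concrete, here's a hypothetical sketch of the flow described above. `NudityClassifier` and `score(for:)` are invented names standing in for an on-device ML model; nothing in this flow is sent anywhere.

```swift
import Foundation

// Invented stand-in for an on-device model; purely illustrative.
struct NudityClassifier {
    /// Returns a value in 0...1 estimating how much nudity the image contains.
    func score(for imageData: Data) -> Double {
        // ...run the on-device model; stubbed out for illustration...
        return 0.0
    }
}

func handleIncomingPhoto(_ imageData: Data, threshold: Double = 0.8) {
    if NudityClassifier().score(for: imageData) >= threshold {
        // The app blurs the photo and shows a warning; no report is sent.
        print("This photo may contain sensitive content.")
    }
}
```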
 
That said, the hash can be used to reconstruct the original image, to the extent that's possible. Not all features are preserved, but the critical ones likely will be.

No, hash functions are one-way. Maybe it's possible to create a hash function which is reversible, but I have never seen one.

This is a SHA-256 hash:
3a42c503953909637f78dd8c99b3b85ddde362415585afc11901bdefe8349102

How are you going to get any information from that?
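For anyone who wants to try it themselves, Apple's CryptoKit will compute one of these digests; note that there is no function going the other way.

```swift
import CryptoKit
import Foundation

// A cryptographic hash is one-way: trivial to compute, infeasible to invert.
let digest = SHA256.hash(data: Data("hello".utf8))
let hex = digest.map { String(format: "%02x", $0) }.joined()
print(hex)
// Changing even one character of the input yields a completely different
// digest, and nothing about the input can be recovered from the output.
```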
 
The people who will be subjected to this are the ones who have nothing to hide and just want convenience. Then it will be useful as a mass surveillance tool where government X can insert items into the CSAM database and find out who among the population has those images on their phone.

Except the CSAM detection system is poor at what oppressive governments want.

Let's look at the CSAM detection system and the system in the Photos app which does face, object and scene recognition.

Which system is best for finding:
A. Photos of (illegal) protests? Photos app
B. Different kinds of pictures of Winnie the Pooh? Photos app
C. The one iconic picture of Tank Man? CSAM detection
D. All kinds of photos containing criminals? Photos app
E. All kinds of photos containing opposition leaders? Photos app
F. Manipulated photos, pictures and memes making fun of the leaders of the government? Photos app
G. Any kind of pornography? The new feature in Messages, or the Photos app
H. Drugs in the photo? Photos app
etc.

You're worrying about the wrong system.
 
E.g., in a time not far from now: Trump Junior, the new president of the United States, signs a few new executive orders demanding that Apple flag photos of oranges 🍊 and hand over details so a few punishments can be carried out against these state rebels.

The CSAM detection system would be extremely bad for doing that. It can't recognise anything.

The algorithms already in the Photos app do this. Try searching for sushi in the Photos app and it will show you pictures of sushi.
 
I think the scanning of text on iMessage will come before the end of next year. I think Apple is also going to integrate some kind of safety monitoring with FaceTime when the app is turned on.

Apple already does scan all the iMessages you receive and has for many years.

Let's say someone sends you a URL, Apple scans the iMessage, finds the URL, does a lookup and provides an icon/picture for that URL based on the content of the URL.

Apple couldn't do that without scanning. And it happens locally on your device. All the time.
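That lookup is exactly what Apple's LinkPresentation framework provides. A rough sketch of the receive-and-preview flow, using real APIs (my illustration, not Messages' actual code):

```swift
import Foundation
import LinkPresentation

// NSDataDetector finds a URL in the message text (the "scanning" step,
// done locally), then LPMetadataProvider fetches the page's title/icon
// to draw the preview bubble (the lookup step, which hits the network).
func previewLink(in messageText: String) {
    guard let detector = try? NSDataDetector(types: NSTextCheckingResult.CheckingType.link.rawValue) else { return }
    let range = NSRange(messageText.startIndex..., in: messageText)
    guard let url = detector.firstMatch(in: messageText, options: [], range: range)?.url else { return }

    LPMetadataProvider().startFetchingMetadata(for: url) { metadata, _ in
        if let title = metadata?.title {
            print("Preview: \(title)")  // used to render the link bubble
        }
    }
}
```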
 
Do not equate a human "seeing" the photo with surveillance. Surveillance and privacy invasion occurs when ANY algorithm assesses the data, looking for content. THAT is the search.

But it isn't looking for specific content, just trying to determine whether or not the photos are the same as known images.

Why are you OK with the Photos app doing scanning and analysing of every photo in there?
 
Simply by providing a database full of "alternative" hashed files to search against rather than the CSAM ones.

At the end of the day it's just a database, the parameters can be tweaked based on the legal requirements of that country. And yes, there will be requirements cooked up.

But why use the cumbersome CSAM detection tool when the Photos app already contains a feature which does this much better?

Why wouldn't governments force Apple to use the most effective algorithms?
 
The problem is, Apple owns the software but they do not own the phone once sold. I imagine a judge is not going to be okay green-lighting software manufacturers installing surveillance software on cellphones.

No problem as long as you accept it.

What do you think anti-virus, firewall and parental control software is?
 
The problem is I do not want to trade it in for another Apple device or a gift card at the Apple Store, since I do not want their products if they think it is okay to install surveillance software on their devices. Not to mention this seriously alters what they advertised, and the TOS, from when I originally bought my device. I seriously suspect Apple is underestimating the legal issues they are going to have with this.

If you read EULA or TOS you'll find that Apple reserves the right to change it unilaterally.
 
The CSAM detection system would be extremely bad for doing that. It can't recognise anything.

The algorithms already in the Photos app do this. Try searching for sushi in the Photos app and it will show you pictures of sushi.
Wow, magically an algorithm designed to identify an object based on one input is bad at identifying an orange based on another input. Both search for patterns in the image. The difference is that the CSAM pattern is encrypted, while the Photos app deals with unencrypted data and doesn't involve the security tokens that CSAM does. I really don't see much difference.
 
Leaving aside the rights and wrongs of doing that, they’ve slipped up on the implementation by storing known “bad” hashes on the device. It’s not sustainable because that list will get larger with every iOS update. Ultimately, the hash list will be larger than iOS itself. I’m at a loss to explain why they don’t just pass the computed hash to a web service to get the yay/nay from Apple.

I've read the database contains 200,000 pictures, and that's after 20 years.
Also, there is no need for Apple to carry everything if it gets too big. They can drop photos which are rarely distributed.

Assuming 200,000 photos and a 1024-bit hash (no idea how big it actually is), that's 200,000 × 128 bytes ≈ 25.6 MB.
 
What it is designed to do and what it does can be worlds apart. I'm sure it was done with the best intentions, as Tim Cook has long been an advocate of ensuring child safety, but on this one it's done interminable damage to Apple.

Whatever the blurb says, it is a form of SURVEILLANCE, it is an intrusion on PRIVACY, two of the most important aspects that Apple has sold itself on safeguarding.

So how do you feel about the Photos app which also scans every photo you have and does resource intensive analysis of them?
 
The photos are scanned as a prerequisite to being uploaded to iCloud. So everything you just typed is an outright falsehood.

The closest analogy I can think of is being asked to show your ID before you enter a nightclub. Hardly an unreasonable request.
Let's see who's typing falsehoods:

1. Instead of scanning images in the cloud (as Google Drive, MS and Dropbox do), the system performs an on-device search before photos are uploaded to iCloud.

2. All images are scanned and compared to the database, but the results of the scan (a hash match or not) are only sent along with the photo when it's uploaded to iCloud Photos.

3. You can't opt out of this routine search; even if you disable iCloud Photos, the on-device search still happens.

4. Can the CSAM system be used to scan for other image types?
Apple says the system is only designed to scan for CSAM images. However, Apple could theoretically augment the hash list to look for known images related to other things.

While many other services scan for CSAM images, Apple's system is unique in that it uses on-device matching rather than scanning images uploaded to the cloud.
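Based on Apple's published description, a heavily simplified, hypothetical sketch of points 1 and 2 might look like this. `perceptualHash(of:)` is a stand-in for Apple's NeuralHash; the real system also blinds the database and wraps match results in cryptographic "safety vouchers", all omitted here.

```swift
import Foundation

struct CSAMDatabase {
    let knownHashes: Set<Data>  // hashes of known CSAM images, shipped on device

    func matches(_ hash: Data) -> Bool { knownHashes.contains(hash) }
}

// A perceptual hash maps visually similar images to the same value,
// unlike SHA-256, which changes completely on any edit.
func perceptualHash(of imageData: Data) -> Data {
    // ...neural-network hashing; stubbed out for illustration...
    return Data()
}

// Runs on device; the match result travels with the photo only when
// (and if) the photo is uploaded to iCloud Photos.
func voucher(for imageData: Data, database: CSAMDatabase) -> (photo: Data, matched: Bool) {
    (photo: imageData, matched: database.matches(perceptualHash(of: imageData)))
}
```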
 
Here's where you are wrong. It's a snitch on your phone. It scans (hashes) your images and then decides what to do.
If it decides you have incriminating images in your possession, it "informs" Apple. Nothing stays on your phone then.

Only if you are using iCloud Photo Library. Even if they weren't hashing locally, they could do it in the cloud. To me it doesn't matter whether they do it on device or in the cloud, since a scan is going to happen anyway if I use iCloud Photo Library.
 
I am assuming that even with iCloud Photos off, the pictures are hashed on your local device and just not compared to the list until you upload to iCloud. But that can easily change with a simple software update, since the software is on your device, not on the iCloud server.

Can you be 100% sure? Apple only uses the word "before", which certainly is open to interpretation.
 
Let's see who's typing falsehoods:

1. Instead of scanning images in the cloud (as Google Drive, MS and Dropbox do), the system performs an on-device search before photos are uploaded to iCloud.

2. All images are scanned and compared to the database, but the results of the scan (a hash match or not) are only sent along with the photo when it's uploaded to iCloud Photos.

3. You can't opt out of this routine search; even if you disable iCloud Photos, the on-device search still happens.

4. Can the CSAM system be used to scan for other image types?
Apple says the system is only designed to scan for CSAM images. However, Apple could theoretically augment the hash list to look for known images related to other things.

While many other services scan for CSAM images, Apple's system is unique in that it uses on-device matching rather than scanning images uploaded to the cloud.


Apple has had the capability to reach into your iPhone all the time since its very inception.

One example is the ability to stop from running, or delete, every app on every iPhone in the world at any time.
For a long time, every app could get a list of the other apps installed on an iPhone, letting third parties reach into the iPhone.
Apple has the ability to stop every version of iOS from running on any iPhone in the world at any time.
Every setup of a phone requires contacting Apple's servers.
Apple also logs lots of information locally today, which you can't stop.
 
Let's see who's typing falsehoods:

1. Instead of scanning images in the cloud (as Google Drive, MS and Dropbox do), the system performs an on-device search before photos are uploaded to iCloud.

2. All images are scanned and compared to the database, but the results of the scan (a hash match or not) are only sent along with the photo when it's uploaded to iCloud Photos.


(1) directly contradicts Apple’s own FAQ.

Does this mean Apple is going to scan all the photos stored on my iPhone?

No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.

I am subscribed to Stratechery myself, but it's clear that Ben Thompson has no idea what he is talking about when it comes to covering Apple-related news.

Update - this is further clarified in an interview with TechCrunch.

 
Nobody at Apple is looking through the photos on your iPhone either. That’s what this technology is all about. It’s happening on your phone, and it stays on your phone.
I don't want anything analyzing or nosing through the photos on my device. If I upload them onto a server that I don't own? Fair game. But the photos stored on the hard drive of a device I own? No thanks. That's just too far for me.
 
I've read the database contains 200,000 pictures, and that's after 20 years.
Also, there is no need for Apple to carry everything if it gets too big. They can drop photos which are rarely distributed.

Assuming 200,000 photos and a 1024-bit hash (no idea how big it actually is), that's 200,000 × 128 bytes ≈ 25.6 MB.
Source, please. I read that it’s now up to several million photos.
 
I don't want anything analyzing or nosing through the photos on my device. If I upload them onto a server that I don't own? Fair game. But the photos stored on the hard drive of a device I own? No thanks. That's just too far for me.
Bad news, Photos has been analysing the content of your pictures for years. You’ve been standing on the slippery slope all this time.
 