We might agree to disagree on that. I'm not saying it's right, but we already give up a level of privacy when we upload to another company's servers. I'm ok with that for backup/organization reasons. I trust it won't be abused - and if it is, there will be a mass exodus.

I am not ok with any false positive in the realm of CSAM - even if it takes multiple to get me reviewed by a human at Apple (which is a whole other can of worms).
Actually, you are OK with it, since you accepted their terms of service.
 
Why would it be illegal to integrate checks in the procedure to upload photos to Apple’s servers, especially if they’re targeted at illegal content, and done in a privacy friendly way where no data leaves your device?
Because history shows us that what appears to be a privacy-friendly way rarely stays that way.
Likewise, even Apple's own blurb constantly refers to 'designed not to' etc., which means nothing. How many things 'designed not to' end up doing just what they were designed not to do?

Apple, you really have shot yourselves in both feet with this idea, however great the idea of safeguarding kids is.
 
I've been measuring my own reaction to all of this. I do not believe this is the end of the world, because I think Apple has done as well as it can to do this and still maintain your privacy. Apple is NOT "scanning your photos." On the Message feature, the iPhone is doing a scan of images sent to and from an affected account (child under 13, part of a Family Plan, feature opted in), but the scan is happening on the phone itself and not on Apple servers. And even the alert that goes to the parent, if it goes via iMessage, is encrypted, and Apple itself couldn't see its content to know it's an alert.

The iCloud Photo Library thing is, I think, a creative way to do something lawmakers and many users (I have no idea what percentage, YMMV what "many" means) are asking for: help in the fight against child pornography. It doesn't scan your photos for content, it does everything on the phone anyway and not on a server.

It actually reminds me a lot of the contact-tracing infrastructure Apple and Google put together: neither company would be able to figure out anything about you in this context; it's all kept on-phone and anonymous.

The weak link is what some are reacting to: the potential ability to turn this feature to point at a database of photos that isn't the CSAM database, so that governments or malicious actors (I'll let you determine for yourself whether those are the same thing) can leverage it to look for other content. But that still raises the question of how difficult that would be. How abstracted is the list the iPhone is checking against? Is it coded directly into the phone, hard-linked to the CSAM database? Is it accepting a generic "list" sent by Apple? Would it require an iOS update? I don't know.
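For what it's worth, the matching flow being debated can be sketched in a few lines. Everything here is hypothetical (the names, the threshold, the plaintext hash set are invented for illustration); Apple's actual system reportedly uses NeuralHash with a cryptographic private-set-intersection protocol, not a simple lookup like this. The point is only that whoever supplies the list decides what gets matched:

```python
# Hypothetical sketch of on-device matching with a review threshold.
# KNOWN_HASHES stands in for "the list the iPhone is checking against";
# the whole debate is about who controls its contents.

KNOWN_HASHES = {0xDEADBEEF, 0xCAFEF00D, 0x8BADF00D}  # illustrative stand-in
REVIEW_THRESHOLD = 3  # illustrative: matches required before human review

def matches_for_upload(photo_hashes, known=KNOWN_HASHES):
    """Count how many uploaded fingerprints appear in the known list."""
    return sum(1 for h in photo_hashes if h in known)

def should_flag(photo_hashes):
    """An account is flagged only once the match threshold is crossed."""
    return matches_for_upload(photo_hashes) >= REVIEW_THRESHOLD

print(should_flag([0xDEADBEEF, 0x12345678]))              # False: 1 match, below threshold
print(should_flag([0xDEADBEEF, 0xCAFEF00D, 0x8BADF00D]))  # True: threshold reached
```

Swap in a different `KNOWN_HASHES` set and the same mechanism flags different content, which is exactly the policy-vs-technology worry above.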

But what does bother me is this: this technological mechanism to, let's say, "evaluate" and not "scan" your photos didn't exist, and now it does. Apple says its answer to governments that want to use it differently is a flat "no," and I believe that is their intent. But prevention has shifted from a matter of TECHNOLOGY ("you can't do that because the system as designed doesn't allow it") to one of POLICY ("you can't do that because we say so"). That is a step in the wrong direction. How big a step? That's beyond my experience to evaluate.


EDIT:

Though this post is now an eternity old in the time span of these things, it's still a pretty complete expression of my feelings about this, and I wanted to update it to reflect further conversations I've had on this thread.

(Disclosure: If you find it valuable to consider how many likes or dislikes a post gets, the first four reactions to this post came before this edit)

Someone pointed out to me that no matter how you conceive it, your photos ARE being scanned...because the phone has to do so to generate the fingerprint hash in the first place. This is a fair point, but I would also point out two things:

(1) Whatever the process is for determining a new photo's fingerprint, that process certainly outputs the hash. But I do NOT know whether that same process outputs any further AI evaluation of the content of the photo. It's possible that the act of creating the hash does nothing more than that, and doesn't examine the photo, "scanning" it in the way I think most people mean. I do not know; perhaps someone else here can enlighten me.
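For intuition, here is a toy sketch of how a perceptual "fingerprint" can be computed without any semantic understanding of the image. This is a generic difference hash (dHash), NOT Apple's NeuralHash (which uses a neural network to survive resizing and re-encoding); the pixel grids and names are invented for illustration:

```python
# Toy perceptual fingerprint (dHash): each bit records whether a pixel is
# brighter than its right neighbor, capturing gradient structure rather
# than exact bytes. No object recognition happens anywhere in this process.

def dhash(pixels):
    """Compute a 64-bit difference hash from an 8x9 grid of grayscale values."""
    bits = 0
    for row in pixels:                         # 8 rows
        for left, right in zip(row, row[1:]):  # 9 columns -> 8 comparisons
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance means 'probably the same image'."""
    return bin(a ^ b).count("1")

# Two "images": the second is a uniformly brighter copy of the first.
img1 = [[(r * 37 + c * 53) % 200 for c in range(9)] for r in range(8)]
img2 = [[v + 1 for v in row] for row in img1]

h1, h2 = dhash(img1), dhash(img2)
print(hamming(h1, h2))  # → 0: the fingerprint survives a uniform brightness shift
```

Note that nothing in this computation "knows" what the photo depicts; it only produces a number that can later be compared against other numbers.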

(2) The iPhone is already scanning your photos for content, and has been for years, whether or not you use iCloud Photo Library. If I go to the photo library on my phone and search for "tree," I get photos in my library that contain trees. It's not a very extensive list: my library contains over 15,000 photos, and searching for "tree" got me 182 results. As I take many photos of birds, I expect I have many more than 182 photos with trees in them. So clearly it's not a very comprehensive scan, but that's a different topic. The same search on my Mac library (exactly the same library) yields a different number of results, so I infer that this search, as well, is done on the phone and doesn't leave it.

"People" is another example. Once I've identified a face, then Photos will suggest other photos with the same face. It's taking a scan of a photo and then applying the results of that scan to other photos! Again, searching for the same "Person" on my Mac yields a different number of results, because I think that data remains on the phone and is not sent to Apple.

There might be those that question whether this distinction even matters: Apple makes the phone, so Apple is scanning the photos. But I do think that this is a real distinction. If data isn't sent to Apple's servers, then Apple doesn't have the data.
 
Because history shows us that what appears to be a privacy-friendly way rarely stays that way.
Likewise, even Apple's own blurb constantly refers to 'designed not to' etc., which means nothing. How many things 'designed not to' end up doing just what they were designed not to do?

Apple, you really have shot yourselves in both feet with this idea, however great the idea of safeguarding kids is.
And their implementation is privacy friendly, whereas others (Google and co.) scan your photos on their servers without warning you at all.
 
Apple *is not* scanning your photos.
That’s not how it works.
There's a one in a trillion chance that anyone from Apple will ever see any of your images.
Do not equate a human "seeing" the photo with surveillance. Surveillance and privacy invasion occurs when ANY algorithm assesses the data, looking for content. THAT is the search.
 
But what does bother me is this: this technological mechanism to, let's say, "evaluate" and not "scan" your photos didn't exist, and now it does. Apple says its answer to governments that want to use it differently is a flat "no," and I believe that is their intent. But prevention has shifted from a matter of TECHNOLOGY ("you can't do that because the system as designed doesn't allow it") to one of POLICY ("you can't do that because we say so"). That is a step in the wrong direction. How big a step?

It's a really really big step and a huge mistake to be making.

I don't even know why they talked about "saying no to Governments" - they know damned well that they'll have no choice if a Government wants to force their hand. They're literally not able to keep their promise of a flat "no" there.
 
Have their TOS been updated to include this topic?
Not for now, but if you click Accept during the update to iOS 15, it will be, and you'll be saying "yes, I accept this scan." And read the notes for any other update, like 14.8, because it could be in there too.
It's a really really big step and a huge mistake to be making.

I don't even know why they talked about "saying no to Governments" - they know damned well that they'll have no choice if a Government wants to force their hand. They're literally not able to keep their promise of a flat "no" there.
Go tell that to the FBI, who failed to force them to open a breach in iOS.
 
Not for now, but if you click Accept during the update to iOS 15, it will be, and you'll be saying "yes, I accept this scan." And read the notes for any other update, like 14.8, because it could be in there too.

You then get into the rights of the owner of the physical device vs the owner of the software.
 
‘Steve’s vision’ contributed to many clangers during his tenure at Apple.

Macintosh 128k? In making it an appliance, it lacked many features that would have otherwise helped make it attractive to large industries. It wasn’t until he left Apple that they changed it from an underpowered toy to a creative asset.

G4 Cube? An overpriced, flawed mid-range Power Mac.

OS X Lion and the reduction of the file system? Let’s not go there.

Bing? Haha.
Absolute rubbish, and you obviously never knew Steve or even the history of Apple.

When Steve decided to leave to pursue NeXT, Apple tried to put a brave face on it, and ironically the very Coca-Cola man then at the helm faced lawsuits and press reports over Steve being fired...

Steve then worked on NeXT, which as a computer was pretty miserable in its first attempt, but whose operating system and his vision are basically what Apple is now. Without him being ushered back to Apple, with them paying $429 million for NeXT, you'd have no iPad, no iMac, indeed no Apple device as you know them today, as his vision of an operating system via NeXTSTEP is still at the heart of every Apple operating system.
 
Benefits to whom though?

I'm not looking for my data and my phone to be part of a society wide dragnet to solve issues as determined by others.

This needs to be FULLY opt in required in my opinion.
Any benefits should be the choice of the owner of the phone, not Apple. They have really crossed a significant line and need to draw back urgently and recognise it's a mistake: you cannot run media campaigns about security and privacy, then open a door that violates them. However much Apple protests after the event that it was designed only to serve the function they state, the fact is history is littered with similar claims that proved, in the fullness of time, to be bullcrap.
 
You are going to equate Epic with the distribution of illegal pictures?
Illegal pictures today, but it's not going to be accurate, since they don't have access to the world's full database of child abuse photos. This renders it useless.

Illegal pictures of children today
possibly illegal pictures of something else tomorrow
ANY pictures the day after
 
Help me understand, because I clearly don't get why everyone is so angry about this.

If I offered an app with the ability to upload images or whatever, and I'm legally and morally compelled to prevent child abuse images, why would I not want to use a free service from Apple that does this in a way that maximizes privacy?

I get that no corporation can be trusted. Facebook, Boeing, every single Wall Street firm, et al... all scum. I get the slippery slope arguments. But even if you don't trust Apple's leadership and their intentions, trust how they make money. They sell premium devices, which will expand more and more into health and financial transactions. There is a lot more money for Apple there than in selling you a new iPhone or Mac every 3 years. And this strategy needs rock-solid security and privacy. Plus, they need consumer trust. (You would be nuts to give your medical info to Facebook, Google, Amazon, or Verizon.)

I think the truth is far more simple and benign:

1. Apple wants end-to-end encryption on iCloud. They haven't done it because people lock themselves out of their accounts, losing access to a lifetime of memories and critical data, and because law enforcement bullies tech companies.
2. So Apple solves these problems on their own terms. At WWDC we saw the ability to designate a person who has access to your iCloud. Yes, it was framed as post-life, but it's a safety mechanism for any scenario where someone can't get access. And they decided to focus on child safety because it's the law, it's morally correct, and because this is an area where they can have the most impact.

Facebook, Google, Yahoo, Adobe, Amazon, you name it - all scan your uploads for child porn. All of them. Again, it's the law.
My prediction: many may decide not to download software that, whatever way you look at it, despoils Apple's espoused intention to safeguard privacy.

But after making this strategic mistake, Apple will force users to update anyway at the next security update.

A very, very slippery slope, Apple, and I've been using Apple kit for decades.
 
FBI is a small player. The real constitutional violations are overseen by the FISA court. And, NO, Apple cannot say "no" to them, nor can they EVER disclose what spying they are required to conduct on behalf of the U.S. government.
If it were that simple, we wouldn't be having this conversation, because the law cannot force a company to do anything that would harm its business (Microsoft won a court case against the US government on that point).

If FISA has that much power, why don't the FBI, NSA, and so on have access? Because even FISA is restricted by the Constitution and the law.
 
This is why it feels like Apple is bending to some kind of hidden state coercion.

If so, it's most likely the Chinese threatening market access or the US DOJ whispering about an antitrust breakup.
I was thinking this, or that they already look at all our photos and caught someone with massive amounts of CSAM, and the Apple lawyers went into panic mode, worried about liability.
 
My prediction: many may decide not to download software that, whatever way you look at it, despoils Apple's espoused intention to safeguard privacy.

But after making this strategic mistake, Apple will force users to update anyway at the next security update.

A very, very slippery slope, Apple, and I've been using Apple kit for decades.
Actually I have disabled automatic updates on all of my Apple devices and I am not installing any updates unless Apple backtracks on installing this surveillance software on my physical device.
 
The question is though, is there a better reasonable alternative to iPhone? Isn't it still more secure and has better privacy than Android phones? I mean, I know there are some obscure alternatives but then you lose apps and possibly great phone cameras as well.

So what are we supposed to do, we're stuck between choosing either a bad option (Apple) or an even worse option (Google) :///

But if anyone has any suggestions, I'm all ears!

Security wise just follow safe internet practices and keep your apps up to date. You'll be fine.

Privacy is harder to say. I can set privacy settings on Android to reduce or eliminate sharing with Google; everything else is on an app-by-app basis. At this point I believe them as much as I believe Apple.

Anyway, privacy and security also depend on the Android brand. By default, I'd consider the Google Pixel the best on both fronts. No, not because I think Google is an angel; it's just that the Pixel runs vanilla Android and the base Google software set. Every other brand ships the Google software set, a custom Android ROM, and that manufacturer's own software. More software and more tweaks to the Android ROM mean more potential privacy and security holes to fill.

Samsung is probably the second best, even though theirs are heavily customized. They are also far more diligent than most in patching holes. Privacy is tougher to say; I'm not too familiar with Samsung's privacy practices, but they replaced a lot of Google software with Samsung software. Maybe it is better.

I certainly wouldn't trust any brands coming out of China. I'd stick with US, EU, South Korean and Japanese brands.

You may also go down the rabbit hole by choosing a privacy-focused Android ROM or even a Linux distro for phones. Neither will be fun for the uninitiated. You have to choose your phone carefully for compatibility. The Android ROM route is easier, but a de-Googled one makes getting apps more of a pain without the Play Store.

Linux on a phone is only for the bravest of cyber warriors: one who accidentally calls his wife (palm of hand) Grep, speaks fluent Klingon, and owns every Dungeons and Dragons piece. :p
 
If it were that simple, we wouldn't be having this conversation, because the law cannot force a company to do anything that would harm its business (Microsoft won a court case against the US government on that point).

If FISA has that much power, why don't the FBI, NSA, and so on have access? Because even FISA is restricted by the Constitution and the law.
I'm not sure what that first sentence means.
But regarding FISA: NO, it is not "restricted" by law or the Constitution, because, by definition, its actions are not reviewable by any other court. In other words, if someone wanted to challenge a FISA ruling, there is no place to do so, since it is illegal for any court to review FISA actions. Therefore, FISA is above the law and the Constitution.
 
And their implementation is privacy friendly, whereas others (Google and co.) scan your photos on their servers without warning you at all.
Google's reading of emails and scanning of photos is written into their TOS. Apple will include a detailed version in their TOS too.
 