There’s an interesting change in the dynamic of how Apple is approaching its customer base.

Apple was treating its customers as if they were law-abiding.

Apple is now assuming its customers may be doing illegal things, and it feels some obligation to police them.

The debate is over whether that policing will be a little or a lot, and whether it will be abused.

Another way to look at it is that Apple believes its products and ecosystem have somehow enabled more of this criminal behavior, and it feels responsible (or has been silently pressured) to curtail it.

What I find really puzzling is that Apple could have fixed this problem for good without the scan-and-report framework. Hashes for images on the watch list could simply be loaded onto the device, and the device's display could be made to refuse to render any of those illegal pictures.

This is similar to how Photoshop handles images of currency: it detects banknotes and refuses to open or print them, to deter counterfeiting.
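
For what it's worth, here is a rough sketch of what that on-device blocking idea could look like, purely as my own assumption of the design; `RenderGate` and `blockedHashes` are made-up names, and Apple's announced system actually uses a perceptual hash (NeuralHash) rather than the exact SHA-256 used here for illustration:

```swift
import Foundation
import CryptoKit

// Rough sketch of the commenter's idea, not Apple's design: ship a list
// of watch-listed image hashes with the OS and refuse to render matches.
// Real systems use perceptual hashes so crops and re-encodes still match;
// plain SHA-256 is used here only to keep the illustration simple.
struct RenderGate {
    let blockedHashes: Set<String>           // hex digests of watch-listed images (assumed)

    func shouldRender(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return !blockedHashes.contains(hex)  // false => fail to render
    }
}
```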

Also, remember last year? Apple charged headlong into the panic early on, partnering with Google to roll out a thoroughly dubious and shoddy contact-tracing system.

Whatever good intentions Apple had, that same tool is now being leveraged as part of serious plans by several governments to forcibly segregate populations. That’s a dangerous tool, and always a very dangerous sentiment to foster.

Can you imagine being told that Steve Jobs’ company would go on to help make control systems for concentration camps? Like, literally.

Not one person in Cupertino has raised an eyebrow yet? Not a great look. Destroying trust, then asking for more than ever before. Not great marketing.
 
To all those gung-ho about going after innocent people in search of crimes, consider this.

If the iPhone came out in 1921, this conversation would be about Apple implementing iOS features that root out homosexuals, considered sex offenders in ALL states and most countries then… and as recently as 2006 in some parts of America. And why stop there when they could help fight Communism in 1961 and Jews in 1931?
 
At this point, someone at Apple must be regretting the decision to announce this feature, which has understandably created such a big wave of backlash.
Next time Apple does something like this, I wouldn't be surprised if they just keep their mouth shut and deny any allegation, like they always did with major hardware failures (antenna, battery, bending, etc.).
 
First big difference: by just knowing your phone number, someone could send matching images to your device. There are apps that would put those images into Photos, which then syncs to iCloud. Not much fun...
Second big difference: your iPhone may be more easily hacked than iCloud.
I'm not convinced by your two arguments: you should be able to decide which applications download images directly to your photo roll, and you'll need to bring me evidence that an iPhone can be hacked more easily than iCloud.

And your first point would be just as bad whether the scanning is client-side or server-side anyway.


My real issue with this outrage is: if client-side scanning is not OK, server-side scanning is not OK either (a lesser evil is still evil). Both are fugly if they can't be peer-reviewed in any way.

I understand people being skeptical, to say the least, about Apple's approach, but I have yet to hear voices getting angry about the status quo that is server-side scanning.
 
Oh boy, I'm having flashbacks lately over this whole thing.

I remember being at a dinner party back in the '90s, possibly. The topic of the moment was 'scanning the internet for porn and images of physical abuse of women'. I was introduced as an 'internet expert' by someone in the discussion at the table, and they did me absolutely no favors.

'So, what do you think?'

It's a waste of time. *GASP* from the people there. Well, yes, it's a waste of time. The internet is a massive, relatively unregulated space that is very nearly incapable of being regulated. The time it would take to scan the 'entire internet' would render the effort useless. 'So you are okay with sexually exploitative content being on the internet?' No, I'm probably just as offended by it as anyone else, but what happens when you find something? How are those people going to be 'banned' from the internet, and what stops them from opening up a new site and putting the same stuff out there? It's so easy to start a website that I'd bet most people who are caught will just start a new one, and in some cases it could take mere hours.

And the bleating started, and the demonization of me and of the 'entire IT industry' for tolerating it.

Aside from it being physically impossible to 'scan the entire internet', the whole idea of being able to 'police' it is absurd. Should everyone with an iCloud account, using their iCloud Photo Library, be subject to searches? Well... if Apple is going to do it, then any other service that saves images on the internet should have to do it too. Either everyone does, or no one does.

I'm still sickened by the kiddie porn I found on a client's computer when I was working as a subcontractor. I'm even more horrified that the guy didn't go to prison. Should people be subject to searches? The 'founding fathers' thought not, but they were fighting an occupying military that would search your house, and your person, for anything that could be deemed against their continued rule of the original colonies. They didn't have the internet, or even a way to save pictures and share them with others.

Does that mean I'm in favor of having my photo album searched? Not exactly, but if someone is dumb enough to save their obviously illegal images on a service like that, they deserve to be exposed. That won't be popular with many, but CSA is a real, damaging thing for children to have to live with. I don't think there is a way to balance security and privacy with the crime of CSA. It happens in private, and the perpetrators depend on coercing their victims to keep it 'private'. And just like with a website, the people saving such images will simply delete them, close their account, and open another one on another service, or just go 'old fashioned' and save it all locally.

Call me 'jaded'? Having seen the worst of people, and knowing that it's out there, I'm not so sure that there shouldn't be *some way* to root it out. The damage the perpetrators do to their victims is lifelong.

Should there be a better way? I don't know if there is, but the damage is done. Apple has probably lost many customers over this. Ignoring the problem isn't going to make it go away. And we are all already subject to a massive amount of surveillance, all in the name of 'national security'. I don't know; it's an emotionally charged issue. Like scanning the entire internet...
 
Not nervous. Angry.

It does seem hypocritical.

A web hosting company wants to be immune from what its customers post or host, and now Apple wants to 'police' its iCloud members' data. It does seem like an odd thing to want to do. So *should* hosting companies be held liable for their content? How can a hosting company 'police' all of its customers and eliminate 'objectionable content', and who decides what's objectionable? Seeing multiple angles on this doesn't help...
 
I'm glad I am not the only one who thinks the blur feature could be used by homophobic parents trying to block everything a kid would see. And they could still override the age registered in the system to control their teenagers.
As a member of the LGBTQ community, I am dismayed by opponents of the parental controls in iMessage because of a presumed disparate impact on homosexual youth. Parental notification of potentially explicit images viewed on a kid's device applies only to kids under 13. Parents of all kids, including LGBT kids, deserve to be able to protect young children from being targeted by child predators.

Further, LGBT kids in repressive households or communities are MORE susceptible to grooming by child predators. We shouldn't limit parental controls just because some parents have unhealthy parenting skills. Those parents could simply take their kids' devices and check all the messages regardless.
 
My point was that you trust that the scans of your photos will remain for your benefit. Don't all the same "slippery slope" arguments apply to the scanning that Apple is already doing? And, further, it is doing that for all photos, not just those being uploaded to iCloud Photos.
Prior to this, I trusted Apple's privacy stance, so I was not bothered by activities that could potentially turn into surveillance. This breaches that trust. Look at all of the core features that rely on a balance between privacy and surveillance: HomePod, Siri, AirTags using my phone's connection, Find My, Apple's contact tracing, photo identification, HomeKit Secure Video, Maps, iCloud Mail, password storage, Health. I'm sure you can think of more.

I now am minimizing my usage of each of those items because I cannot trust Apple's privacy stance. My Christmas list of Apple products is now much much smaller.
 
I'm surprised to see Apple doing this because they have seemed to be the front runners of this whole privacy mantra. It contradicts everything Apple stands for.

This is exactly how we know that Apple's "Privacy" initiatives are Marketing and not real. Internally and privately, they scan and give all governments what they want.

So the marketing department knows they are already doing it and are just trying to get some good PR out of it. Unfortunately, in the process they revealed the reality of what happens behind the scenes.

Not much else makes sense. Unless of course, they are really that ignorant of their user base.
 
Why make a huge drama out of something that's very simple? Nobody is looking at your photos.

Privacy doesn't give anyone the right to commit a crime.

People do deserve their privacy. This is not like having a camera watching their every move. It's just an AI filter that checks for illegal images, images used to commit a crime. If it finds any, it raises a flag.
Apple is trying to protect itself from being accused of protecting criminals and from getting dragged into lawsuits.
There are people who think that by using an Apple device they can take and store illegal pictures.

Governments want Apple to create a backdoor, something that would affect the security and privacy of the devices. What Apple is trying to do is implement a non-invasive way to help combat crime, so certain governments and haters don't complain that Apple protects criminals.

Maybe, as you suggest, Apple should put this to rest and let people continue with their business of causing pain and suffering to helpless children. It may not bother some people at all, until it happens to someone close to them. Then they will be actively in favor of what Apple is trying to do here.

It's just like the use of face masks. People complain, saying "it's their body and their right not to wear a mask", without any concern for the people around them. The mask is there to protect those around you and to protect yourself.
Same thing in schools now: mothers are against having their children wear masks "because it is their right not to do so". This is ridiculous!
Just wait until they or their children get sick with COVID-19; their minds will change and they will rally in favor of wearing masks. It's already happening, just search YouTube. Many of those against it, or those who used to say COVID-19 doesn't exist, are now pleading with people to take it seriously.
Privacy also does NOT mean a private company has the right to invade it.

Again, this seems like another poor effort to hide the fact that it is surveillance.

If people do deserve their privacy, then you can't contradict that in the next breath.

There is such a thing as due process, and law enforcement are bound to abide by that process. Yes, sometimes it's a pain in the butt, but it is necessary.

Law enforcement agencies have no problem obtaining permission through the courts if there are perceived to be reasonable grounds for suspicion. But private companies, who ironically guard their own privacy much more than most, and who send out threatening letters to anyone who even comments about potential new equipment, can hardly be arbiters of what THEY check out and on whom.

Now, if governments change the law and Apple is forced to engage in these Machiavellian attempts to install a backdoor, then Apple should say so. At least then the electorate (in countries that allow voting) would be able to give a mandate, or vote out the governments they previously gave a mandate to govern.

Allow this in the name of fighting child abuse and child pornography, with software eventually installed on every Apple device, and at some stage you may not even have an election that means anything. As sure as eggs is eggs, the one thing politicians love is power. If the potential exists to do what Apple suggests, then the potential exists to extend it, and history shows that when these things evolve, they often end up as tools of power for politicians and extreme governments, rather than serving whatever crime-fighting excuse was made at the outset.

Amazing Iceman, from your post it would appear you may be happier living in China, or under a regime that dictates everything. Why not have mandatory chips implanted in each of us as well, since free will is obviously not something you particularly like, even though it is fundamental to freedom?

Presumably you like to choose what you wear, what car you drive, what TV channels you watch, where to go on vacation, etc. Well, guard those freedoms well, because history shows what happens when surveillance is extended, as it inevitably is, even if it starts out in the name of something as emotive as child protection.

For the record, I've had both vaccines and I still wear a mask in shops, because it is my choice based on the information available to me.

It's my choice whether to turn on iCloud on my hardware, and it's my choice with most of the features on Apple iPhones, iPads, iMacs and other devices, where settings and other controls, ironically even privacy settings, let you choose. But I have no choice when someone else puts software on my systems and usurps whatever choice I ever had. Whether or not it starts off in the name of child safety, it is a slippery slope that takes away choice.

We all comment on freedom and cherish it; presumably you cherish your freedom too. Well, look after it, because state-sponsored mass surveillance without reasonable cause, or a company acting as a police force, erodes your freedom, and as some have found, it may finally remove whatever freedom you once had.

We are in an age of behemoth IT companies that wield more power than some governments, are unelected, and apparently even decide what tax to pay and where. It's ironic that these organisations guard THEIR freedom and THEIR privacy very well indeed.

So many posters are still distorting the fact that it is surveillance, and turning off iCloud doesn't stop it, as the software is on the hardware. It will not just be iPhones; it will be all devices.

It is alleged this software was already included, in which case I expect a wad of lawsuits, as it would seem to contravene the existing policy that may have informed the decision to purchase.

Likewise, even the smallest amount of data in a hash still requires processing time, and as we know, Apple has already been fined for slowing down machines without users' knowledge. This will slow down machines too, albeit probably by a small amount. And it has allegedly been introduced already; it has been suggested elsewhere that it was spotted, which is why it has now been made public.
 
This is exactly how we know that Apple's "Privacy" initiatives are Marketing and not real. Internally and privately, they scan and give all governments what they want.

So the marketing department knows they are already doing it and are just trying to get some good PR out of it. Unfortunately, in the process they revealed the reality of what happens behind the scenes.

Not much else makes sense. Unless of course, they are really that ignorant of their user base.

So Apple users are pedophiles hiding their pics on iCloud? Careful...

Who knows why Apple decided to get a conscience. And who knows how widespread CSA is; maybe looking for it will answer that question. I've heard from some of the local docs I know that more kids are coming in with injuries and complaints consistent with being sexually assaulted. What's the one site that has prepubescent girl pictures on it? I was shocked to see stickers from that site on a few cars in the area where I live. (Is it chive?) They should be shut down...
 
About the "Messages" thing.. it doesn't break end to end encryption because the processing is done before or after the photo is sent/received, and it stays on the device, so Apple doesn't have any access to that material.
 
The problem with human review is that it can be swamped if, as may be the case, Apple has badly underestimated the number of flagged cases. Partly because there are simply more of those images going around, but also... that would be the *point* of being able to produce innocent images that trigger the algorithm: to swamp the human reviewers with false positives, a denial-of-service attack. Then shortcuts get into the process and mistakes are made, both ways. And as for the human reviewers, see Facebook's experience: there's a human cost to *having* to view the kind of images this system is meant to trap, and a shockingly high burnout rate, especially if they're swamped and can't even keep up.
.....
the problem with human reviewers is.....
private photos weren't meant for anyone else's eyes.
It's like Polaroid getting to review your instant photos that you never intended to take somewhere to have developed.
 
This demonstrates that agencies involved in pursuing criminals already have access to material, but only with due process and reasonable cause.

I would rather hear Apple tell us who, if anyone, has insisted on this, than watch Apple implement this 180-degree turn in its policy without any consultation with the people who bought Apple hardware in good faith, mindful of Apple's stance on privacy.

If it has not been requested, shelve it. Hardware-level scanning is not reasonable cause; it's intrusion, and it's unreasonable. You can't run mass surveillance on a billion-plus Apple devices via their hardware and then suggest it's of no concern, even in the name of fighting child pornography, which this will not do.

 
About the "Messages" thing.. it doesn't break end to end encryption because the processing is done before or after the photo is sent/received, and it stays on the device, so Apple doesn't have any access to that material.
But it's still surveillance... which appears to be what many are trying to ignore.
 
.....
the problem with human reviewers is.....
private photos weren't meant for anyone else's eyes.
It's like Polaroid getting to review your instant photos that you never intended to take somewhere to have developed.
That's the thing, though. Human reviewers don't see your photos until two things happen. You have to have 30 CSAM matches. Once 30 vouchers have been generated during image upload, a second, separate process on Apple's servers perceptually scans the photos and generates new hashes; those hashes are compared against the CSAM database, and THEN only those matches are sent for human review. That's a lot of steps before anyone sees your pictures, and in my opinion you also deserve it for harboring those disgusting images.
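
As a toy illustration of that gating step (my simplification, with made-up names; Apple's actual scheme uses threshold secret sharing, so the server can't decrypt anything at all below the threshold, which this plain counter doesn't capture):

```swift
// Toy illustration of the 30-match threshold described above. In the real
// design the vouchers are cryptographically unreadable until enough exist;
// this counter only shows the gating logic, not the cryptography.
struct VoucherGate {
    let threshold = 30                  // matches required before any human review
    private(set) var matchCount = 0

    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= threshold  // true => account becomes eligible for review
    }
}
```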
 
But it's still surveillance... which appears to be what many are trying to ignore.
Surveillance by whom? Not Apple. Also, the Messages feature is a child-safety feature that is opt-in for the parent. It's no different from blocking certain channels on your cable TV service.
 
Just remember: it is partial-content scanning. What does that mean? It means, in this case, that the code runs an algorithm that looks for specific similarities between a subset of the total picture and a code or set of codes.
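
As I understand it (and this is my own simplification, with entirely made-up numbers), "similarity" here means comparing compact perceptual hashes by how many bits differ, rather than by exact equality:

```swift
// My simplification of perceptual-hash matching: two images "match" when
// their hashes differ in only a few bits, which is why near-duplicates
// (crops, recompressions) still match. The 64-bit values and the distance
// cutoff below are made-up examples, not real database entries.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

let databaseHash: UInt64 = 0xD1CE_5A7E_90FF_0042   // hypothetical watch-list entry
let photoHash:    UInt64 = 0xD1CE_5A7E_90FF_0040   // hypothetical on-device hash
let isMatch = hammingDistance(databaseHash, photoHash) <= 4
```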

Now, why does that sound just like facial recognition? Think before you react. I did not say it WAS facial recognition; I said it is nearly the same algorithm. Now think about how well Apple's facial recognition does NOT work.

1) This is so close to facial recognition that there is no way to stop it from being applied to that end without users ever being aware. In fact, I'll bet the government is forcing Apple to do this, and Apple is trying to make a positive out of it before it gets caught.

Next it will be, "we are only looking for terrorists." And there is the problem: a terrorist is anyone the government does not currently like. Why? Because the government has discovered (thanks to President Bush and the Patriot Act) that it doesn't have to follow any laws as long as the person in question is classified as a terrorist. A terrorist can now be held without due process and without human rights.

2) Remember how well Apple's facial recognition does NOT work. Sure, it's fine for family photos, where it doesn't matter if it's wrong.

But do you really want SWAT showing up at people's homes based on Apple's picture-recognition AI? (And I know someone is supposed to review the matches, so don't get me started on the problems with that.)

Come on, get real. Apple's spell checker is awful. Apple's Siri AI is awful. Apple is great at providing tools, but not at the actual end implementation.

To get this right, Apple would have to port over code from the CIA, FBI, NSA, etc. If it hasn't already.
 