Just a comment about NCMEC, as they do good work and I wouldn't want this situation to reflect badly on them. Much of their work is also about abductions, and it's worth having a look at their stats, as their child pornography hash database is tiny compared with the billion-plus users of Apple equipment.

I would urge anyone with information that could assist them to provide it, as I doubt many people are against fighting child pornography.

However, Apple's moves do not and will not assist in that fight. If anything, they will make that fight harder for all the agencies involved in dealing with this problem, because Apple has telegraphed the intention to MONITOR YOUR HARDWARE as a pre-iCloud check. Those involved in hideous acts against children will as a result be harder to track and harder to apprehend, whether this is a misguided action by Apple or, much worse, a deliberate move to set the precedent for future Orwellian features.

When we have, in most cases, an elected government, we do at least have the opportunity of voting. That does not apply to Apple, which sadly, in common with other tech behemoths, now wields power without being accountable to the electorate, and without really being accountable to those it has sold hardware to on the basis of privacy: the privacy it has shouted about from the rooftops, run TV and other ads on as a selling point, and used to criticise other IT behemoths for their privacy policies.

I suspect this action may now cause them to lose many battles, including over the App Store platform, with their opponents now able to turn the tables on Apple, as in my opinion Apple has lost the moral high ground it held on privacy.

Apple: you can defuse this easily, because there is no excuse for embedding this software on users' devices, devices you have sold them. No one can stop you having software on iCloud that would do what you suggest this will do via hardware, and then at least it's not compromising users' equipment. Customers can choose to use iCloud or not, knowing about that function, but customers who have already bought your hardware are not in a position to choose whether to use the device they paid for!

My own systems often hold sensitive data, on occasion including data concerning Apple, and indeed I've been instrumental in Apple devices being used in government and strategic services; no doubt some are used in fighting the very child pornography you have cited as the reason for compromising users' hardware rather than your own iCloud. This would make using Apple equipment untenable, and the undertakings we give clients over security would, ironically, be hard pressed to be fulfilled because of your own actions if you change your operating systems to include what you have suggested!

Remove it from the operating systems, which maintains the integrity of the hardware. If you then choose to put it on iCloud, I still believe that would be a mistake, as in my opinion it will hinder efforts to counter child pornography: by telegraphing the scanning, those involved in it will go underground and take preventative measures, making it harder for agencies to do their job.
 
What is your theory on why Apple, after years of positioning themselves as leaders in privacy, would choose to implement this policy now?
Tbh, I can't make up my mind.

I have a few theories why that could be the case:

1.) One option is that they want to implement end-to-end encryption, and that is why they have to scan on the iPhone.

2.) Legislation is being prepared for this kind of surveillance, taking into account the EU report I linked here. The legislation might be coming up in more places than just the EU at the same time. It might be something they have discussed at some summit.

3.) There might be real interests in eroding digital privacy. Who the actors or lobbyists wanting that would be, I truly have no idea. There might be multiple actors pushing this.

4.) There are legitimate actors who think that in the digital age we have a way of combating child pornography and find such an invasion acceptable, considering it is veiled under hashes, etc. These actors act on a noble cause and are simply blinded by zeal to its privacy concerns. So I think of them as not malicious in their intent, but unaware of the consequences.

In the end, I don't think this is an Apple agenda of its own. They might either be trying to find the least harmful solution or be forced into this. I can't fathom what personal interest Apple would have in this, except, as someone mentioned, copyright.

And lastly, this is done only on iPhones. How will they manage this on Macs? That seems like a big oversight in detecting CSAM.
 
Tbh, I can't make up my mind.

I have a few theories why that could be the case:

1.) One option is that they want to implement end-to-end encryption, and that is why they have to scan on the iPhone.

2.) Legislation is being prepared for this kind of surveillance, taking into account the EU report I linked here. The legislation might be coming up in more places than just the EU at the same time. It might be something they have discussed at some summit.

3.) There might be real interests in eroding digital privacy. Who the actors or lobbyists wanting that would be, I truly have no idea. There might be multiple actors pushing this.

4.) There are legitimate actors who think that in the digital age we have a way of combating child pornography and find such an invasion acceptable, considering it is veiled under hashes, etc. These actors act on a noble cause and are simply blinded by zeal to its privacy concerns. So I think of them as not malicious in their intent, but unaware of the consequences.

In the end, I don't think this is an Apple agenda of its own. They might either be trying to find the least harmful solution or be forced into this. I can't fathom what personal interest Apple would have in this, except, as someone mentioned, copyright.

And lastly, this is done only on iPhones. How will they manage this on Macs? That seems like a big oversight in detecting CSAM.
I might be mistaken, but I believe that when the new macOS, called Monterey(?), is released, it will possess the same technology as the OS for iPads and iPhones.
 
Tbh, I can't make up my mind.

I have a few theories why that could be the case:

1.) One option is that they want to implement end-to-end encryption, and that is why they have to scan on the iPhone.

2.) Legislation is being prepared for this kind of surveillance, taking into account the EU report I linked here. The legislation might be coming up in more places than just the EU at the same time. It might be something they have discussed at some summit.

3.) There might be real interests in eroding digital privacy. Who the actors or lobbyists wanting that would be, I truly have no idea. There might be multiple actors pushing this.

4.) There are legitimate actors who think that in the digital age we have a way of combating child pornography and find such an invasion acceptable, considering it is veiled under hashes, etc. These actors act on a noble cause and are simply blinded by zeal to its privacy concerns. So I think of them as not malicious in their intent, but unaware of the consequences.

In the end, I don't think this is an Apple agenda of its own. They might either be trying to find the least harmful solution or be forced into this. I can't fathom what personal interest Apple would have in this, except, as someone mentioned, copyright.

And lastly, this is done only on iPhones. How will they manage this on Macs? That seems like a big oversight in detecting CSAM.

I would expect "1". This would at least make sense.
 
Not being an engineer, I am curious whether it is possible to release the technology ONLY in US-based versions of the OS. From what I have been able to read, this technology seems incredibly complex and needs to be "baked in" to the OS.

Would I be paranoid to think the technology will be in every OS for every country, but dormant, and that at the appropriate time Apple will simply activate it? For example, Apple decides it is appropriate, necessary, or justified for some other reason, and sends out a simple minor update (remember, Apple is wonderfully opaque in detailing update contents).
 
Tbh, I can't make up my mind.

I have a few theories why that could be the case:

1.) One option is that they want to implement end-to-end encryption, and that is why they have to scan on the iPhone.

2.) Legislation is being prepared for this kind of surveillance, taking into account the EU report I linked here. The legislation might be coming up in more places than just the EU at the same time. It might be something they have discussed at some summit.

3.) There might be real interests in eroding digital privacy. Who the actors or lobbyists wanting that would be, I truly have no idea. There might be multiple actors pushing this.

4.) There are legitimate actors who think that in the digital age we have a way of combating child pornography and find such an invasion acceptable, considering it is veiled under hashes, etc. These actors act on a noble cause and are simply blinded by zeal to its privacy concerns. So I think of them as not malicious in their intent, but unaware of the consequences.

In the end, I don't think this is an Apple agenda of its own. They might either be trying to find the least harmful solution or be forced into this. I can't fathom what personal interest Apple would have in this, except, as someone mentioned, copyright.

And lastly, this is done only on iPhones. How will they manage this on Macs? That seems like a big oversight in detecting CSAM.
Thank you for taking the time to write a thoughtful post.

The vast majority of the discussion on this issue does not seem to want to examine Apple's motivations for such an apparent heel turn on privacy.
 
I have asked some of our legal team, albeit not constitutional lawyers, to explain to me how this does not trample all over the 4th amendment:

**********
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

**********

The consensus seems to be that, on one hand, it is a violation of constitutional rights, but on the other hand, Apple must have vetted this with some well-respected experts.
I’m not a lawyer, expert or even an American, but even I know the 4th amendment doesn’t constrain the actions of private companies. I’m fairly sure that nothing else in The US Constitution does either.
 
What is your theory on why Apple, after years of positioning themselves as leaders in privacy, would choose to implement this policy now?

I'd probably say: advertising.

They've released loads of documents and wheeled out their immaculately coiffured spokesperson, but I suspect we've all become so polarised, either for or against this, that we're not hearing the message:

We're going to install a scanner on your phone, but your privacy is still protected.

Apple (along with Microsoft, Google, Facebook, Twitter …) already scans its servers for CSAM images and reports them when they're found. So I guess the real question here is: why does Apple need to do this on the client?

Advertising.

User tracking is needed for advertising, but user tracking breaks privacy. Apple wants to get into the ad game, but it chafes uncomfortably against its stance on user privacy.

But suppose they came up with a way of supporting user tracking while also maintaining privacy: an OS extension that runs on every device and logs which apps you use and which websites you visit. Now, rather than just spitting this raw information out, the extension removes all the personal stuff and creates an Advatar™: a restricted persona that an API exposes to apps and the private relay. Now your activities are tracked, but your personal information stays with you. Personalised ads can be served and your responses logged, without revealing your real personal data.
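To make the idea a bit more concrete, here's a rough Swift sketch of what I'm imagining. To be clear, every name in it (Advatar, RawActivity, makeAdvatar) is invented for this post; nothing like this exists in any Apple SDK as far as I know:

```swift
import Foundation

// Purely illustrative: a made-up, on-device "sanitiser" that reduces raw
// activity to a coarse interest profile before anything is exposed to ads.

/// Raw, on-device activity that never leaves the phone.
struct RawActivity {
    let appBundleID: String
    let visitedURL: URL?
    let timestamp: Date
}

/// The sanitised, privacy-locked profile an API might expose to advertisers.
struct Advatar {
    let interestCategories: Set<String>   // e.g. "fitness", "travel"
    let coarseAgeBand: String             // e.g. "25-34"
    // Deliberately no identifiers, no URLs, no timestamps.
}

/// Collapse the raw log into coarse categories, on the device itself.
func makeAdvatar(from log: [RawActivity],
                 categorise: (RawActivity) -> String?) -> Advatar {
    let categories = Set(log.compactMap(categorise))
    return Advatar(interestCategories: categories, coarseAgeBand: "unknown")
}
```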

Apple realised that telling people a future OS would be logging their activities for user tracking (even if it is more private than current methods) wouldn't go down too well, so they came up with a plan to get people used to the idea of client-side tracking: everyone wants to protect children, so who in their right mind would object?

Well … just about everyone with a pulse, as it turns out. And since most of the people who disagree with client-side tracking are happy with their photos being scanned server-side, Apple has found itself in the strange position of releasing documents and making changes, desperately trying to get people to accept it.

What should they have done? Well, assuming I'm right, they should've been honest about what this client-side tracker was for from the outset. They tried to tie it to child protection, which tied it to law enforcement, which tied it to foreign law enforcement, which tied it to foreign governments, which tied it to oppressive foreign governments, and the possibility that it could be subverted into a surveillance mechanism (which, of course, it could).

A better approach might have been to go with the truth first:

To protect your privacy, we're adding a gatekeeper for the personal data you keep on your phone. The gatekeeper will provide a sanitised, privacy-locked version of your activities for advertisers to use. We at Apple believe that … blah blah blah.

Personally, I think that sounds a lot better than the hot mess of horse manure they're shovelling out at the moment.


p.s.

Not saying I support this; I'm just throwing out an idea of what might've gone wrong.
 
Not being an engineer, I am curious whether it is possible to release the technology ONLY in US-based versions of the OS. From what I have been able to read, this technology seems incredibly complex and needs to be "baked in" to the OS.

Would I be paranoid to think the technology will be in every OS for every country, but dormant, and that at the appropriate time Apple will simply activate it? For example, Apple decides it is appropriate, necessary, or justified for some other reason, and sends out a simple minor update (remember, Apple is wonderfully opaque in detailing update contents).
Yes, the surveillance CSAM feature will be dormant on every iPhone; it's the Apple ID country setting which triggers the activation. If you're one of those people who created a US-based Apple ID to get access to other content, as a few people used to do in the past, then you'll be affected.
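Just to illustrate what I mean by dormant-but-gated (this is a sketch of my assumption, not anything Apple has documented, and all the names in it are made up):

```swift
import Foundation

// Illustrative only: how a region-gated rollout *could* look in principle.

enum CSAMDetection {
    /// The code ships in the OS everywhere but stays dormant unless the
    /// signed-in account's region is on the enabled list.
    static let enabledRegions: Set<String> = ["US"]

    static func isActive(accountRegion: String, iCloudPhotosEnabled: Bool) -> Bool {
        return iCloudPhotosEnabled && enabledRegions.contains(accountRegion)
    }
}

// A US Apple ID with iCloud Photos switched on would trip the gate;
// the same phone with a German account would not.
print(CSAMDetection.isActive(accountRegion: "US", iCloudPhotosEnabled: true))  // true
print(CSAMDetection.isActive(accountRegion: "DE", iCloudPhotosEnabled: true))  // false
```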
 
Or rather, you have tunnel vision so severe that you have given up foreseeing the trend of how things will go.
CSAM detection scans images, and it can be repurposed to scan any other type of image in the future. The technology is ready at mass scale.
Next, the same technology can be used to scan video with some modification. Or maybe new technology would need to be developed. I dunno, but it's only a matter of time.
Text scanning and analysis was never, and will never be, any big deal for a computer as powerful as an iPhone 12.
Apple sells devices across hundreds of countries around the world, whereas a government generally only has full control over people residing in a city, county, etc. Apple as a private company will not need a warrant to search anything on your device or anything stored in iCloud, whereas warrantless searches are not super popular in western countries.
But yeah, just keep holding on to what YOU believe; I hope someone else can get what I want to say.
Again, for the 100th time.

If you are talking slippery-slope hypotheticals: WHY hasn't this already happened, considering that cloud storage, and tech companies including Apple giving cloud data to other countries, has been happening since 2005? Why would governments find something NEW to scan when the data available to scan with on-device AI is an exact copy of the cloud data, and all the other data is still locked away by the user with the Secure Enclave?

No one is answering this question directly, because it collapses their narrative: fundamentally there is ZERO difference between cloud and device data available to scan. There is ZERO new data for the device to scan compared to cloud data, and there are no new ways to scan that governments haven't already requested for the cloud.

So, your concern about scanning video: why hasn't video scanning been requested for clouds already? HINT: it has already been scanned on cloud drives.
 
Apple’s system doesn’t need a back door. It has a massive front door that’s wide open.

Want to know what’s on someone’s phone? Okay well just replace this CSAM database with whatever other database you’re interested in. Done.
That would be a back door, as there is no simple way to replace that database.

Also, even if a hacker manages to replace the database (which is part of the OS and definitely protected against tampering), what will you achieve? That a flag is sent to Apple. Where does that get you?

If a hacker has to jump through all these hoops, they may as well just hack your photo library directly.
 
I have asked some of our legal team, albeit not constitutional lawyers, to explain to me how this does not trample all over the 4th amendment:

**********
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

**********

The consensus seems to be that, on one hand, it is a violation of constitutional rights, but on the other hand, Apple must have vetted this with some well-respected experts.

The fourth amendment applies to the government and not private entities like Apple.

Also, you can authorise anyone you want to search your belongings, which is what you will be doing if you use iCloud Photo Library in the future.
 
Apple’s system doesn’t need a back door. It has a massive front door that’s wide open.

Want to know what’s on someone’s phone? Okay well just replace this CSAM database with whatever other database you’re interested in. Done.

You know what would be a much easier solution, one which Apple could have used for a decade?

Turn on iCloud backup secretly and you get access to almost everything.

Why have you not worried about iCloud backup?
 
Maybe it's time to support open-source smartphones with a focus on privacy and sustainability.

At the moment there is too much power in the hands of a few tech companies.
 
Exactly. They're not scanning iCloud anymore; they're doing something worse, which is building a technology that is capable of scanning your stuff on YOUR property irrespective of whether or not you have iCloud enabled. This sets a worse precedent than cloud scanning.

How about iCloud backup? Same problem there.

It's capable of copying almost everything you have on your phone and Apple can do it secretly by turning on iCloud backup without telling you. Once in iCloud they can turn it over to the police.
 
Sticking some backdoor checker into my phone against my will = bye bye iPhone, hello dumb phone. I've used smartphones for a decade now, but this is the last straw for me. These companies (hardware and software in nature) have been slowly removing my privacy, little by little, for years now, and it's obvious that it's not going to stop until some sort of federal court makes them stop. And which court is going to do that? Welcome to oblivion, people.
 
How is it increased? They know MORE about the content of your images.

There are probably political pressures on Apple to do something about these kinds of images. They reported very few cases last year compared to Facebook and Google.

They might at some point in the future be forced to scan everything in iCloud, which they don't like. Faced with this possibility, they developed this system where they scan on device for users of iCloud Photo Library.

If your photo is going to be scanned anyway, it doesn't matter where the scanning happens. If it happens in the cloud, you have no control and can't even see indirectly what's happening.

This system will also allow Apple in the future to implement end-to-end encryption for iCloud and still be able to say that there is very little child pornography in iCloud, thus alleviating some of the pressure from the US government, which wants full access to iCloud and has it today through the legal system.
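Very loosely, the shape of the argument looks like this. This is a toy sketch only: the real system uses NeuralHash, private set intersection and threshold secret sharing, none of which is reproduced here, and the names below are invented for illustration:

```swift
import Foundation

// Toy stand-in for on-device matching: a plain hash set and a counter
// replace Apple's actual cryptography, purely to illustrate the idea that
// matching can happen before upload, so the photos themselves could be
// end-to-end encrypted.

struct MatchVoucher { let assetID: String }   // opaque to the server

/// Match each photo's perceptual hash against the known-CSAM set on device.
func vouchers(for photoHashes: [String: String],      // assetID -> perceptual hash
              knownCSAMHashes: Set<String>) -> [MatchVoucher] {
    return photoHashes.compactMap { entry in
        knownCSAMHashes.contains(entry.value) ? MatchVoucher(assetID: entry.key) : nil
    }
}

/// Only when the number of matches crosses a threshold would anything be
/// escalated for human review; below it the server learns nothing useful.
func shouldEscalate(_ matches: [MatchVoucher], threshold: Int = 30) -> Bool {
    return matches.count >= threshold
}
```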
 
You mean like when Apple forced people who do not use iCloud to upload their photos without their consent, by enabling iCloud Photo Stream by default (without asking the user's permission)?

It doesn't matter how you try to twist it. Apple tried to send a very specific message about privacy all those years.

If you do not mind, please answer this: why do you think Apple should protect children from child pornography? Do you think it's Apple's role?

If yes, should it protect them from searching other topics, like suicide, guns, etc.? I would really appreciate it if you would answer those questions.
 
Again, for the 100th time.

If you are talking slippery-slope hypotheticals: WHY hasn't this already happened, considering that cloud storage, and tech companies including Apple giving cloud data to other countries, has been happening since 2005? Why would governments find something NEW to scan when the data available to scan with on-device AI is an exact copy of the cloud data, and all the other data is still locked away by the user with the Secure Enclave?

No one is answering this question directly, because it collapses their narrative: fundamentally there is ZERO difference between cloud and device data available to scan. There is ZERO new data for the device to scan compared to cloud data, and there are no new ways to scan that governments haven't already requested for the cloud.

So, your concern about scanning video: why hasn't video scanning been requested for clouds already? HINT: it has already been scanned on cloud drives.
Again, for the 100th time.

Your tunnel vision amazes me.

Slippery slope is basically a form of future prediction based on history. It's not always right, but machine learning technology trivializes the creation of new tools or expands the functionality of existing ones. Also, iCloud backup isn't really a one-to-one device backup either, and some choose not to enable it, whereas a local scan will always be able to find the data because, surprise, it is on the device.

No one answers your "question" because we have moved on from questioning the detail to questioning the creation of the tool itself and the possible consequences of such a move. Just to try to wake you up: we are now talking about PEOPLE instead of just technology, because guess what? The PEOPLE behind the technology can make either good or bad use of it. In this case, government agencies and hostile nations.

But OK, in case you ignore that (which I believe you will): users can choose not to include some app data as part of the iCloud backup, meaning there is a chance that what Apple gets is not exactly what is on the device.

I am not saying video hasn't been scanned in the cloud already; otherwise YouTube copyright strikes would not be a thing at all. I am saying that video stored LOCALLY could also be scanned in the same manner for "illegal" material. The next step would be either that said video gets deleted automatically ON THE DEVICE or, better yet, that such video is prevented from being taken at all. Apple's Live Text is one such use, as is Google's camera translation feature. The technology is already there for a future where you would not even be able to take pictures of critical moments or certain objects; it is just not really enabled yet.

But OK, OK, OK. You will ignore all of this, which is fine. Hopefully someone else has a better idea of what I am worried about.
 
What I don't understand is why Apple chose to build this feature in the first place. They have always been about privacy, and even took measures to make it easier for them to fight government requests for information. These features don't benefit the end users (although I suppose one can make a case for the sexting detection for parents), and they completely open Apple up to government requests. Where did this come from?

There is probably pressure from several entities for Apple to start actively scanning iCloud, especially for this kind of material.

This increases the chances that they won't need to do that in the future.
 
You mean like when Apple forced people who do not use iCloud to upload their photos without their consent, by enabling iCloud Photo Stream by default (without asking the user's permission)?

It doesn't matter how you try to twist it. Apple tried to send a very specific message about privacy all those years.

If you do not mind, please answer this: why do you think Apple should protect children from child pornography? Do you think it's Apple's role?

If yes, should it protect them from searching other topics, like suicide, guns, etc.? I would really appreciate it if you would answer those questions.
You will not get what you want, and I am not joking.
 
Also, iCloud backup isn't really a one-to-one device backup either, and some choose not to enable it, whereas a local scan will always be able to find the data because, surprise, it is on the device.

The problem is that Apple can easily turn it on in the background without telling you. And they can make subtle changes to it to copy every file.

Or Apple could combine the detection algorithms in Photos with iCloud backup.

There are so many more effective ways Apple could do this and no one is worried about those.
 
Can I please clarify this, as I've been listening to this topic all week on podcasts.

There are three stages to this, and the one part which seems to be considered the worst is this one:

If a child (when I say child, I mean a child as classified by the current laws of whatever country you care to pick) decides, for whatever reason, to take an image of some body parts to send to a close friend, then in the future their iPhone will scan and detect the private photo they took, and if this scanning thinks it has detected something "naughty", then the image will be sent to a human at Apple to confirm whether it is indeed a photo that is actually "naughty".
And then action(s) can/may be taken?

That's correct, isn't it?
 