I was just listening to an interview with Tim Cook on the podcast Sway; this product is against so much of what he claimed during that interview. I would love to hear someone ask him in an interview how they have backpedaled from privacy and how they reconcile this move to a surveillance infrastructure. This is a police-state product, and Apple should have nothing to do with it.
 
I don’t think Apple is stopping this. There’s been so much blowback, they’ve ruined their privacy reputation, this scanning will make them no money, and yet they’ve refused to stop. Keep pushing, people, but I don’t think it’s working. Apparently the hashing code has been in iOS since 14.3, so this has been planned for a long time. I’ll be voting with my wallet.
 
Why make a huge drama out of something that's very simple? Nobody is looking at your photos.

Privacy doesn't give anyone the right to commit a crime.

People do deserve their privacy. This is not like having a camera watching their every move. It's just an AI filter that checks for illegal images, images used to commit a crime. If it finds any, it will raise a flag.
Apple is trying to protect themselves from being accused of protecting criminals and getting dragged into a lawsuit.
There are people who think that by using an Apple device they can take and store illegal pictures.

Governments want Apple to create a backdoor, something that would affect the security and privacy of the devices. What Apple is trying to do is implement a non-invasive way to help combat crime, so certain governments and haters don't complain that Apple protects criminals.

Maybe as you suggest, Apple should put this to rest, and let people continue with their business causing pain and suffering to helpless children. It may not bother some people at all, until it happens to someone close to them. Then they will be actively in favor of what Apple is trying to do here.

Just like with the use of a face mask. People are complaining, saying "it's their body and their right not to wear a mask," without any concern for the people around them. The mask is to protect those around you and to protect yourself.
Same thing in schools now: mothers against having their children wear masks "because it's their right not to do so." This is ridiculous!
Just wait until they or their children get sick with COVID-19; their minds will change and they will be rallying in favor of wearing the mask. It's already happening, just search on YouTube. Many of those against it, or those who used to say that COVID-19 doesn't exist, are now pleading with people to take it seriously.

“Nobody has the intention of building a wall” - Walter Ulbricht - June 15, 1961

Construction of the Berlin Wall was commenced by the German Democratic Republic (GDR, East Germany) on 13 August 1961.
 
Privacy doesn't give anyone the right to commit a crime.
Let's strip search random people on the streets then. You're bound to find some illegal drugs or weapons!

Governments want Apple to create a backdoor, something that would affect the security and privacy of the devices. What Apple is trying to do is implement a non-invasive way to help combat crime, so certain governments and haters don't complain that Apple protects criminals.
Apple has built that back door. The FBI just has to contact their partners to get pictures added to the databases, and have an NSL sent to Apple ordering them to report certain pictures to the FBI.
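To make the objection concrete: hash-list matching is just membership testing against a database the device owner never controls. A toy sketch, not Apple's actual system (a real deployment uses a perceptual hash like NeuralHash so near-duplicates still match, and blinds the database; all names and data here are made up for illustration):

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; SHA-256 only matches exact bytes,
    # which is enough to show the control-flow of the scheme.
    return hashlib.sha256(image_bytes).hexdigest()

# Database supplied by a third party (e.g. NCMEC); the client only ever
# sees hashes, never the underlying images.
blocklist = {image_hash(b"known-bad-image")}

def scan(photos: list[bytes]) -> list[int]:
    """Return indices of photos whose hash appears in the blocklist."""
    return [i for i, p in enumerate(photos) if image_hash(p) in blocklist]

photos = [b"cat picture", b"known-bad-image", b"beach sunset"]
flagged = scan(photos)  # index 1 matches the blocklist

# The crux of the objection above: the client code never changes when
# new entries are added. Whoever controls the database controls what
# the exact same scanner flags.
blocklist.add(image_hash(b"political meme"))
```

Nothing about the matching step distinguishes one kind of content from another; the filter is only as narrow as the database it is shipped.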
Just like with the use of a face mask. People are complaining, saying "it's their body and their right not to wear a mask," without any concern for the people around them. The mask is to protect those around you and to protect yourself.
Same thing in schools now: mothers against having their children wear masks "because it's their right not to do so." This is ridiculous!
Just wait until they or their children get sick with COVID-19; their minds will change and they will be rallying in favor of wearing the mask. It's already happening, just search on YouTube. Many of those against it, or those who used to say that COVID-19 doesn't exist, are now pleading with people to take it seriously.
I'm with you there, but you have to put things into perspective. COVID has killed 600,000+ people in the US alone, and kills 1,000 more people every day at the moment. And that's AFTER all the mitigation efforts.
 
If what you say is true, then why fight crime?
• Let's just get rid of Law Enforcement, because it's not going to make any difference.
• Don't put locks on your doors, because the burglars will break them anyway.


People downloading these images, either for a price or for free, are encouraging those who abuse children to provide more material.
By your logic, it shouldn't be illegal to carry a controlled substance with you as long as you don't give it to someone else. But it is!

I carry controlled substances on my person all the time. They’re called prescription drugs.

And once again, if cameras didn’t exist, do you think child abuse would not exist? And the whole claim that people looking at these photos encourages more abuse is total hogwash. The abuse has existed as long as humanity has existed, photos or not. This CSAM scanning crap does not prevent child abuse from occurring in any way. Like I said, you could snap your fingers and make cameras cease to exist. Child abuse will still occur.
 
There is a huge practical and conceptual difference here. Up until now, all the scanning/AI was for the user's benefit; it made the phone more useful. This does not specifically benefit the user at all (general societal benefit aside). There was also zero information shared off the device; Apple, however small and limited the change, has now changed that.
Turn off iCloud Photo and nothing is shared off the device.

I guess the question is, what is a better method of preventing the distribution of child abuse images on online services? Mass scanning of all images in the cloud, with opaque algorithms, by Google, Microsoft, and Facebook? Currently, Apple identifies far less of this harmful and illegal content than other cloud providers.
 
So, before this announcement, Apple was trusted with "scanning" all photos for associations and locations on every iOS device? Are people less trustful of the more limited scenario of Apple "scanning" for images that are identified as child abuse images and, importantly, limited to iCloud Photos?

To your point, I personally believe that Apple has earned my trust, and it is clear they have attempted to develop a privacy-centric solution. But more than that, I think their solution allows for easier auditing, since it IS happening on device and NOT in the cloud like every other cloud storage provider's.
Apple scans my photos for cats for my own benefit. Scanning my photos to notify the cops is not for my benefit.
 
I am done with this company, and I fully understand that the majority of normal users will accept this just because.
I am sharing this in the hope of helping the minority of loyal Apple users (like me, until this stupidity) understand that it's time to tame our collective addiction to "conveniences" and learn an important lesson: never trust your private data to a closed software/hardware company again.
I'm also in the process. macOS has already been replaced by Linux, Safari by Firefox, and iCloud by pCloud. I'm just waiting on my replacement phone to ship from eFoundation to complete the migration. Most Apple users have nothing to hide, but they might have something to stand up for. This backlash is for nothing if Apple knows they can just wait out the bad press until people move on to something else. I don't think refusing software updates will do much, as they can eventually find ways to force that issue. If the profits start to recede, then Apple might actually start to panic a little. I just get this strange feeling that there's more to this story, as Apple seems very committed to implementing this feature and isn't backing down.
 
Why make a huge drama out of something that's very simple? Nobody is looking at your photos.

Privacy doesn't give anyone the right to commit a crime.

People do deserve their privacy. This is not like having a camera watching their every move. It's just an AI filter that checks for illegal images, images used to commit a crime. If it finds any, it will raise a flag.
Apple is trying to protect themselves from being accused of protecting criminals and getting dragged into a lawsuit.
There are people who think that by using an Apple device they can take and store illegal pictures.

Governments want Apple to create a backdoor, something that would affect the security and privacy of the devices. What Apple is trying to do is implement a non-invasive way to help combat crime, so certain governments and haters don't complain that Apple protects criminals.

Maybe as you suggest, Apple should put this to rest, and let people continue with their business causing pain and suffering to helpless children. It may not bother some people at all, until it happens to someone close to them. Then they will be actively in favor of what Apple is trying to do here.

Just like with the use of a face mask. People are complaining, saying "it's their body and their right not to wear a mask," without any concern for the people around them. The mask is to protect those around you and to protect yourself.
Same thing in schools now: mothers against having their children wear masks "because it's their right not to do so." This is ridiculous!
Just wait until they or their children get sick with COVID-19; their minds will change and they will be rallying in favor of wearing the mask. It's already happening, just search on YouTube. Many of those against it, or those who used to say that COVID-19 doesn't exist, are now pleading with people to take it seriously.

I believe you missed my point.

I'm not arguing for or against this. I've made my position known in earlier threads.

You mentioned what "people think" and "haters". That's my point. This is no longer about the technology; it's about the perception. Apple flubbed this TOTALLY.

I absolutely guarantee you all the perverts and pedos who would get caught by this already know about it, and how to avoid it. So it's kind of a moot point as far as catching those types.

I can see where the technology could be used against political dissidents, LGBTQ people, and other minority groups in not-so-human-rights-friendly regimes and countries.

My phone is full of pictures of my cats, me and my wife at the beach, trains, and my motorcycle, so I don't care personally WHO sees my pictures. I would hand my phone to anyone, anywhere and not worry what they find including text messages, pictures, and mapping info. I lead a very boring life. More so than anyone could ever imagine possible.

My point is, this is a PR failure, and the reward for the damage done is minimal. It's not a judgement call or a technological assessment. It's just bad PR, especially when you're all about privacy, privacy, privacy as a company.


This is the sort of hubris and lack of intuition about the demands of the marketplace that damned companies like Sears and Bear Stearns.

It's a business administration observation, not a technical or moral judgement.
 
I guess the question is, what is a better method of preventing the distribution of child abuse images on online services? Mass scanning of all images in the cloud, with opaque algorithms, by Google, Microsoft, and Facebook? Currently, Apple identifies far less of this harmful and illegal content than other cloud providers.

Only an Apple apologist can claim Apple's implementation isn't "mass scanning": all future iPhone camera roll photos (plus all previously uploaded iCloud photos) vs. all cloud-uploaded photos. There isn't much of a distinction there. Further, its convoluted structure doesn't make it any less opaque, especially since it uses the same NCMEC database.

Apple is trying to "put lipstick on a pig" by taking what everyone else is doing, making it even more complex, and then putting it on your phone. And somehow that is inherently better? Nope.
 
Well, I guess it's due to US laws.

If you want privacy, US laws are not that convenient, and not only regarding CSAM. For example, one of the selling points of this Swiss-based cloud service is: "NSA non-compatible".


My point being: the discussion about privacy is important, but it shouldn't focus only on this on-device, client-side mechanism. How could server-side scanning be better in any way?
First big difference: by just knowing your phone number, someone could send matching images to your device. There are apps that would put those images into Photos, which then syncs to iCloud. Not much fun...
Second big difference: Your iPhone may be more easily hacked than iCloud.
 
The previous poster implied that people would not sign such protest letters if they had read Apple's documentation. However, people have read and understood* the documents and still protest. Experts alone protesting would not prove much, but the objections are easy to understand, and the documents do not invalidate them.

Claiming that protestors just lack understanding plays into Apple's people-are-confused ruse. People know and understand what Apple intends to do; that is why they protest. That is also why adding more documentation without addressing the core objections comes off as a very blatant diversionary tactic: people do not care whether the gun operator has good references or whether the safety fails only once in a trillion times; they simply do not want that gun pointed at them.

*As far as the documents allow - Apple glosses over many critical aspects of the system. The system is objectionable under any interpretation, though. More details might make it even worse.
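As an aside on that "1 in a trillion" figure: it is an account-level rate, achieved by requiring a threshold of matches before anything is reported, and the arithmetic behind such a claim is just a binomial tail. A sketch with hypothetical numbers (the per-image false-match rate of one in a million and the threshold of 30 are my assumptions for illustration, not Apple's published parameters):

```python
from math import comb

def p_account_flagged(n_photos: int, p_false: float, threshold: int) -> float:
    """P(X >= threshold) for X ~ Binomial(n_photos, p_false): the chance
    that enough photos falsely match to flag the whole account."""
    # First term (j = threshold) computed directly; subsequent terms via
    # the pmf ratio, stopping once they underflow to zero. Terms shrink
    # very fast when n_photos * p_false is far below the threshold.
    term = comb(n_photos, threshold) * p_false**threshold \
           * (1 - p_false) ** (n_photos - threshold)
    total = term
    for j in range(threshold + 1, n_photos + 1):
        term *= (n_photos - j + 1) / j * p_false / (1 - p_false)
        if term == 0.0:
            break
        total += term
    return total

# Hypothetical account: 10,000 photos, 1-in-a-million per-image false
# match, report only after 30 matches. The result is astronomically
# small, which is the kind of "safety" claim referred to above.
prob = p_account_flagged(10_000, 1e-6, 30)
```

The thresholding, not the per-image accuracy, is what drives the account-level number down; none of this addresses the objection about what the database contains.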

You aren't thinking different enough? (A riff on "you're holding it wrong".)
 
Apple scans my photos for cats for my own benefit. Scanning my photos to notify the cops is not for my benefit.
My point was that you trust that the scans of your photos will remain for your benefit. Don't all the same "slippery slope" arguments apply to the scanning that Apple is already doing? And, further, it is doing that for all photos, not just those being uploaded to iCloud Photos.
 
Lots of "maybes," but no agreement on reducing child pornography or on better ways it could be done.
If you are going to complain, at least suggest improvements. Continuing to do nothing does not improve the lives of child victims.
 
I carry controlled substances on my person all the time. They’re called prescription drugs.

And once again, if cameras didn’t exist, do you think child abuse would not exist? And the whole claim that people looking at these photos encourages more abuse is total hogwash. The abuse has existed as long as humanity has existed, photos or not. This CSAM scanning crap does not prevent child abuse from occurring in any way. Like I said, you could snap your fingers and make cameras cease to exist. Child abuse will still occur.
There are a few things about your reply that seem problematic to me, but I will focus on just this one. These processes by cloud services are not to prevent all child abuse. (I wish that were possible!)

These are solutions to prevent the mass distribution of illegal and harmful content. Inexpensive digital cameras and free or cheap cloud storage and sharing mean that this content is far more pervasive and far-reaching.
 
Where’s the warrant for explosive-sniffing devices at airports?
Where’s the warrant when doctors report a gunshot wound or a beaten wife?
Where’s the warrant when my bank reports me because somebody wired me 200k€ from abroad or because I deposited 10k€ in cash?
Where’s the warrant when Google/Facebook/MS/others mass-scan my private pictures (not actively shared with others) for CSAM on their clouds?

Sometimes companies have to report illegal activities happening on their premises, be it virtual premises or physical ones.
The difference: this is on my property, not theirs. And I have the presumption of innocence in my own house.
 
There are a few things about your reply that seem problematic to me, but I will focus on just this one. These processes by cloud services are not to prevent all child abuse. (I wish that were possible!)

These are solutions to prevent the mass distribution of illegal and harmful content. Inexpensive digital cameras and free or cheap cloud storage and sharing mean that this content is far more pervasive and far-reaching.

And mass surveillance technology doesn’t change a thing.
 