

Apple's plans to scan users' iCloud Photos libraries against a database of child sexual abuse material (CSAM) to look for matches, and to scan children's messages for explicit content, have come under fire from privacy whistleblower Edward Snowden and the Electronic Frontier Foundation (EFF).


In a series of tweets, the prominent privacy campaigner and whistleblower Edward Snowden highlighted concerns that Apple is rolling out a form of "mass surveillance to the entire world" and setting a precedent that could allow the company to scan for any other arbitrary content in the future.



Snowden also noted that Apple has historically been an industry leader in digital privacy, and even refused to unlock an iPhone owned by Syed Farook, one of the shooters in the December 2015 attack in San Bernardino, California, despite being ordered to do so by the FBI and a federal judge. Apple opposed the order, noting that it would set a "dangerous precedent."

The EFF, an eminent international non-profit digital rights group, has issued an extensive condemnation of Apple's move to scan users' iCloud libraries and messages, saying that it is extremely "disappointed" that a "champion of end-to-end encryption" is undertaking a "shocking about-face for users who have relied on the company's leadership in privacy and security."

The EFF highlighted how various governments around the world have passed laws that demand surveillance and censorship of content on various platforms, including messaging apps, and warned that Apple's scanning of messages and iCloud Photos could be legally required to encompass additional material or could easily be widened. "Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement," the EFF cautioned. See the EFF's full article for more information.

The condemnations join a growing number of concerns voiced by security researchers and users on social media since Apple announced the changes yesterday, which have already triggered petitions urging Apple to roll back its plans and reaffirm its commitment to privacy.

Article Link: Privacy Whistleblower Edward Snowden and EFF Slam Apple's Plans to Scan Messages and iCloud Images
Here’s what I find problematic: How do they know the intentions of the photographer?

I have two kids, and like many, many other parents, when they were babies and toddlers I used to take pics of them playing in the house, outside or in the bath in diapers or underwear, or even nude. Obviously the intention is to capture memories of my kids doing hilarious things and to privately save them for the future. But how would Apple know this? Unless I'm missing something, this is a grotesque overreach and one destined to scoop up innocent people.
 
I keep hearing people say what I bolded in your post above, and I'm trying to figure out why you guys are so confused about this. Look at the very part of the article you quoted: "Apple then manually reviews each report to confirm there is a match." So unless you have confirmed child porn images on your device that you then upload to iCloud, your life will not be ruined because nothing will come of false positives. Why would Apple report an innocent image after review? Makes no sense. I don't understand your concern here at all.

The false positive rate is not the probability that Apple will report you for an innocent photo.

How many paedophiles are stupid enough to want to back up their collection to the cloud? Versus how many people would think it funny to mess with someone by adding flagged images to their library?

This could start a new era of digital swatting.

Maybe I'm overestimating the intelligence of paedophiles, but if they're successful enough to buy Apple products and subscribe to iCloud, it seems like they'd know enough to turn off auto upload.
 
Here’s what I find problematic: How do they know the intentions of the photographer?

I have two kids, and like many, many other parents, when they were babies and toddlers I used to take pics of them playing in the house, outside or in the bath in diapers or underwear, or even nude. Obviously the intention is to capture memories of my kids doing hilarious things and to privately save them for the future. But how would Apple know this? Unless I'm missing something, this is a grotesque overreach and one destined to scoop up innocent people.
And who is gonna view these images to ensure the AI is accurately reporting? Actual people??? No f**king way!
 
Here's what I find problematic: How do they know the intentions of the photographer?

I have two kids, and like many, many other parents, when they were babies and toddlers I used to take pics of them playing in the house, outside or in the bath in diapers or underwear, or even nude. Obviously the intention is to capture memories of my kids doing hilarious things and to privately save them for the future. But how would Apple know this? Unless I'm missing something, this is a grotesque overreach and one destined to scoop up innocent people.

It’s not about photos and vids of your kids.

It's about being able to identify everyone who has photos with a forensic signature matching a selected signature in a database. The excuse is "it's for the children".
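
To make the mechanism concrete, here's a minimal sketch of what "signature matching against a database" means. It's an assumption-heavy illustration: I'm using a plain SHA-256 hash as a stand-in for the perceptual "NeuralHash" signature Apple describes, and KNOWN_SIGNATURES is a made-up placeholder for the database Apple says is supplied by child-safety organizations. The real system derives signatures on-device and compares them against a blinded database, but the gist is the same: the content of your photo is never "understood," only its signature is looked up.

```python
# Illustrative sketch only. SHA-256 stands in for the perceptual signature
# Apple describes, and KNOWN_SIGNATURES is a hypothetical placeholder for
# the database of known-image signatures.
import hashlib
from pathlib import Path

KNOWN_SIGNATURES = {
    # hex digests of known images (placeholder value)
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def signature(path: Path) -> str:
    """Compute a content signature for one photo file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flagged(library: list[Path]) -> list[Path]:
    """Return only the photos whose signatures appear in the database.
    A new photo of your own kids produces a signature that isn't in the
    database, so it can never match, whatever the photo shows."""
    return [p for p in library if signature(p) in KNOWN_SIGNATURES]
```

The worry isn't that this misclassifies your bath-time photos; it's that whoever controls the database decides what gets matched.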
 
I keep hearing people say what I bolded in your post above, and I'm trying to figure out why you guys are so confused about this. Look at the very part of the article you quoted: "Apple then manually reviews each report to confirm there is a match." So unless you have confirmed child porn images on your device that you then upload to iCloud, your life will not be ruined because nothing will come of false positives. Why would Apple report an innocent image after review? Makes no sense. I don't understand your concern here at all.
Here's why: I don't want ANYONE looking at or "manually reviewing" my photos. Ever. For any reason. Period.
 
It’s not about photos and vids of your kids.

It's about being able to identify everyone who has photos with a forensic signature matching a selected signature in a database. The excuse is "it's for the children".
I see, I understand what you're saying. Still, don't you think the "manual review" by a human is off-putting? Will they not be reviewing private photos? Because if they are, that's horrible.
 
Apple should reverse course and not give in to politicians who don’t understand the dangers of this kind of rollback on privacy. In the long run this will cost everyone.
 
It has nothing to do with Apple's intentions. We've seen time and again that Apple will compromise on their ideals to "comply with the law" in authoritarian and human-rights-abusing countries. This is a system that can scan every single photo on every single (iOS) device on the planet and compare it against an unreadable and infinitely alterable blacklist of banned images... just imagine what this tech could be leveraged to do in China, Russia, Saudi Arabia, Nigeria, etc. without users' knowledge or consent.

Apple already has the means to do exactly that, yet they don't, and people seem to forget that. This implementation is very specific and targeted; it isn't AI scanning of cloud images.
 
The false positive rate is not the probability that Apple will report you for an innocent photo.

Why the hell would they do that? And even if they did accidentally, by some gross oversight, law enforcement investigators would obviously catch the error ("Hey, this isn't child porn") and you'd never even be contacted by law enforcement, let alone arrested.
 
I don't follow this slippery-slope thought process. He is essentially saying he doesn't trust Apple not to abuse this functionality. You could literally apply this to anything if you believe Apple has ill intentions.

If we believe Apple has some master plan to abuse this, then this opens up every part of their ecosystem to the same "it could be abused if they want to" statement.

Spot on. It's the "anything is possible" trope.
 
My problem is parents taking a picture of a father or mother kissing their child in a cute situation that would be reported under this new system! I sure hope Apple realizes this before it's too late!
Except that is not how this system will work, not at all.
 
I see, I understand what you're saying. Still, don't you think the "manual review" by a human is off-putting? Will they not be reviewing private photos? Because if they are, that's horrible.
I do think the manual review is abhorrent.

But my primary concern remains that the forensic signature matching technology would be used to identify saved images of politically charged memes, politically damning evidence, or just simple wrongthink (e.g. vaccine denial, etc.). All we need is the right crisis to justify it (think the January 6th riot/protest).
 
But my primary concern remains that the forensic signature matching technology would be used to identify saved images of politically charged memes, politically damning evidence, or just simple wrongthink (e.g. vaccine denial, etc.). All we need is the right crisis to justify it (think the January 6th riot/protest).

You really think Apple is that stupid? C'mon.
 
Here's why: I don't want ANYONE looking at or "manually reviewing" my photos. Ever. For any reason. Period.

That wasn't an answer to my question. But if this is a hill you're willing to die on, then I'd suggest not uploading any photos to iCloud. Apple puts the chance of an account being falsely flagged at less than one in one trillion per year, and even then nothing happens unless enough photos are falsely flagged and uploaded to iCloud to cross a threshold (they haven't disclosed the threshold number of flagged photos at which a review happens). Apple employees aren't going to be perusing your iCloud photo library of innocent photos. While I respect your stance on privacy and I agree with it, you're misrepresenting what's actually happening here. Please get some balance instead of thinking emotionally.
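
For what it's worth, here's some back-of-the-envelope arithmetic on why a match threshold makes isolated false positives a non-event. Every number below is an assumption I made up for illustration; Apple has only published the aggregate one-in-one-trillion-per-account-per-year figure and has not disclosed the actual per-photo rate or the threshold.

```python
# Illustrative threshold arithmetic; p, n, and threshold are made-up numbers.
from math import factorial

p = 1e-9           # assumed per-photo false-match probability (hypothetical)
n = 20_000         # photos a hypothetical account uploads to iCloud
threshold = 10     # assumed number of matches required before any review

lam = n * p        # expected number of false matches across the whole library
# Poisson tail approximation for small lam:
# P(at least `threshold` false matches) ≈ lam**threshold / threshold!
prob = lam**threshold / factorial(threshold)
print(f"expected false matches across the library: {lam:.1e}")
print(f"chance of ever reaching the review threshold: {prob:.1e}")
```

Under assumptions like these, a single unlucky false match does nothing at all; nothing is even eligible for review until a whole cluster of matches accumulates.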
 
Good intent, but Apple will have to really work on this and get it right so there are no false accusations and claims against people having pictures of their kids in the bathtub or something.

What happens when you get taken to a police station straight out of work because there was a mistake? Good luck getting your reputation back.

What bugs me more is the fact that there's going to be a new department at Apple of people who will most likely have access to anyone's iCloud, not just flagged accounts.
 