
MacRumors

macrumors bot
Original poster
Apr 12, 2001
59,680
23,850


In protest of the company's now-delayed CSAM detection plans, the EFF, which has been vocal about Apple's child safety plans in the past, flew a banner over Apple Park during the iPhone 13 event earlier this month with a message for the Cupertino tech giant.

[Image: The EFF's banner plane flying over Apple Park]

During Apple's fully digital "California streaming" event on September 14, which had no physical audience in Cupertino and instead consisted of pre-recorded segments streamed online, the EFF flew a plane over Apple Park with the message "Apple: Don't scan our phones! EFF.ORG/APPLE."

The EFF says it opted to use this form of "aerial advertising" to make sure that Apple's CSAM plans don't "fade into the background" and that Apple "hears" them. The EFF also flew the same banner over 1 Infinite Loop, Apple's previous headquarters that it largely vacated four years ago.

[Image: The EFF's banner over Apple Park]

Apple announced in August its plans to use on-device machine learning and its custom-built "NeuralHash" system to detect known CSAM images in iPhone users' photo libraries. Following the announcement, privacy advocates and groups, including the EFF, were vocal about the system's potential privacy risks.

Unlike Google and other companies that scan for CSAM, or child sexual abuse material, in the cloud, Apple's system uses on-device processing to identify CSAM images. The EFF is, however, unsatisfied and has previously called on Apple to abandon its plans entirely.
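
In rough terms, the on-device check is a fingerprint lookup: each photo is hashed, the hash is compared against a database of known CSAM hashes, and Apple has said an account would only be flagged for review after crossing a match threshold. Here is a minimal sketch of that idea, with hypothetical names throughout; SHA-256 stands in for NeuralHash, which is actually a perceptual hash:

```python
import hashlib
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # Apple described a threshold on the order of 30 matches

def perceptual_hash(image_bytes: bytes) -> str:
    # Placeholder: Apple's NeuralHash is a neural-network-based perceptual
    # hash that tolerates resizing and re-encoding; a SHA-256 digest is
    # used here only to keep the sketch runnable.
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(photos: Iterable[bytes], known_hashes: Set[str]) -> int:
    # Only fingerprints are compared on-device; the photos themselves
    # are not inspected or uploaded at this step.
    return sum(perceptual_hash(p) in known_hashes for p in photos)

def account_flagged(photos: Iterable[bytes], known_hashes: Set[str]) -> bool:
    return count_matches(photos, known_hashes) >= MATCH_THRESHOLD

# Example: an ordinary library matches nothing in an empty database.
library = [b"cat-photo bytes", b"vacation-photo bytes"]
print(account_flagged(library, set()))  # False
```

Apple's published design also wraps the comparison in cryptographic machinery (safety vouchers and threshold secret sharing) so that matches are only revealed once the threshold is crossed; the sketch above ignores all of that.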

On September 3, Apple announced it would be delaying CSAM detection, which was meant to roll out later this fall, to "collect input and make improvements before releasing these critically important child safety features." The EFF, in a blog post, says it will independently be holding events with "various groups" to collect research and suggestions, some of which it says could be helpful to the tech giant amid the delay.
Now that Apple's September event is over, Apple must reach out to groups that have criticized it and seek a wider range of suggestions on how to deal with difficult problems, like protecting children online. EFF, for its part, will be holding an event with various groups that work in this space to share research and concerns that Apple and other tech companies should find useful.
Apple's child safety plans, besides CSAM detection, also include enhanced protections for children against unsolicited images. To learn more about Apple's plans, read our guide.

Article Link: EFF Flew a Banner Over Apple Park During Last Apple Event to Protest CSAM Plans
 

sh20

macrumors newbie
Oct 30, 2019
17
118
London
I don’t understand Apple on this one. They have been pitching themselves as the privacy-friendly company amongst the tech giants and still claim that as one of their advantages, yet they're working on a feature that scans all your photos? It just doesn’t make sense to me.
 

matrix07

macrumors 604
Jun 24, 2010
7,646
4,450
I don’t understand Apple on this one. They have been pitching themselves as the privacy-friendly company amongst the tech giants and still claim that as one of their advantages, yet they're working on a feature that scans all your photos? It just doesn’t make sense to me.
I chuckled every time Apple said “Privacy” in their iPhone keynote.
 

LV426

macrumors 68000
Jan 22, 2013
1,659
1,857
I don’t understand Apple on this one. They have been pitching themselves as the privacy-friendly company amongst the tech giants and still claim that as one of their advantages, yet they're working on a feature that scans all your photos? It just doesn’t make sense to me.
Scanning photos on-device isn't an issue, in my opinion. The phone can happily do that, as it has done for years, so that it can identify my cat pictures or my friends.

Where I take issue is scanning photos and sending a red flag warning back to Apple HQ so that the Thought Police can investigate further. As so many have pointed out, this won't be watertight or safe.
 

anxious-koi-5855

macrumors newbie
Sep 24, 2021
7
7
One moment they're telling you how reliably the iPhone protects your privacy, all while doing everything they can to get you into the store every year for another iPhone.
And once you're fully inside their ecosystem, it turns out your iPhone can become an excuse for the police to come to you, even if you did nothing and were simply hacked, with a couple of photos uploaded to your iCloud.
 

Michael Scrip

macrumors 604
Mar 4, 2011
7,697
11,872
NC
Unlike Google and other companies that scan for CSAM, or child sexual abuse material, in the cloud, Apple's system uses on-device processing to identify CSAM images. The EFF is, however, unsatisfied and has previously called on Apple to abandon its plans entirely.

So cloud scanning is still OK according to the EFF?

You'd think they wouldn't like that either!

Scanning is scanning.

Whether your photos are on your device... or on a server somewhere... it's still scanning.

Device Scanning: "Don't violate my privacy!"
Cloud Scanning: "Come on in! Look at all my photos!"

:oops:
 

rachel_norfolk

macrumors newbie
May 18, 2021
5
35
King's Lynn, Norfolk, UK
So cloud scanning is still OK according to the EFF?

You'd think they wouldn't like that either!

Scanning is scanning.

Whether your photos are on your device... or on a server somewhere... it's still scanning.

Device Scanning: "Don't violate my privacy!"
Cloud Scanning: "Come on in! Look at all my photos!"

:oops:
Strangely, this does make sense for the EFF. They are all about the ownership and control of the software that runs on the things we own. So software running on our phones that we can't control falls squarely within the EFF's concerns.

Cloud services are not really part of their remit - but there are plenty of other orgs who would be interested in that!
 

Pillbory

macrumors newbie
Mar 16, 2021
5
-2
What part of this is NOT a good idea? You do realise, don't you, that your own precious cat, dog, or family photo will not actually be looked at? It simply compares the hash of your photos against already-identified hashes of indecent images of children. It doesn't actually look for what it thinks are indecent images, just the hash!
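
To picture what "just the hash" means, here is a rough, hypothetical Python illustration (not Apple's code). Note that a cryptographic digest like SHA-256 changes completely if a single byte of the file changes, which is why Apple built NeuralHash as a perceptual hash, designed to produce matching fingerprints for resized or re-encoded copies of the same image:

```python
import hashlib

original = b"family photo bytes"
resaved = original + b"\x00"  # a one-byte difference, e.g. re-saving the file

# A cryptographic hash changes completely on any modification:
print(hashlib.sha256(original).hexdigest()[:16])  # one fingerprint
print(hashlib.sha256(resaved).hexdigest()[:16])   # an entirely different one

# The comparison itself is only a set-membership check on fingerprints;
# no photo content is viewed:
known_fingerprints = {hashlib.sha256(b"known bad image bytes").hexdigest()}
print(hashlib.sha256(original).hexdigest() in known_fingerprints)  # False
```

The trade-off is that a perceptual hash, unlike SHA-256, is deliberately fuzzy about transformed copies of an image.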
 

singularity0993

macrumors regular
Oct 15, 2020
138
680
What part of this is NOT a good idea? You do realise, don't you, that your own precious cat, dog, or family photo will not actually be looked at? It simply compares the hash of your photos against already-identified hashes of indecent images of children. It doesn't actually look for what it thinks are indecent images, just the hash!
Did you even read EFF’s statements?
 

sullixan

macrumors newbie
Sep 24, 2021
5
42
I think we should allow Apple to implement this software! What could go wrong? I believe in them. They will never be pressured by any government to scan our images. Just like they won't be pressured to create software that displays our personal medical information (inoculation) for strangers (businesses) to view. Or having a judge rule they must unlock a person's cell phone due to a shooting. In case anyone disputes this: although Apple "drew the line" at assisting with unlocking the shooter's cell phone, Apple still provided "gigabytes of information" to investigators, including "iCloud backups, account information and transactional data for multiple accounts."

Hopefully, if you've made it this far, you've recognized the sarcasm. Call it conspiracy, but the Government has been known to create a crisis just to take away our freedoms. All the Government has to do, and will do, is create a crisis once Apple has implemented this feature, to surveil further. Do your research and look up the "Lawful Access to Encrypted Data Act". They WANT and NEED to get into our phones, and into our lives. It has nothing to do with safety, and everything to do with compliance and control.

Safety is for you and me to control, not a business, not the Government. All our safeguards are located in the Constitution.

Continue to fight against this surveillance state!
 

russell_314

macrumors 603
Feb 10, 2019
5,370
7,669
USA
I think Apple will skip this one. The risk for them is too great. It makes no sense. Why use the photo library of nearly a billion devices to catch a few criminals? Let the people who get paid by our taxes do their job without using our devices.
I don’t think so. This has nothing to do with CSAM but rather with pressure from governments like the USA and China to gain access to iPhones. Both countries are having problems with terrorists and their own citizens going against the government, so they want to be able to monitor what is going on more closely.

I think Apple will just quietly implement the feature once the media frenzy has died down. I’m fairly certain the code is already in iOS 15, so it’s probably just a matter of throwing a switch remotely to activate it. There is no indication on the user's phone, so you wouldn’t know whether it was running or not.
 

sullixan

macrumors newbie
Sep 24, 2021
5
42
We’ve all given up so much data already to tech corporations, which the United States government kindly uses anytime it pleases. It’s funny that people are suddenly questioning privacy now. I’m sorry, folks, that ship sailed a long time ago.
I completely understand. I even agree, partly. But this is how they continue taking what they want: with the attitude of "they already have it all," as if there isn't any room left for us to stop the continuation of their intrusion. I know plenty of people, personally and from afar (online presence), who fight hard to keep our data secure. Hope isn't lost, and it's better to be a part of those resisting, even if small in number, than to just let someone take control of something you own.
 

russell_314

macrumors 603
Feb 10, 2019
5,370
7,669
USA
I think we should allow Apple to implement this software! What could go wrong? I believe in them. They will never be pressured by any government to scan our images. Just like they won't be pressured to create software that displays our personal medical information (inoculation) for strangers (businesses) to view. Or having a judge rule they must unlock a person's cell phone due to a shooting. In case anyone disputes this: although Apple "drew the line" at assisting with unlocking the shooter's cell phone, Apple still provided "gigabytes of information" to investigators, including "iCloud backups, account information and transactional data for multiple accounts."

Hopefully, if you've made it this far, you've recognized the sarcasm. Call it conspiracy, but the Government has been known to create a crisis just to take away our freedoms. All the Government has to do, and will do, is create a crisis once Apple has implemented this feature, to surveil further. Do your research and look up the "Lawful Access to Encrypted Data Act". They WANT and NEED to get into our phones, and into our lives. It has nothing to do with safety, and everything to do with compliance and control.

Safety is for you and me to control, not a business, not the Government. All our safeguards are located in the Constitution.

Continue to fight against this surveillance state!
To be fair, Apple has a legal obligation to turn over whatever information it is capable of accessing when a warrant is issued. This applies in every country where they operate, whether that's the USA, China, or Saudi Arabia, for instance.

If it’s a shooter, then that’s a reasonable request, because investigators want to find out more information for the investigation. Apple didn’t refuse but said it was incapable of unlocking the iPhone or accessing data on the iPhone. If they had the capability of accessing the data, they would’ve had to turn it over. They are able to access data on iCloud, so they turned over whatever data they had access to.

Where this gets sketchy is when countries like China or Saudi Arabia, or perhaps even the USA, request data on citizens they feel are a political threat. With Saudi Arabia it might be women demanding rights, with China it might be the Uighur Muslim population, with the USA it might be Trump supporters. You can agree or disagree with any one of those specific groups, but there’s always a possibility you might get added to a new group, so while you may dislike group X, you’ve become part of group Y. It won’t be Apple’s decision to say, "OK, you can investigate the Trump supporters but leave the Uighur Muslims alone."
 

urbZZ

macrumors member
Sep 1, 2016
36
152
... I’m fairly certain the code is already in iOS 15, so it’s probably just a matter of throwing a switch remotely to activate it. There is no indication on the user's phone, so you wouldn’t know whether it was running or not ...
That is why I will not upgrade. We have already seen a lower adoption rate for iOS 15; guess why. And I hope it's not because a button has been moved or because people are no longer forced to upgrade. Of course, they could also inject this into iOS 14 at any time, but this is about making a statement to Apple.
I love my Apple devices, but Apple has grown seriously unappealing to me in recent months. Their culture of ignoring customers and their handling of mistakes is very bad for the brand. Why are they doing this?
 