People really need to stop insisting that just because we object to something, we must not understand it. One more time for the people in the back: we understand how it works, and we don’t want our devices spying on us.
How are they spying on you?

They're running an algorithm that doesn't actually analyze the content of your images; it simply checks whether an image's identifier matches known CSAM.

If an image has the same identifier, a flag is raised.

If enough flags are raised (20), a human moderator checks the images to confirm they are CSAM and alerts the authorities.

Apple has the right (and a moral obligation) to ensure child pornography is not hosted on their services. They could do what every other provider does and just scan every single image on their servers, instead they keep it on device and set strict thresholds to try and protect user privacy.
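The flow described above can be sketched roughly as follows. This is a simplified, hypothetical model: a cryptographic hash stands in for Apple's NeuralHash perceptual hash, and in the real system the device compares blinded hashes and uses threshold secret sharing, so neither the device nor Apple learns the match count until the threshold is crossed.

```python
import hashlib

def image_identifier(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash such as NeuralHash; SHA-256 is used
    # here only so the sketch runs (it would NOT survive re-encoding).
    return hashlib.sha256(image_bytes).hexdigest()

def review_needed(library, known, threshold=20):
    # Count images whose identifier matches the known-CSAM list; only when
    # the count reaches the threshold would a human reviewer get involved.
    matches = sum(1 for img in library if image_identifier(img) in known)
    return matches >= threshold
```

Note that 20 is the figure quoted in the post; Apple's published threshold was 30.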
 
The invasion of privacy must be removed... no compromise, no reimagining... removed completely, and the pressure has to be maintained. Don't fall for the trick of "We'll introduce something so onerous it enrages everyone, then later pull it back a little so it appears we are being reasonable." We cannot allow this to happen at all, or they'll just keep pushing it in the future.
 
People really need to stop insisting that just because we object to something, we must not understand it. One more time for the people in the back: we understand how it works, and we don’t want our devices spying on us.
One more time for the people (in the front, I guess?): spying implies you have no idea it's happening -- this wasn't, and never will be, spyware.
 
The question is surely: why does Apple not consult more widely in the first place about this societally impacting issue? Even as an Apple fanboy, I am getting quite cross about Apple's arrogance; it is opening itself up to all sorts of criticism when, with some deference to people outside the glass donut, it might make better, more harmonious decisions.

I can't help thinking Apple is going down this road in order to appease the FBI or NSA, so that it can keep its encryption in place but slowly assist more with dynamic filtering of potentially illegal content. Chilling either way.
I think that some engineers simply thought their hash/encryption stuff was real cool -- they did it with Contact Tracing, they did it with the AirTag stuff, and they were just trying to figure out something else they could hash. Who could be against stopping CSAM? Let's hash that! It'll be awesome and we'll get bonuses. I don't think it was Apple's master plan to destroy their privacy marketing push, just cool math to add to software.
 
[Attached image: what-happens-on-your-iphone.jpg]
 
“Delay” is an underwhelming response, to say the least, and this response from Apple took weeks too long, but a win is a win — after the last two years, I can’t be picky.
 
Ok, they "delayed" it, for now.

The question next would be: will Apple announce the implementation of a "revised" CSAM system? Or go completely radio silent and just install it in a point update without telling anybody, much like "batterygate" a few years ago?

And how much effort will they put into revising it, and to what extent? Would it be mere mitigation or some sort of overhaul (which I doubt)? Will they change the underlying technology?

Last but not least, will the feature be released worldwide once they think they are ready to push the CSAM system out, or will it still start in the US and slowly roll out to other countries?

Not to mention how many of the issues pointed out by researchers, universities, institutions, advocacy groups, etc. will be addressed or ignored.
On-device scanning isn't going to be good for sales, and Apple knows that.
After the Christmas sales they'll reintroduce this with some cosmetic change.
 
I think that some engineers simply thought their hash/encryption stuff was real cool -- they did it with Contact Tracing, they did it with the AirTag stuff, and they were just trying to figure out something else they could hash. Who could be against stopping CSAM? Let's hash that! It'll be awesome and we'll get bonuses. I don't think it was Apple's master plan to destroy their privacy marketing push, just cool math to add to software.
This hashing system has existed for years and was not produced by Apple. It is completely unrelated to Contact Tracing and AirTags.
 
So this will either be quietly released later, or be delayed indefinitely until everyone forgets and it's quietly killed.

Or the third option: they launch it in a less free country that will be far less vocal, and beta test it there before rolling it out in the US. I am against this "feature," but the US was always going to have more of a problem with this kind of feature than, say, China or Australia.
 
🙄 public opinion strikes again
As it should be. This is Apple taking up a mandate that was handed to law enforcement by the people. This is effectively surveillance of the people at large. Even if the scan is initially done on the phone, even if matches go through review, even if it uses public CSAM databases and was built with privacy in mind and with the best intentions, that's what it is. If public surveillance is being built, it needs to be subject to public opinion, not built and run by a private company of unelected people.
 
I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!
You just called more than 90 civil rights groups "self-centered, selfish advocacy groups." Before I list some of those, please keep in mind that this plan of Apple's would have allowed Apple not only to scan the contents of your devices for anything they want to find, without your consent or knowledge, but also to submit to government requests to scan the contents of your devices. Live in a country where anti-government speech is illegal? Apple could be compelled to search for anti-government speech and turn it over to the government. Hong Kong already does this to find dissidents and arrest them. China does this to find and kill Uyghurs.

Some of these "selfish" groups include the ACLU, the Canadian Civil Liberties Association, Australia's Digital Rights Watch, the UK's Liberty, and the entirety of the German parliament, as well as:

Advocacy for Principled Action in Government (United States)
African Academic Network on Internet Policy (Africa)
AJIF (Nigeria)
American Civil Liberties Union (United States)
Aqualtune Lab (Brasil)
Asociación por los Derechos Civiles (ADC) (Argentina)
Association for Progressive Communications (APC) (Global)
Barracón Digital (Honduras)
Beyond Saving Lives Foundation (Africa)
Big Brother Watch (United Kingdom)
Body & Data (Nepal)
Canadian Civil Liberties Association
CAPÍTULO GUATEMALA DE INTERNET SOCIETY (Guatemala)
Center for Democracy & Technology (United States)
Centre for Free Expression (Canada)
CILIP/ Bürgerrechte & Polizei (Germany)
Código Sur (Centroamerica)
Community NetHUBs Africa
Dangerous Speech Project (United States)
Defending Rights & Dissent (United States)
Demand Progress Education Fund (United States)

Derechos Digitales (Latin America)
Digital Rights Foundation (Pakistan)
Digital Rights Watch (Australia)
DNS Africa Online (Africa)
Electronic Frontier Foundation (United States)
EngageMedia (Asia-Pacific)
Eticas Foundation (Spain)
European Center for Not-for-Profit Law (ECNL) (Europe)
Fight for the Future (United States)
Free Speech Coalition Inc. (FSC) (United States)
Fundación Karisma (Colombia)
Global Forum for Media Development (GFMD) (Belgium)
Global Partners Digital (United Kingdom)
Global Voices (Netherlands)
Hiperderecho (Peru)
Instituto Beta: Internet & Democracia – IBIDEM (Brazil)
Instituto de Referência em Internet e Sociedade - IRIS (Brazil)
Instituto Liberdade Digital - ILD (Brazil)
Instituto Nupef (Brazil)
Internet Governance Project, Georgia Institute of Technology (Global)
Internet Society Panama Chapter
Interpeer Project (Germany)
IP.rec - Law and Technology Research Institute of Recife (Brazil)
IPANDETEC Central America
ISOC Bolivia
ISOC Brazil - Brazilian Chapter of the Internet Society
ISOC Chapter Dominican Republic
ISOC Ghana
ISOC India Hyderabad Chapter
ISOC Paraguay Chapter
ISOC Senegal Chapter
JCA-NET (Japan)
Kijiji Yeetu (Kenya)
LGBT Technology Partnership & Institute (United States)
Liberty (United Kingdom)
mailbox.org (EU/DE)
May First Movement Technology (United States)
National Coalition Against Censorship (United States)
National Working Positive Coalition (United States)
New America's Open Technology Institute (United States)
OhmTel Ltda (Colombia)
OpenMedia (Canada/United States)
Paradigm Initiative (PIN) (Africa)
PDX Privacy (United States)

PEN America (Global)
Privacy International (Global)
PRIVACY LATAM (Argentina)
Progressive Technology Project (United States)
Prostasia Foundation (United States)
R3D: Red en Defensa de los Derechos Digitales (Mexico)
Ranking Digital Rights (United States)
S.T.O.P. - Surveillance Technology Oversight Project (United States)
Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC)
Sero Project (United States)
Simply Secure (United States)
Software Freedom Law Center, India
SWOP Behind Bars (United States)
Tech for Good Asia (Hong Kong)
TEDIC (Paraguay)
Telangana (India)
The DKT Liberty Project (United States)
The Sex Workers Project of the Urban Justice Center (United States)
The Tor Project (Global)
UBUNTEAM (Africa)
US Human Rights Network (United States)
WITNESS (Global)
Woodhull Freedom Foundation (United States)
X-Lab (United States)
Zaina Foundation (Tanzania)
 
Oh God! Don’t just delay it. CANCEL THIS, Apple. Can’t you see… people won’t be ordering the new iPhone 13 if you launch this child safety crap.
YES!

It needs an official cancelling. Apple thinks we're completely stupid ... delay ... launch the iPhone 13, make sales, and boom, launch CSAM scanning.

Not. Good. Enough. CANCEL it.
 
How are they spying on you?

They're running an algorithm that doesn't actually analyze the content of your images; it simply checks whether an image's identifier matches known CSAM.

If an image has the same identifier, a flag is raised.

If enough flags are raised (20), a human moderator checks the images to confirm they are CSAM and alerts the authorities.

Apple has the right (and a moral obligation) to ensure child pornography is not hosted on their services. They could do what every other provider does and just scan every single image on their servers, instead they keep it on device and set strict thresholds to try and protect user privacy.
So it’s actually 30, not 20. They can meet their moral obligation without searching my device for illegal images. They can implement a server-side solution that doesn’t involve my device searching for illegal activity. Rene Ritchie had a whole video on a solution that would still find illegal images without client-side scanning.

The NCMEC reporting mandate essentially deputizes digital communications companies (I work for one and had to train to it), and having my device look for illegal activity is a violation of my privacy.
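The server-side alternative argued for above could be sketched like this (hypothetical; a cryptographic hash stands in for a PhotoDNA-style perceptual hash, and the storage/review lists are placeholders for real infrastructure):

```python
import hashlib

def image_identifier(image_bytes: bytes) -> str:
    # Placeholder for a perceptual hash; SHA-256 only makes the sketch runnable.
    return hashlib.sha256(image_bytes).hexdigest()

def ingest(image, known, stored, review_queue):
    # The device uploads normally; all matching happens on the provider's
    # hardware, against content the provider is already hosting.
    stored.append(image)
    if image_identifier(image) in known:
        review_queue.append(image)  # human confirmation before any report
```

The design point is that the user's device never searches anything; the provider scans only what it hosts.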
 
If I were Craig, I would just implement the local scanning mechanism for anything that is going to Apple's servers, and when a kiddie-porn match is found, the iPhone would isolate the image into a folder/album. Then let the user know that, due to regulations, Apple is not able to host this content, and reject that specific upload to iCloud. The match should neither be logged on the device nor reported to Apple, so no outside party would be notified.

This implementation saves the fire on Apple's butt without snitching on users.

I think most reasonable people, including Snowden, would agree that this is not a backdoor. This is just like, for example, scanning for viruses before uploading an attachment, or scanning for unsupported image formats, or checking if the file is too big to be sent over an email, etc.
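The proposal above, as a minimal sketch (a hypothetical design, not anything Apple announced; the match check reuses a stand-in cryptographic hash in place of a perceptual one):

```python
import hashlib

def image_identifier(image_bytes: bytes) -> str:
    # Stand-in for an on-device perceptual hash; SHA-256 keeps the sketch runnable.
    return hashlib.sha256(image_bytes).hexdigest()

def try_upload(image, known, quarantine, uploaded):
    # On a match: isolate locally, refuse the upload, notify only the user.
    # Nothing is logged and no outside party is contacted.
    if image_identifier(image) in known:
        quarantine.append(image)
        return False
    uploaded.append(image)
    return True
```

In this scheme the check behaves like the attachment virus scan or file-size check described above: a local gate on the upload, not a reporting channel.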
 
Maybe a happy compromise for those in countries enjoying a certain threshold of freedom/privacy would be for Apple to enable this only in countries that regularly abuse freedom/privacy. Those countries could then have end-to-end encrypted cloud storage enabled, meaning their citizens could store all the Winnie-the-Pooh photos or anti-government content they could dream of without fear of retribution.
 
They will end up CANCELLING IT. 🤫

It’s been a hot mess. Even good ol’ Craig (executive) admitted it.


Wrong.

Craig acknowledged ONLY the confusion amongst media and Apple iOS users.

He was STEADFAST in upholding CSAM scanning - watch the interview with Bloomberg.
 
So this will either be quietly released later, or be delayed indefinitely until everyone forgets about it and it's quietly killed.

Or the third option: they launch it in a less free country that will be far less vocal, and beta test it there before rolling it out in the US. I am against this "feature," but the US was always going to have more of a problem with this kind of feature than, say, China or Australia.
It's interesting which characteristics of generations carry over between countries. The millennials of China dump everything on social media just like those in the US, which makes them easy to track. Gen Z is different in both countries, and it is a lot harder to track them by their online profiles; they do not connect their IRL and online lives the same way. And this is because of how interested governments are in tracking this stuff (the US included). I think this is going to make features like this significantly less exciting for governments in the long run.
 
Have to wonder if the delay is to get people to upgrade to a new iPhone or install iOS 15 on existing devices, get them locked in, and then implement it half a year later.
 