For those of you in this thread who are frustrated by Apple's actions, what's next?
Have you already disabled iCloud? Switching to Linux? Buying a PinePhone?

Curious about actions others are taking.
 
Simple question: if I take a photo of my son in the tub and you can see his thingy, will that image get 1. Reviewed by an apple employee, and/or 2. Reported to the police?

No, because - as I understand it - the system is being used to identify existing known CSAM images. It doesn't identify flesh, regardless of how much (or little) there is. Or genitals. Or faces. Or anything. It does not look at the photo. It is a hash - a signature of an existing image known to the authorities. The entire point of the system is to stop sharing KNOWN abusive images. If these perverts create new content which isn't known and identified by the system, it can't stop that until the appropriate authorities have generated a hash for it and the signatures on services that use them have been updated.
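Conceptually, the matching step described above is a set-membership check on image hashes, not image analysis. Here is a minimal, hypothetical sketch of that idea — the real system uses a perceptual hash (NeuralHash) plus cryptographic blinding rather than a plain SHA-256 lookup, and the hash set below is made up for illustration:

```python
import hashlib

# Hypothetical database of hashes of known, already-identified abuse images.
# In the real system these hashes are supplied by NCMEC, not computed from
# anything that originates on the user's device.
known_bad_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_of(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a real perceptual hash would also
    # match slightly altered copies of the same known image.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes) -> bool:
    # The check never asks "does this contain skin / a child / etc." --
    # only "is this a copy of a KNOWN, previously identified image?"
    return hash_of(image_bytes) in known_bad_hashes

# A brand-new photo, whatever it depicts, hashes to a value not in the set.
print(matches_known_image(b"new family photo"))  # False
```

This is why a newly taken photo cannot trigger a match: its hash has never been entered into any database.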
 
Quarterly-profit-chasing Tim Cook and Apple's crisis management team are worried that the share price will tank when trading opens today.





This pathetic damage-control FAQ by the clueless crisis department is all because Apple massively underestimated the backlash from people who didn't expect Apple to curb-stomp their privacy with this BS "protect the children" Trojan horse. Sadly, we accept it as the norm for Facebook, Alphabet, and Amazon to rape our privacy and data-mine every byte of our existence, but not Apple.



My main concern, after the complete destruction of privacy, is that this system isn't foolproof. One example: go to Bing and search for "kinki kids", a multi-platinum-selling musical duo from Japan, named for the Kinki (近畿地方) region in south-central Japan. Bing's search AI assumes the user misspelt "kinki" as "kinky" and is therefore searching for "kinky kids", and you get a child-abuse warning under the search bar. If the 'geniuses' at Microsoft who pioneered the AI behind CSAM detection cannot implement it on their own platform without obvious, dumb false positives, I guarantee scanning trillions of iCloud photos will flag innocent pictures as abuse.

I cannot wait for the false-positive news story (the 1-in-a-trillion figure is a made-up, unaudited number meant to make the system appear perfect) of Bubba getting a no-knock armed raid by the FBI at 3 am because Skynet AI false-flagged his collection of pictures of his petite Filipina wife.



Also, what is stopping power-hungry authoritarians like the CCP from abusing this to flag and arrest people with pictures of Winnie-the-Pooh Xi memes, pro-democracy literature, Tank Man, pro-Taiwan or Hong Kong independence material, Uyghurs in camps, etc., all under the pretence of protecting national security? "Protecting the children" is the perfect catch-all Trojan horse (if you are against it, you must be a sick monster with something to hide, right?) for seeding 'extra' functionality at the flick of a switch.


This is a complete PR disaster by Apple, who are destroying their customers' trust and brand value as a "privacy focused" company.
More likely answer..."people are stupid and can't read basic articles, so let's come out with an even dumber FAQ to help them...and then they'll not read that as well...."
 
For the first time, Apple are supporting iOS 14 and iOS 15 simultaneously, and users can decide to stay on version 14 without being pressured to upgrade. I had wondered what might appear in iOS 15 that would make people avoid upgrading. I now have my answer.

This can and will be expanded to scan for much more and Apple will simply cower and say "we can't fight government demands".
 
Why do Apple think their customers might be child molesters, and why do Apple think they are some sort of police unit that needs to scan people's phones? It's insane, an extraordinary abuse of power, and it shows how out of touch with reality this company is.

I don’t think it’s so much that Apple thinks a sizeable portion of their users are child molesters; it’s more to show law enforcement that their users aren’t.


Apparently, the main reason why we still don’t have encrypted iCloud storage is because the FBI “requested” that Apple hold off on such a feature, presumably because it would stymie their own investigation efforts.

My current theory is that Apple wants to eventually offer encrypted cloud storage as a feature, and the only way they can get law enforcement off their backs is to show them that, at the very least, the images iPhone users are uploading to iCloud do not contain child pornography. And because end-to-end encryption would make the files unscannable once uploaded, the next best thing is to scan them prior to upload.

It’s a reasonable trade-off, IMO, if this is indeed what Apple is planning, and it would help differentiate them from competitors, who either cannot do a similar thing because they do not control the underlying hardware, or cannot carry it out in a manner that would not degrade the user experience.
 
Anyone who believes Apple will stick to their script of only scanning what they said they will scan needs 'FOOL' tattooed on their forehead, because it is a well-known fact that tech companies lie, not only to consumers but to governments. Google, Facebook, Apple, Microsoft, and Amazon have all been caught lying at one time or another, sometimes on numerous occasions. It's in a company's nature to tell the consumer they are doing one thing when in fact they are doing the complete opposite.

A few months, or even a year or years from now, iCloud users will start to complain that their stored pictures have been erased, and Apple will deny it's their fault. Then some security researchers will get involved, and they will report that Apple's CSAM scanning has quietly been scanning everyone's iCloud account and accidentally erasing users' pictures due to a glitch in the scanning process. Then people will say, 'Hold on, Apple, you said you weren't going to do this type of invasive scanning on users' iCloud accounts.' Apple will then issue a PR release saying, 'Sorry, we will update our procedures so something like this does not happen again.'

It's going to happen; you just know it will.

It’s becoming tiring to read this kind of pointless “stuff could happen” speculation.
Companies have a track record.
Actions have consequences.
Use your brain to infer the actual chance of specific stuff happening.
Just saying “stuff could happen” is a copout.
A copout for sloppy slippery-slopers.
 
If you need to ask a question like that, I cannot convince you otherwise in only a few words. Go and live a bit more, understand the world, understand some history, some context, and think about it, even if it takes a while.
Ah-ha. So your baseless assumption is fact?

I will take you at your word and go visit Expedia for some cheap flights. I pray it will help me understand the world more and place me on a higher pedestal.
 
Simple question: if I take a photo of my son in the tub and you can see his thingy, will that image get 1. Reviewed by an apple employee, and/or 2. Reported to the police?
Assuming Apple has implemented the solution as advertised, without critical software issues, and in accordance with the released technical papers, the probability is low for both 1 and 2. Low, but not zero.
 
For the first time, Apple are supporting iOS 14 and iOS 15 simultaneously, and users can decide to stay on version 14 without being pressured to upgrade. I had wondered what might appear in iOS 15 that would make people avoid upgrading. I now have my answer.

This can and will be expanded to scan for much more and Apple will simply cower and say "we can't fight government demands".
Wrong again...do people even read anymore?

The only thing they introduced was the ability to get the updated ad-privacy and encryption enhancements created for iOS 15 on iOS 14, without having to update to iOS 15 for those very important upgrades. It has nothing to do with giving people the ability to "avoid" this. Hah.
 
Judging by your comments, you obviously support all kinds of restrictions and invasions in the name of a false "greater good". I do not think we can find common ground here. I do not support CCTV everywhere, as it is very often used with malicious intent; speed cameras don't bother me, as they take a photo only if you speed, though I also find them somewhat annoying. Law enforcement is a completely different level of discussion...
Same for the Apple CSAM-matching system.
Security vouchers are inaccessible to any human in the universe until you "speed" a number of times. They may as well not exist.
 
Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands…

Yeah, well, NCMEC (being well connected to the government) may not.

Once a hash is in the database, it becomes merely a question of having an entry point (an exploit) into the user’s device to check for matches. The three-letter agencies probably have that already, or will buy it (see Pegasus).
 
Assuming Apple has implemented the solution as advertised, without critical software issues, and in accordance with the released technical papers, the probability is low for both 1 and 2. Low, but not zero.
Chance of being hit by a meteor
Although no human is known to have been killed directly by an impact, over 1000 people were injured by the Chelyabinsk meteor airburst event over Russia in 2013. In 2005 it was estimated that the chance of a single person born today dying due to an impact is around 1 in 200,000.

You'd better stay in your house, then, since you are 5 million times more likely to die from a meteor impact than to upload multiple innocent pics and have them incorrectly hashed as known child-porn images.
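Taking both quoted figures at face value (they measure different kinds of odds, so this is only a sanity check of the ratio, not a rigorous comparison), the arithmetic works out:

```python
# Lifetime odds of dying from a meteor impact, as quoted above.
p_meteor_death = 1 / 200_000

# Apple's claimed per-account false-flag rate.
p_false_flag = 1 / 1_000_000_000_000

# How many times more likely is the meteor death?
ratio = p_meteor_death / p_false_flag
print(f"{ratio:,.0f}")  # 5,000,000
```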
 
People do realise that companies such as Google, Adobe, Facebook, et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.
That isn't the point. The point is that Apple acted 'holier than thou' in comparison to all these other companies while lying the whole time. It comes off far worse when you attack your competitors to make them look bad while not even being honest yourself.
 
Simple question: if I take a photo of my son in the tub and you can see his thingy, will that image get 1. Reviewed by an apple employee, and/or 2. Reported to the police?

Neither, unless that image has been flagged by law enforcement as child pornography and resides in the NCMEC database, in which case you have worse problems on your hands than simply being paid a friendly visit by the police.
 
Took me 3 days to copy all my photos and videos from iCloud, I just started deleting my library about 10 minutes ago. Apple was the last company I trusted not to violate my privacy. I’m already saving money because this stopped me from buying that ridiculously priced keyboard with touch ID for my mini m1. I ain’t paying $179 for no keyboard and $129 for a trackpad from no company that violates my privacy. This $39 wired keyboard and mouse combo been working fine.
 
Took me 3 days to copy all my photos and videos from iCloud, I just started deleting my library about 10 minutes ago. Apple was the last company I trusted not to violate my privacy. I’m already saving money because this stopped me from buying that ridiculously priced keyboard with touch ID for my mini m1. I ain’t paying $179 for no keyboard and $129 for a trackpad from no company that violates my privacy. This $39 wired keyboard and mouse combo been working fine.
A lot of folks were planning to go nearly "all-in" on cloud solutions. Now local storage is about to make a huge comeback, lol. That incident with the Apple repair team leaking that poor girl's personal photos should have been a wake-up call about Apple's stance on privacy.
 
Took me 3 days to copy all my photos and videos from iCloud, I just started deleting my library about 10 minutes ago. Apple was the last company I trusted not to violate my privacy.
Congrats! I appreciate this sentiment. In retrospect, it seems trustless, E2E-encrypted systems are the only way. This explains why Apple never implemented E2E iCloud: now we know it was never a possibility to begin with.
 
[...]

Thus, this system effectively is a backdoor on your personal device capable of searching anything on it without probable cause, a warrant or any suspicion whatsoever. This is a type of mass surveillance that stretches even beyond the imagination of the worst dictators in history.
That may be your opinion, but it isn't a correct one. There are on-device hashes of known images; unless you have iCloud Photos turned on and the hashes of your photos match entries in that table, nothing happens, and even then a threshold number of matches must be reached before anything is flagged.

As we found out, Google is already doing this, so I'm not shocked that Apple is. But no, your statement is not factual; it's hyperbolic.
 
A lot of folks were planning to go nearly "all-in" on cloud solutions. Now local storage is about to make a huge comeback, lol. That incident with the Apple repair team leaking that poor girl's personal photos should have been a wake-up call about Apple's stance on privacy.

Self-hosted cloud storage is an interesting option. Storing Notes and Photos in the cloud isn't rocket science, but existing solutions aren't great. From poking around, NextCloud looks promising, especially if services offered turnkey setups.
 
I suppose you're the type that thinks reduced-sugar groceries are an awful idea...
Bad comparison.

It's more like this:

You buy food from the grocery store, and then they spy on you to make sure you're not secretly eating McDonalds.

(1) Why is the grocery store spying on you?
(2) Why do they think it's their job to see if you're eating McDonalds?
 