For those of you in this thread who are frustrated by Apple's actions, what's next?
Have you already disabled iCloud? Switching to Linux? Buying a PinePhone?

Curious about actions others are taking.
I never used iCloud. But I will not upgrade to iOS 15. My phone is good for another couple of years. Then I will think about a solution. There are some options (someone mentioned CalyxOS) that are reasonably secure. I understand that all companies are trying to get all the information they can (legally or not), but I try to make it hard. I do not use Google or any Google apps, I don't have Gmail, etc.
 
Being struck by lightning is 2 million times more probable.
A single photo isn’t enough.
You missed a lot of pieces of the puzzle.

That's the probability stated by Apple, and they refuse to justify that number.
 
I never used iCloud. But I will not upgrade to iOS 15. My phone is good for another couple of years. Then I will think about a solution. There are some options (someone mentioned CalyxOS) that are reasonably secure. I understand that all companies are trying to get all the information they can (legally or not), but I try to make it hard. I do not use Google or any Google apps, I don't have Gmail, etc.

Sounds like you're ahead of the curve. Are there any cloud/connected services that you do use, or is none of it up to a good standard? e.g. DuckDuckGo vs. Google. Curious what you've found that works.
 
Obviously I have no sympathy for child abusers and couldn't care less what happens to them. I am also suspicious of all big tech companies, but I still think Apple is as close to the "good guys" on this as it is possible to be. Anyone who thinks that any cloud provider is a good place to store illegal stuff, regardless of what it is, is a freaking moron. The original stuff that I read about this said that Apple was doing on-device scanning, but this reads like that is not the case. Are they or aren't they? That's where I would draw the line on whether I'm pissed about this scanning: my device is my device and I don't want local scanning of my content under any circumstances.
 
A step too far. Control control control. Sometimes you just have to let people be people and let the chips fall. The real purveyors of child abuse are not going to be caught out by this. Yes it seems like a noble endeavour but the road to hell is paved with good intentions. Systems like these will 100% be adapted at some time down the line and used to target and apprehend those committing wrongthink.
Never underestimate the stupidity of criminals.
A lot of folks were planning to go nearly all-in on cloud solutions. Now local storage is about to make a huge comeback lol. That incident with the Apple repair team leaking that poor girl's personal photos should've been a wake-up call about Apple's stance on privacy.
This is a sort of pseudoscience, or false equivalency.
The incident to which you refer was isolated. It involved one iPhone repair and one repair tech at Apple's repair partner. There's nothing inherent in Apple's devices, software, or processes that posts people's photos or even routinely allows other humans to look at them without your explicit permission.
It would be like taking that isolated incident of the woman burned by McDonald's coffee and saying, "See, McDonald's wants you all to be hurt by hot coffee."
A series of bad decisions by one person is not indicative of a larger conspiracy.
 
That's helpful, thanks! If you don't mind sharing, what files do you keep there, and how well does it integrate with macOS and iOS?

I really like how Notes and Photos integrate with iCloud. Trying to understand how I can replace that... perhaps Synology has its own Notes and Photos apps on iOS?
I don't know about notes. I just don't use them enough to care. But yes, Synology has applications that can be used for storing photos etc. However, a decent 4-bay NAS is around 700-1000 USD and you need hard drives as well. The total cost of my Ubiquiti/Synology system is probably around 4000-6000 USD. But I'm independent of any service except the ISP, of course. I encrypt all files and always use a VPN. I also host all HomeKit devices on my own setup, so cameras etc. only record to my own drives.
 
I do have to say that after learning more about it, especially in Gruber’s post on Daring Fireball, I feel better about things than I did initially. That being said, I think it’s an incredibly slippery slope and governments around the world will definitely want to take advantage of that.

I imagine them scanning for certain images and memes that suggest you believe certain things, and using that against you. Fortunately, the way it works, they couldn't track down specific new photos you take, just copies of disallowed photos that already exist in a public space. Still pretty lame.

I doubt this system will catch many child predators either, especially after all the public attention this has received. It may catch the less violent creeps who collect images online for their sick collections, but will do nothing to stop someone abusing actual children who has a private collection and is actively doing harm. And that’s the scarier next step: using machine learning AI to scan for certain patterns in your photos. You know that has to be coming next.

I don’t like how there’s no oversight for this either. You could just suddenly have police show up because someone decided you’re a threat, and there would be no way to prove otherwise because they would just say you deleted the evidence. This could be used to silence certain people. We can’t let the pursuit of justice destroy democracy. We are supposed to be protected from unlawful search and seizure when we have committed no crime. But Apple owns these servers so they can do what they want. Unfortunately we can’t, as in we can’t replace iCloud with our own solution. Apple must be forced by their customers to open up these APIs. I have gigabit at home and should be able to use my file server as a remote volume for backing up my photos and files automatically.
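For readers who want the local-backup half of this today: Apple doesn't expose an API to point iCloud Photos at your own server, but for folders you export and manage yourself, a one-way mirror to a NAS share is easy to script. The sketch below is a minimal example, assuming a hypothetical exports folder and a hypothetical mounted NAS path; it is not a stand-in for iCloud Photo Library.

```python
# Minimal one-way mirror of a local photo folder to a NAS share.
# The paths are hypothetical examples; this copies plain files you
# manage yourself and is not a replacement for iCloud Photos.
import shutil
from pathlib import Path

SOURCE = Path.home() / "Pictures" / "Exports"   # hypothetical local export folder
DEST = Path("/Volumes/nas-backup/photos")       # hypothetical mounted NAS share

def mirror(src: Path, dst: Path) -> None:
    """Copy files that are missing on the NAS or newer on the local side."""
    for item in src.rglob("*"):
        if not item.is_file():
            continue
        target = dst / item.relative_to(src)
        target.parent.mkdir(parents=True, exist_ok=True)
        if not target.exists() or item.stat().st_mtime > target.stat().st_mtime:
            shutil.copy2(item, target)  # copy2 preserves timestamps

if __name__ == "__main__":
    mirror(SOURCE, DEST)
```

Run it from cron or launchd for the "automatically" part; anything fancier (versioning, deletions) is where tools like rsync or Synology's own sync apps come in.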
 
To those saying this is similar to other companies scanning for CSAM on their cloud servers:

This is wholly different because the scanning occurs *on your device*. Your iPhone and Mac (they said this will run on iOS and macOS) will be running a payload from Apple which may incriminate you and report you to the police. Think about that.

As an analogy, think of the iPhone as your home and the cloud as the public square. If you do something in the public square (or any public space), you understand that it is subject to public scrutiny. But you have an expectation of privacy for what you say & do in the privacy of your home. Unless the government gets a court order to surveil you based on reasonable suspicion of wrongdoing, the police cannot monitor or surveil the inside of your home. Apple running an algorithm on your phone to scan your files is like the building contractor that built your home boring a peephole in the side of it and letting the police constantly monitor what you do. They say the police are trained to look only for this one thing, and they will not look for anything else. And you shouldn't be worried about this if you're not doing this one thing anyway!

But obviously, if (read: *when*) the government ever decides they want to monitor your home for more than just that, they can simply whisper to the cop peeping into your home to now look for other activity. So now it's CSAM pictures, and obviously & predictably in future it's "terroristic speech," and then it's "domestic terrorism speech" – and then it's "hate speech."

We've seen this over and over and over again in history. Why do you think we're different today? It can and will happen if we let it. This is a total betrayal by Apple, who seemingly up until recently was a champion for security & privacy and suddenly has done a 180 on us.
 
1 in a trillion chance.
Multiple offences needed.
Not looking at numbers is anti-vaxxer grade stuff.

Plus, the image included in the security voucher (unlocked ONLY after multiple offences, basically impossible to happen by accident) is a low-res version of your pic.

Check your facts.
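For anyone who wants to sanity-check the numbers being argued about here: Apple has not published its per-image false-match rate or the exact match threshold, so the figures below are illustrative assumptions only. The sketch just shows, with a plain binomial tail, how a multi-match threshold drives the account-level false-flag probability down.

```python
# Back-of-the-envelope check of how a match threshold suppresses false
# account flags. The values of n, p and the thresholds are illustrative
# guesses, NOT Apple's unpublished parameters.
from math import exp, lgamma, log, log1p

def upper_tail(n: int, t: int, p: float, extra: int = 60) -> float:
    """P(X >= t) for X ~ Binomial(n, p), summed directly from the tail."""
    # pmf at k = t, via log-gamma to avoid enormous binomial coefficients
    log_pmf = (lgamma(n + 1) - lgamma(t + 1) - lgamma(n - t + 1)
               + t * log(p) + (n - t) * log1p(-p))
    pmf = exp(log_pmf)
    total = 0.0
    for k in range(t, min(n, t + extra) + 1):
        total += pmf
        # recurrence: pmf(k+1) = pmf(k) * (n-k)/(k+1) * p/(1-p)
        pmf *= (n - k) / (k + 1) * p / (1 - p)
    return total

if __name__ == "__main__":
    n = 50_000   # photos in the library (assumed)
    p = 1e-6     # per-photo false-match rate (assumed)
    for t in (1, 10, 30):
        print(f"threshold {t:>2}: P(false flag) ~ {upper_tail(n, t, p):.1e}")
```

With these made-up inputs, a single false match is plausible (a few percent) while crossing a threshold of 10 or 30 matches is vanishingly unlikely; whether Apple's real parameters actually yield "1 in a trillion" is exactly the thing they have been asked to justify.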
The fact is, no matter how you sell it, they are snooping on your device. All the rest are details.
 
Your tortured analogy only reinforces the fact that you have no idea what you are angry about, and even less of an idea what you are talking about.

Read the description Apple released. Learn that the issue is much more limited and nuanced than you think it is.
Imagine being such a fanboy that you think a multinational scanning the images on your phone just in case you're a child molester is not worth getting angry about.
 
🤣 Apple can't even competently review their apps with "human review" to ensure they are safe and spyware-free.
A company which thought it would be a good idea to effectively hack into people's phones to put unwanted music on them (remember the U2 album they secretly copied to your phone?) now wants to monitor you just in case you're a pedophile. If this were a movie, it wouldn't be believable.
 
I don't know about notes. I just don't use them enough to care. But yes, Synology has applications that can be used for storing photos etc. However, a decent 4-bay NAS is around 700-1000 USD and you need hard drives as well. The total cost of my Ubiquiti/Synology system is probably around 4000-6000 USD. But I'm independent of any service except the ISP, of course. I encrypt all files and always use a VPN. I also host all HomeKit devices on my own setup, so cameras etc. only record to my own drives.
Got it, so about as trustless as you can get. Congrats, and I hope this tech can be simplified for the rest of us. Also curious about how Tor can be part of the toolchain.
 
People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.
Exactly! People don't do their research. They just impulsively post about Apple, smh.
 
Google has been doing this with Gmail since 2014


No one bats an eye at that

But when Apple does it, NOW everyone gets upset

Why is this?
Probably because everyone expects zero privacy with Google, yet Apple's whole schtick is "privacy is a fundamental right" and "what happens on your phone stays on your phone."
 
Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands…

Yeah, well, the NCMEC (being well connected to the government) may not.

Once the hash is in the database, it becomes merely a question of having an entry (exploit) into the user’s device to check for any matches. The three-letter agencies probably have that already - or will buy it (such as Pegasus).

Then what?
Suppose a non-CSAM picture is injected by the US govt in the NCMEC repository, then what?
Suppose it even escalates to human review inside Apple, then?
Apple’s reviewer would see it’s not kiddie p0rn and discard it.
 
It can and will happen if we let it. This is a total betrayal by Apple, who seemingly up until recently was a champion for security & privacy and suddenly has done a 180 on us.
Right you are. So what are you going to do about it? What can we do to not let it happen? And since it does seem to be happening, what then?
 
Sounds like you're ahead of the curve. Are there any cloud/connected services that you do use, or is none of it up to a good standard? e.g. DuckDuckGo vs. Google. Curious what you've found that works.
I use DuckDuckGo. It's really OK. For mail I use Proton Mail (except for work mail, of course), and Threema as a messenger, although I also have Signal. I have never had Facebook or Instagram, which I consider a gold mine for companies and of zero net value for the user. I do have iCloud, but only for syncing Safari bookmarks between my phones and computers. I also use 1Password and sync it via my own NAS, since I know Apple can decrypt any files on iCloud, and although it's encrypted in two layers I still have doubts. The photos I find questionable (private, gf, etc.) I store only offline.
 
People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.
Please ALSO realize that they are performing the scan on THEIR servers. Apple intends to perform the scan on my device. One is scanning their own equipment (perfectly fine); the other is scanning MY equipment, which is absolutely not fine.
 
Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands…

Yeah, well, the NCMEC (being well connected to the government) may not.

Once the hash is in the database, it becomes merely a question of having an entry (exploit) into the user’s device to check for any matches. The three-letter agencies probably have that already - or will buy it (such as Pegasus).

This slippery slope is going to age as well as people claiming that iPhone 5s users were going to have their fingers chopped off by criminals to unlock their phones.

Could it happen? Not impossible, but extremely unlikely (this is real life, not Mission Impossible), to the point where I don't think it's something anyone should be losing sleep over.
 
Then what?
Suppose a non-CSAM picture is injected by the US govt in the NCMEC repository, then what?
Suppose it even escalates to human review inside Apple, then?
Apple’s reviewer would see it’s not kiddie p0rn and discard it.
Wow, you have put a lot of time into this thread, many, many replies. Curious what's important about this topic for you? How do you want the system to work? Is there something you all want us to do or believe? I'm not getting it, so I figured I would ask.
 
Saddened by this big privacy violation. I feel like I'm in a Black Mirror episode.
I am also resigned to the fact that this is how "investigative work" will be carried out in the future. Detectives will learn SQL and will solve cases with a few clicks from a desk 😂

The final human review is an even worse privacy violation… I hope something like the Siri situation won't happen again, when Apple allowed partners (humans) to listen to private conversations https://www.theguardian.com/technol...-hear-confidential-details-on-siri-recordings

Anyway, all companies are doing this. Let’s accept it and enjoy some extra help in protecting vulnerable children.
 
My simple response to these questions: explain to me how you could have a picture so close to one that has been marked as child pornography without it actually being child pornography. Now explain how you could have more than one picture like this, to the point where it would actually trigger a review from an Apple employee. And then try to imagine this happening one time in a trillion, and comprehend what that number means in real-world terms.

Well, a picture is just a bunch of pixels, and there is some leeway in the hash. Therefore an adult/legal subject could have similar colors, scenes, poses, and other similarities.

Granted, I have NEVER seen an example, nor do I want to. So I'm not sure what types of pictures we are talking about here. That is good ignorance on my part, I guess! Adult subjects could be naked in a bathtub and it might produce a hash that fits within the threshold, for example.

Also, how does the manual review verify the subject is legal? I have seen people in their 20s that look 16. Heck I looked 15 until I was in my 30s!
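To make the "leeway in the hash" idea concrete, here is a toy perceptual-hash sketch. It uses a simple average hash, not Apple's NeuralHash (whose design and collision behavior are different and not fully public); the point is only that small pixel changes keep two images within a small Hamming distance, which is what near-duplicate matching means.

```python
# Toy illustration of perceptual hashing: an average hash maps an image
# to 64 bits, and two images count as a "match" when the Hamming distance
# between their hashes is small. This is NOT Apple's NeuralHash.
from typing import List

def average_hash(pixels: List[List[int]]) -> int:
    """Hash an 8x8 grayscale grid: bit = 1 where the pixel >= the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

if __name__ == "__main__":
    # An 8x8 "image" (a brightness gradient) and a copy with one pixel
    # nudged slightly, as a small edit or recompression artifact might do.
    original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
    tweaked = [row[:] for row in original]
    tweaked[3][7] += 8
    h1, h2 = average_hash(original), average_hash(tweaked)
    print("hamming distance:", hamming(h1, h2))  # small -> still a near match
```

That tolerance is exactly the trade-off being debated upthread: it lets cropped or recompressed copies of a known image match, and it is also why a manual review step exists at all.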
 