I wonder how many people on this thread slamming Apple about privacy 1) have an email address that ends in @gmail.com, 2) are viewing this page in Google Chrome instead of Brave, Firefox, or Firefox Focus, 3) have Nest and Ring equipment installed in their homes, 4) have Alexa or Google Home devices installed in their homes, 5) don't use a VPN, and 6) post their whole life on Facebook, Instagram, Twitter, etc.
 
No, I don’t think they will. Here’s what I think the problem is.

Google, Facebook, Amazon and Microsoft all scan for CSAM on their servers because they own the infrastructure that their services run on.

Apple runs iCloud on third party cloud services that they pay usage fees for. The cost of scanning every single picture that goes to iCloud would be eye-watering. So what I think Apple has done is pass the processing cost onto the customers by running the scanning service on device. So Apple customers are picking up the tab for this in terms of battery life and processor usage.

Now all they have to do is convince everyone that this is more private.

At the moment, it doesn’t appear to be sticking.
Very insightful argument! It certainly could be a bullet in their list of objectives.
 
I wonder how many people on this thread slamming Apple about privacy 1) have an email address that ends in @gmail.com, 2) are viewing this page in Google Chrome instead of Brave, Firefox, or Firefox Focus, 3) have Nest and Ring equipment installed in their homes, and 4) have Alexa or Google Home devices installed in their homes?
You are correct. I think the issue most people have, though, is that Apple based its business model on privacy and was starting to walk it back, whereas Google and Amazon never cared to begin with, so people don't give them as much grief lol.
 
I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.

And literally an hour later, Apple takes down a Quran app in China.

 
Rhetorical analysis 101: appeal to logos, ethos, and pathos. Where do you get these writers, MR? Who are these researchers? Every good writer knows you need to identify your sources; otherwise your argument lacks credibility lol. Even the NYT article you cite does not say who the researchers are. Why is it now acceptable for "journalists" to cite each other without actually citing a real, credible source? Surely they exist.
 
Rhetorical analysis 101: appeal to logos, ethos, and pathos. Where do you get these writers, MR? Who are these researchers? Every good writer knows you need to identify your sources; otherwise your argument lacks credibility lol. Even the NYT article you cite does not say who the researchers are. Why is it now acceptable for "journalists" to cite each other without actually citing a real, credible source? Surely they exist.

Not only the NYT and MacRumors: most articles reporting on this fail to link the actual paper.
 
I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
I literally just saw a story earlier today about Apple taking down a Quran app in China because the authorities demanded it. Does anyone think Apple wouldn't cave in and give China the keys to the castle over there the moment the CCP threatened to ban them from the country?
 
Here is the thing people need to realize: if a government wants it, Apple has two choices, implement it or leave the market. Would Apple still be around if it couldn't sell in any country?
 
This is an opt-in feature that parents can enable or disable. So, your analogy is not analogous.

A correct analogy would be a speedometer app that tells you if you are driving over 30 km/h, runs on your phone, does not share your speed with the government or Apple, and can be disabled if you don't want it.

You might still think that’s a bad idea, but don’t misrepresent what it is.
You're the one confusing things. You're talking about message scanning, not iCloud. 🙄🙄
 
I wonder how many people on this thread slamming Apple about privacy 1) have an email address that ends in @gmail.com, 2) are viewing this page in Google Chrome instead of Brave, Firefox, or Firefox Focus, 3) have Nest and Ring equipment installed in their homes, 4) have Alexa or Google Home devices installed in their homes, 5) don't use a VPN, and 6) post their whole life on Facebook, Instagram, Twitter, etc.
I don’t have any of those, other than a throwaway Gmail for a few forums.
 
Hash scanning was already in the ecosystem. Hash scanning is a core technology in computing and has been for at least 50 years. So worrying about the implications of hash scanning in 2021 is patently ridiculous. It's nothing new and governments/corporations have had the ability to use it for good/bad for half a century or more.

As for the "on device" part, the files that go to the cloud don't change so it doesn't actually matter from a legal or rights perspective. If you're an iCloud user, you first have to agree to Apple's terms of service. Those terms include specific language about Apple reserving the right to scan files that are intended for the cloud. Next, you would then choose which applications would be using iCloud for file storage. Once you make those choices, all files from that application will be scanned per the user agreement with Apple. It doesn't matter whether they're on device or on a server. The exact same files will be scanned.

I went looking and was unable to find specific language saying Apple has the right to scan, on my device, files intended for iCloud. Sure, you could potentially read that into it via interpretation, but the specific verbiage? Could not find it.
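To make the hash-scanning idea discussed above concrete: matching files against a database of known digests is a decades-old technique, and in its plainest cryptographic form it can be sketched in a few lines. This is an illustrative toy using made-up file contents, not Apple's NeuralHash:

```python
import hashlib

# Hypothetical database of known-bad SHA-256 digests (illustrative only).
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-file").hexdigest(),
}

def matches_known_hash(data: bytes) -> bool:
    """Return True if the file's digest appears in the known-hash set."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

# An exact copy matches; changing even one byte changes the digest entirely.
assert matches_known_hash(b"example-known-file")
assert not matches_known_hash(b"example-known-filE")
```

Note the limitation this exposes: a cryptographic hash only catches exact copies, which is why systems like Apple's use perceptual hashing instead.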
 
One of the arguments put forward by some members is that CSAM scanning on one's phone will not stop child pornography. What these people fail to understand, in my opinion, is that if CSAM scanning catches some people, some of those caught will have a history of looking at such illegal images, which means the police will be able to investigate that history: where else the person went to look at images, the names of people who helped make them. You catch the small players to get to the big players, and I feel that is what will happen with CSAM scanning. Those who are caught will eventually lead the police to the big players who are creating the material.

The police apply the same principle to the drug trade: go after the dealers to get to the producers. Remove the producers and you remove the problem. Same with child pornography: go after the viewers to get to the creators. Remove the creators and you remove the problem.

While contextually you are correct, it is the method being used that has been called into question, from a privacy, ethical, and potentially legal standpoint.
 
I think it's likely that Apple may just shelve the idea (for now). The possibility of a state-sponsored hacker using the system to look for political content is why Apple may not implement it, unless the technology improves.
 
I still think the CSAM angle here is partially a smokescreen to have an acceptable narrative to start scanning user content on device...

...to ultimately go full media/content DRM and subscription license verification on device.

Wouldn’t shock me a bit.

Tim’s whole thing has been transitioning the entire company to subscriptions for everything, and media companies would LOVE draconian control over what you see...how...where...for how much and for how long.

They don’t want customers so much as money slaves on auto-billing
 
So another point here is that the solution is not only privacy-invasive, but also ineffective.
After all, no person in his or her right mind will upload illegal pictures to cloud services. Supporters of the CSAM plan tend to ignore this.
The thing is, people are stupid. Even if one pedophile is smart and doesn't turn on iCloud, or alters photos to try to avoid being flagged, the people they associate with (other pedophiles) aren't always as careful. On a large scale, people will get caught with the illegal photos. That will start an investigation into where they came from, and their photos will be added to the CSAM database, which in turn can flag others.

So while this can't catch everyone, it can definitely make it harder to use and share those images. Also keep in mind that another thing Apple is likely trying to address is that they do not want these photos on their servers, and it is almost certain those photos do exist there. I'd bet you that if this were not an issue on their servers, they wouldn't be bothering with this feature.

I'd be interested to see the actual research data from these organizations. I'm willing to wager it falls short of pointing out that Apple's system uses humans to validate actual CSAM, and that the organizations are driven by agendas. Once these people who don't like a thing get the data, and it doesn't always support their beliefs, they'll still publish what they want, to make it sound like they are covering all the areas of concern. Again, I'd like to see their data. I'm certain they have left a lot of it out (my opinion).
 
That old chestnut. If you have nothing to hide, you have nothing to fear.

By the same logic, if you have nothing to say then you won't mind if your Government bans free speech, right?
It is even worse than this. The law in the US is no longer the law. If you have the correct political ideology, then you can break the law whenever you want. If you have the wrong political ideology, then you cannot even walk in public without being arrested and thrown in jail.

Any more US government authoritarianism from this point on is about politics and control, not the law. If you are foolish enough to believe otherwise then you're going to be real disappointed in another 10 years or so, when a civil war starts and you get everything taken away from you; family, job, food, phones, etc.

The only way an authoritarian government becomes less authoritarian is via war, violence, and bloodshed.

Unless of course, people start waking up early and push back.
 
No, it isn't new. Apple's terms of service for iCloud always specified that it should not be used for illegal activity and that Apple reserved the right to scan files as a result. Major cloud services have the same terms. The user agrees to Apple's terms of service and then personally chooses which apps have files backed up in the cloud. There's no doubt about the user "opting in".
Actually, yes it is new. It's not a pixel-for-pixel hash match. Look it up: edited versions of an image still match.
 
Because it doesn't do anything to catch anyone harming children....

(and if you don't understand why, feel free to read one of the dozen or so articles in the NYTimes or WaPo or one of the many research papers)
I believe you are incorrect, and that it will help reduce the amount of illegal images going around and will also catch some of the less intelligent people who do this and don't understand how technology works. This will in turn possibly catch other people. It will not eliminate the problem, any more than a speed limit sign prevents people from going over it or tax laws prevent people from breaking them. However, it is a system that, if run honestly, will be effective. Your fear and paranoia are seeking out any article that supports your belief. Take some time and look at how the system Apple proposes works. Once you have that, then your only argument should be whether or not you trust Apple to do exactly what they say they will do. My guess is it will be no, but then that is where the discussion should be.
 
No one is surprised.

Worst idea Apple ever had. A total train wreck beginning to end. The PR nightmare is laughable.

Tim knows supply chain. Good thing he doesn't have to sell it to the public!
 
You are correct. I think the issue most people have, though, is that Apple based its business model on privacy and was starting to walk it back, whereas Google and Amazon never cared to begin with, so people don't give them as much grief lol.

Yes and no.
While this is about privacy, the other players are not looking to scan on your device and report their findings to the authorities.
Big difference when you look at intent.
 
I believe you are incorrect, and that it will help reduce the amount of illegal images going around and will also catch some of the less intelligent people who do this and don't understand how technology works. This will in turn possibly catch other people. It will not eliminate the problem, any more than a speed limit sign prevents people from going over it or tax laws prevent people from breaking them. However, it is a system that, if run honestly, will be effective. Your fear and paranoia are seeking out any article that supports your belief. Take some time and look at how the system Apple proposes works. Once you have that, then your only argument should be whether or not you trust Apple to do exactly what they say they will do. My guess is it will be no, but then that is where the discussion should be.

It isn't Apple doing or not doing this that concerns many of us. It is whether or not state actors will give Apple a choice. At least here in the US, the government cannot force Apple to build this; it can, however, legally repurpose functionality already built. I do not know the legal status of a published design.
 
Apple has repeatedly used the constitutional argument with the US government: you can't force us to write new code and add it to iOS to provide a backdoor around end-to-end encryption. Once the code is written and included in iOS, Apple has lost control over what images a government might dictate it must scan for. The only solution now is to never ship the code, and Apple may lose the argument against a future government order compelling it to install the feature, because the code already exists and Apple has demonstrated the technology integrated into iOS.
 
A lot of confusion/scaremongering here: they are not scanning phones, they are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide? Why would you object when it helps children?
I think the experts know what's going on and aren't confused; it seems you might be confused yourself.
 