This is an opt-in feature that parents can enable or disable, so your analogy doesn't hold.

A correct analogy would be a speedometer app that tells you if you are driving over 30 km/h, runs on your phone, does not share your speed with the government or Apple, and can be disabled if you don’t want it.

You might still think that’s a bad idea, but don’t misrepresent what it is.
Good. Wait until governments pressure Apple to make that feature mandatory, just like they enforce speed limits. Except this would be way worse. If you still fail to understand how dangerous this is, there is no hope.

Enjoy taking things at face value, because that’s what governments always want us to do, no matter what.
 
I still think the CSAM angle here is partially a smokescreen to have an acceptable narrative to start scanning user content on device...

...to ultimately go full media/content DRM and subscription license verification on device.

Wouldn’t shock me a bit.

Tim’s whole thing has been transitioning the entire company to subscriptions for everything, and media companies would LOVE draconian control over what you see...how...where...for how much and for how long.

They don’t want customers so much as money slaves on auto-billing
Heck, I am fully confident that every single corporation, company, and government just wants to milk people to death and collect free money while providing nothing in return.

Right now I’m already imagining a draconian and grim future, and some of you here will simply call me paranoid. Wait three years and we will see who is right, though expanding on that here is off topic.
 
How can you even use the CSAM detection system for this, unless the people the government is after share some iconic picture in their Photo Library?

Are people engaging in such illegal activities really stupid enough to store it on their phones?

If you were into organised crime, what kind of picture would you store in your Photo Library that the government has a copy of and that people not into organised crime don't have?

I can't understand what kind of picture it would be. And how would the government get a copy of the picture to put it in a database?
 
I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.

What kind of pictures do people in organised crime store on their devices and share only with other people in organised crime that would identify them?
How would the government obtain a copy of it?
 
No, they’re not. Cloud providers only scan files when they hit their servers. Apple is scanning the files on your phone.

It doesn't matter to me if photos are scanned locally or on Apple's server since every photo is going up to Apple's servers anyway.

In fact, I have more data in the cloud than on my phone, so scanning on my phone would result in less scanning.
 
Critics: "This will expand way beyond CSAM. They'll start with terrorism and other crimes."
Supporters: "No. Apple cares. It's about the children."
Europe: "Hold that thought."

Apple hasn't even implemented this yet and it's already starting to expand beyond CSAM like wildfire.

How would they even use the CSAM Detection System to root out terrorism?
It would only work if terrorists shared meme photos, and they would soon learn not to do that.

The system can't be used to detect a category of photos, only a specific set of photos and close derivatives of them.
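Roughly, that matching works like this (a toy Python sketch with made-up hash values and threshold, not Apple's actual NeuralHash pipeline): an image only gets flagged if its hash lands within a few bits of one already on the known list, so a brand-new photo of the same kind of content never matches.

```python
# Sketch: matching an image hash against a fixed set of known hashes.
# The hash values and threshold below are made up for illustration.

KNOWN_HASHES = {0x3F2A91C47D08B6E5, 0x91D4E67A0B3C5F12}  # placeholder 64-bit "known image" hashes
MATCH_THRESHOLD = 4  # max number of differing bits still counted as the same image

def hamming(a: int, b: int) -> int:
    """Number of bit positions where the two hashes differ."""
    return bin(a ^ b).count("1")

def is_known(image_hash: int) -> bool:
    """True only if the hash is a near-duplicate of one already on the list."""
    return any(hamming(image_hash, known) <= MATCH_THRESHOLD for known in KNOWN_HASHES)

print(is_known(0x3F2A91C47D08B6E7))  # True: one bit away from a listed hash (a close derivative)
print(is_known(0xA5A5A5A5A5A5A5A5))  # False: an unrelated image, even with similar subject matter
```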
 
Apple is not getting one more cent from my family and me until this crap is officially ended.

We have supported and used Apple products for 30+ years. We’ve never had a reason to completely abandon them until now and the only way they will listen is if we speak with our wallet.

We’ve already canceled all services we had including Apple TV+, iCloud, Apple Arcade, and Apple Card. I’ve gotten rid of both our Apple TVs. We went with Sony XM4 headphones instead of AirPods Max.

My son has already jumped the Apple ship completely and my wife and I are not far behind. I will not purchase any new Apple hardware. If we absolutely need to replace our iPhones, iPads or MacBook, it will either be used or from outside the Apple ecosystem.

Tim Apple and co have lost the f-n plot. Not one more cent.
You’re not going to be able to escape this in the long run unless you roll your own phone and ecosystem.
 
What if you buy a car and then the car company installs a device to alert the police if you are speeding, or randomly searches your car for drugs/alcohol and notifies the police if they find something? They are then just an arm of the police without the restraints. After all, this is in the name of safety.
What about after buying a house: would you be okay if the builder randomly inspected it to check for anything illegal? What if they discovered you have been smoking in the house and you have children? Call the police in the name of child safety.

I would have no problem with buying such a car if it reported driving under the influence or smuggling.

I would oppose it if it reported speeding.

The reasoning is very clear: I speed almost every time I drive, but I have never driven under the influence or smuggled.
 
I can understand how you could think that a casual forum user was misinformed. But you think that a dozen prominent cybersecurity experts in an independent study, not to mention privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company are all also wrong? Please expand - do you think all of these separate entities are uneducated or have something to gain?

I think a lot of them don't understand the CSAM Detection system since it would be quite bad at detecting what they fear it will detect.

Let's say they want to catch people doing human trafficking. What kind of pictures are shared among human traffickers that would let you find them?

The CSAM Detection System can't be used to find child pornography in general so how could it be used to find photos of human trafficking or drug smuggling?
 
Why are we considering CSAM risk differently than, say, anti-malware or other agents on a device that specifically scan all (in-scope) files for signature-based identification? Who is to say or know that GAs haven't already used the vector used in CSAM for surveillance?
 
According to the researchers, documents released by the European Union suggest that the bloc's governing body are seeking a similar program that would scan encrypted phones for both child sexual abuse as well as signs of organized crime and terrorist-related imagery.
Well, there you go. Where's that pinky promise Craig?
It's obvious this will happen when the system is in place, because why not?
 
A lot of confusion/scaremongering here: they are not scanning phones. They are looking at photos that are on Apple's servers. How can anyone object to this unless you have something to hide? Why would you object when it helps children?
The database of hashes is baked into iOS15. It's on-device local scanning. Go read Apple's own documents.
 
What Apple can actually do depends not only on the EULA, but also on the jurisdiction.

For example, in the EU Apple would need to receive explicit opt-in consent to perform these scans. Since such scanning is not necessary for the functional aspects of iCloud, Apple would also be prevented from tying consent to access to iCloud itself, meaning that Apple would have to allow iCloud access to non-consenting users without detriment.

Apple created this system because of veiled threats from (a) Republican senator(s).

If Europe doesn't want this system, I don't think Apple would force it. GDPR probably forbids it right now, but EU parliament just voted this summer to make an exception for child pornography scanning.
 
Except the Apple version isn't working with pixel/byte-based hash values. The hash value is generated from features extracted by a neural network. This can be a simple feature detector, but also an object detector. These features could also be based on semantic image content. Bit of a difference there to good old hashing.

No, the CSAM detection system doesn't recognise objects or what kind of picture it is scanning. The CSAM detection system can't find child pornography at all. The neural network wasn't trained on CP pictures at all.

Which is exactly why it is such a poor system for detecting photos in a specific category based on content.
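To illustrate what "not pixel/byte-based hashing" means in practice, here is a toy Python contrast (the "average hash" below is a deliberately crude stand-in, nothing like NeuralHash or PhotoDNA): a byte-level hash changes completely when a single bit changes, while a content-derived hash stays stable, which is what lets a system match close derivatives of known images without understanding what is in the picture.

```python
import hashlib
import numpy as np

# Toy contrast between a byte-level hash and a content-derived hash.
# The "average hash" is a deliberately simple illustration, not Apple's algorithm.
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # fake 64x64 grayscale image
edited = img.copy()
edited[0, 0] ^= 1                                          # flip one low-order bit of one pixel

def byte_hash(a: np.ndarray) -> str:
    """Cryptographic hash of the raw bytes: any change gives an unrelated digest."""
    return hashlib.sha256(a.tobytes()).hexdigest()

def average_hash(a: np.ndarray) -> int:
    """64-bit hash derived from image content: is each 8x8 block brighter than average?"""
    blocks = a.reshape(8, 8, 8, 8).mean(axis=(1, 3))        # mean brightness per block
    bits = (blocks > blocks.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

print(byte_hash(img) == byte_hash(edited))        # False: a one-bit edit changes the whole digest
print(average_hash(img) == average_hash(edited))  # almost certainly True: the content barely changed
```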
 
Nope: that only works for data processing which is deemed necessary for the service being provided. If the data processing is not necessary for the service, it needs consent via explicit opt-in and said consent needs to be freely given. The EU does not consider said opt-in consent "freely given" if not providing the consent prevents the user from using the service.

Note that consent would need to be specific: if Apple has the consent e.g. to process the data to perform backups, it does not mean they can use said consent for a different reason.

So, no fear of this ever being implemented legally in EU then?
 
No, I don’t think they will. Here’s what I think the problem is.

Google, Facebook, Amazon and Microsoft all scan for CSAM on their servers because they own the infrastructure that their services run on.

Apple runs iCloud on third party cloud services that they pay usage fees for. The cost of scanning every single picture that goes to iCloud would be eye-watering. So what I think Apple has done is pass the processing cost onto the customers by running the scanning service on device. So Apple customers are picking up the tab for this in terms of battery life and processor usage.

Now all they have to do is convince everyone that this is more private.

At the moment, it doesn’t appear to be sticking.
I'm sure the pennies Apple would have to pay to scan cloud side will put them out of business 🙄🙄
 
It isn’t what Apple is or isn’t doing that concerns many of us; it is whether or not state actors will give Apple a choice. At least here in the US the government cannot force Apple to build this; they can, however, legally repurpose functionality already built. I do not know the legal aspect of a published design.
Then that should be your argument. Since you, and admittedly also I, do not know all the laws around this, our time should be spent educating ourselves on what those laws are. One thing I feel comfortable with (my opinion) is that if Apple were pushed to do something they felt was morally wrong, they would say no publicly. Even if they were gagged by the government, that doesn’t mean they have to abide by what a government tells them to do. They can still say no. Companies tell governments no all the time. Apple is large enough to get away with it as well. The government could push them out of the country, which in itself would cause a lot of suspicion if it were to happen.
 
So, Apple has caved and taken a Koran app down at the request of the Chinese government. Why should we trust them not to cave and start adding additional privacy invasions to iOS now?


Apple really have completely and utterly destroyed the trust people had in them to protect their privacy.
 
No, the CSAM detection system doesn't recognise objects or what kind of picture it is scanning. The CSAM detection system can't find child pornography at all. The neural network wasn't trained on CP pictures at all.
Doesn't need to be trained on CP pictures, just needs to work on similar features. https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

I'm not going to go over all this again, have done so on past threads. Probably worth it to read the leading papers on adversarial attacks that have been published.
 
Scanning isn't new. The way Apple's implementation of scanning/detection works is new. One can clearly see this when reading Apple's technical summary document, where they describe how the technology works. Not in full detail, but deep enough to understand it's new. If you don't agree that it's new, please let us know what other provider (cloud service or not) is using the same underlying algorithms/technology as Apple to scan for CSAM.

Microsoft PhotoDNA uses similar algorithms. This software is used by many others as well, since Microsoft donated it.

Although Apple's implementation is their own, the types of algorithms they are using weren't invented by them.

Convolutional neural networks have decades of history: https://en.wikipedia.org/wiki/Convolutional_neural_network#History

Locality sensitive hashing can be traced back to the late nineties, but a lot of work happened 2008-2012 in particular: https://en.wikipedia.org/wiki/Locality-sensitive_hashing

Hyperplane LSH, which is the type of LSH algorithm Apple is using, goes back about 15 if not 20 years. Here is a Dutch paper from 2017 on its effectiveness in solving the near-neighbour problem: https://drops.dagstuhl.de/opus/volltexte/2017/8092/pdf/LIPIcs-MFCS-2017-7.pdf

What Apple did was innovation: taking known algorithms and technologies from others and putting them to a semi-new use.
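For the curious, here is a minimal Python sketch of the random-hyperplane LSH idea mentioned above (generic SimHash-style code with made-up dimensions and vectors, not Apple's actual parameters or network): each random hyperplane contributes one bit of the hash, so nearby feature vectors end up with hashes that differ in only a few bits.

```python
import numpy as np

# Minimal sketch of random-hyperplane LSH (SimHash-style).
# Dimensions, bit count and vectors are illustrative only.
rng = np.random.default_rng(0)
DIM, BITS = 128, 64
hyperplanes = rng.standard_normal((BITS, DIM))   # one random hyperplane per output bit

def lsh_hash(feature_vec: np.ndarray) -> int:
    """Each bit records which side of one random hyperplane the feature vector falls on."""
    bits = hyperplanes @ feature_vec > 0
    return int("".join("1" if b else "0" for b in bits), 2)

# Two nearby embeddings (e.g. the same image, lightly edited) fall on the same side
# of most hyperplanes, so their hashes differ in only a few bits.
v = rng.standard_normal(DIM)
v_near = v + 0.05 * rng.standard_normal(DIM)
distance = bin(lsh_hash(v) ^ lsh_hash(v_near)).count("1")
print(distance)  # typically a small number of differing bits out of 64
```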
 
Why are we considering CSAM risk differently than, say, anti-malware or other agents on a device that specifically scan all (in-scope) files for signature-based identification? Who is to say or know that GAs haven't already used the vector used in CSAM for surveillance?

These features are under user control, and if something is found it doesn't report you, the device owner, to law enforcement.

Now if Apple made this into an app that users could install (or uninstall) you would likely see a different take.
 
I'm pretty sure Apple, Google, etc. are already scanning our devices whenever they want to. Maybe that's why we still don't have phones in 2021 that can last 4-5 days on a single charge with moderate to heavy usage.
 