But is it OK with you that you can never delete emails, posts, tweets and more from Apple, Google, Twitter and others?
No, which is why I don't use Google (as far as I can possibly avoid them) or Twitter, and use Apple's unencrypted cloud services only for things I want to make public or for communication with them.
Someone will be able to get into whatever you think you are protecting. Your location, everything... it's all tracked. You're fighting for an illusion.
Since privacy is an illusion, please send me photos of your credit card, birth certificate, and passport, film any hot members of your household engaged in sexual activities, and let me have a rummage through your papers and IT equipment in case there's anything interesting there.
Yes, it is Apple's job. Apple isn't legally allowed to store child abuse images; they are responsible for ensuring this.
This means it's Apple's job to scan for abuse photos before they are uploaded to the cloud.
Private companies are already required BY LAW to monitor their users.
They are required to report images or videos of child abuse (and related images) if they discover it, but they have no obligation to go looking.
The law you are referring to here is the CyberTipline Modernization Act of 2018.
That expanded the previous law from (arguably) pictures to (explicitly) videos, and allowed them to report suspected imminent violations, but it didn't create a new mandate to go looking.
Google and Facebook already report 1000s of illegal images to authorities every day.
I don’t recall any huge issue with the topic until now.
I for one don't send them any of my pictures.
It will protect children by catching perpetrators who make illegal images of children.
Only once those pictures are widely enough circulated to come to law enforcement's attention, and then only by tracing the uploads back by date.
Apple, Google, MS and Facebook have been running server-side scans for ages, but Apple doesn't seem to catch anywhere near as many images as the others. So either pedophiles don't use iCloud (unlikely) or Apple's server-side scan doesn't work as well as the competition's.
Google, Facebook, and MS all do content-analysis scans on images uploaded to their servers, and have much better training data because they incorporate all user data into the learning system. They've probably discovered that some of their categories represent child porn (and certainly some faces would raise red flags, eg Traci Lords). Apple only scans against the NCMEC database and foreign equivalents (albeit using a less sophisticated hashing algorithm than NeuralHash).
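For anyone unfamiliar with how the hash-matching approach works: the scan doesn't interpret a photo's content, it computes a perceptual hash of the image and checks it against hashes of images already known to NCMEC. Here's a minimal sketch using the open-source imagehash library as a stand-in; NeuralHash and PhotoDNA are proprietary, and the database entry, threshold, and function names below are illustrative assumptions, not anyone's actual implementation:

```python
# Toy illustration of perceptual-hash matching against a database of
# known-image hashes. Uses the open-source `imagehash` package as a
# stand-in for proprietary algorithms such as NeuralHash or PhotoDNA.
#   pip install pillow imagehash
from PIL import Image
import imagehash

# Hashes of already-known images (in reality supplied by NCMEC and
# foreign equivalents; this hex string is made up for the example).
KNOWN_HASHES = {
    imagehash.hex_to_hash("d1c4b2a09f8e7d6c"),
}

# Hamming-distance threshold: 0 means exact hash match only; a small
# positive value tolerates recompression, resizing, and minor edits.
MAX_DISTANCE = 4

def matches_known_image(path: str) -> bool:
    """Return True if the image's perceptual hash is close to any known hash."""
    h = imagehash.phash(Image.open(path))
    return any(h - known <= MAX_DISTANCE for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(matches_known_image("upload.jpg"))
```

Note what follows from the design: the system can only flag images whose hashes are already in the database, which is why it catches redistribution of known material rather than newly created material.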
If a government wanted to abuse the phone in your pocket then they would just do it. The existence of this system would have zero bearing on it.
In America (and a few other countries) the government can require companies use capabilities and technologies they have to help law enforcement, but they don't have to go out of their way to create new capabilities (eg insecure versions of iOS to use to try to unlock an iPhone).
Apple would have to sanction any "horrible" use of this technology, and while that is obviously possible, I have absolutely no fear that they will.
Until they get a court order, National Security Letter, or similar.
How does that relate to the topic?
That's an example of a service that tried to protect user privacy first being required to collect user data they didn't want to, then being forced to hand it over (to prevent "terrorism").
What exactly do you think those countries could ask Apple to scan for?
How about photos that appear in the press showing war crimes that were secret, or photos taken on farms that are protected by Ag-Gag laws, or the photos of protesters that got put into FBI anti-terrorism datasets so they can identify who was behind the camera as well as in front?
This has to be the most ridiculous, most customer-hostile idea Apple has ever come up with.
Allowing apps running on M1 Macs to individually encrypt files, without the user being able to decrypt them except with that app, might turn out to be worse. I assume the point is DRM keys, but you can imagine how Adobe or someone worse might use it, and it could be a disaster for interoperability when used thoughtlessly.
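To make the interoperability worry concrete: once an app encrypts a document with a key only it holds, the file on disk is opaque to the user and to every other tool. A minimal sketch of the effect, using a generic symmetric cipher rather than Apple's actual APIs (the key handling here is purely illustrative; the real feature ties keys to hardware and app identity):

```python
# Sketch of the interoperability problem: a file encrypted with a key the
# app never exposes is unreadable to the user and to every other program.
# Uses a generic symmetric cipher (Fernet) purely for illustration.
#   pip install cryptography
from cryptography.fernet import Fernet, InvalidToken

app_only_key = Fernet.generate_key()      # held only by the app (or hardware)
app_cipher = Fernet(app_only_key)

# The app saves the user's own document in an encrypted container.
with open("document.bin", "wb") as f:
    f.write(app_cipher.encrypt(b"the user's own work"))

# Any other tool (or the user, after the app is gone) can read the raw
# bytes but cannot decrypt them without the app's key.
someone_elses_key = Fernet.generate_key()
try:
    with open("document.bin", "rb") as f:
        Fernet(someone_elses_key).decrypt(f.read())
except InvalidToken:
    print("File exists, but only the original app can open it.")
```

The file format itself doesn't matter; once the key lives outside the user's reach, so does the data.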
Apple's solution is a more privacy-oriented one IF CSAM SCANS MUST BE DONE.
But there isn't any such obligation.
Apple is progressively moving towards fully end-to-end encrypted iCloud.
Then why didn't they announce that first, and say this was to prevent it becoming a haven for CP? For that matter, why aren't they concerned about people sharing CP in iCloud Drive?
Heck, I work with kids in a setting where we aren't allowed to release them to their parents unless the parents present a pickup tag that matches the child's. 99% of parents understand and appreciate that level of security and concern for the safety of children. They don't say, "So you think I'm some kidnapper or molester?! You don't trust me?!"
There's a difference between satisfying your obligations and playing at detectives.
I bet you don't download lists of stolen cars and run ANPR (automatic number-plate recognition) just to check them all, or whatever.