You think CSAM is more prevalent than domestic abuse or drunk driving?
(or pick any of a zillion other issues and crimes out there)
Ummm. Apple can’t do anything about one person beating their kid or another person drinking and driving. This is about them trying to make their platform the way they want it, as fine a line as that is.

The irony is that Apple promotes hookup apps in its App Store, where people have been hurt or worse.

The moral of the story is that society as a whole is filled with lots of garbage. You can take your pick of issues. This is what Apple is choosing to zero in on with their platform.
 
But the stakes are much higher. This feature has the potential to ruin your whole life even if you did nothing wrong. There are several ways it could fail:

- The hash algorithm could be attacked, and harmless images could be constructed with the same hash as an image in the library (see the sketch after this post). The algorithm is not published and cannot be checked.

- Someone could send you images from the library that get automatically uploaded to your iCloud library; the Threema messenger has this functionality, for example, and I bet others do as well.

- Apple could simply make mistakes when reporting users.

- There could be software bugs at all levels; we all know how bad Apple software can be (after all, we are talking about the company that wrote the damned photoslibraryd daemon).

And then there is the huge potential for this feature to be abused by countries like China and plenty of others.

Even if the chance is smaller than winning the lottery (whatever that chance is), I do not want to bet my reputation on this “feature“ working without a flaw. And always remember: even if it is unlikely that you win the lottery, it is very likely that someone does…
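
To make the first failure mode concrete, here is a toy sketch in Python against a deliberately crude 64-bit "average hash". This is NOT NeuralHash (which is unpublished, so nobody outside Apple can test it), and a real attack would use far subtler perturbations; but the underlying weakness, that perceptual hashes are designed to tolerate image changes, is the same.

```python
# Toy demo: forge a collision against a crude 64-bit "average hash".
# NOT NeuralHash; it only illustrates why change-tolerant, unpublished
# perceptual hashes carry a collision risk.
import numpy as np

def average_hash(img):
    """64-bit aHash: 8x8 block means, thresholded at their overall mean."""
    h, w = img.shape
    blocks = img.reshape(8, h // 8, 8, w // 8).mean(axis=(1, 3))
    return (blocks > blocks.mean()).flatten()

def force_collision(img, target_bits, step=40.0, passes=60):
    """Nudge block brightness until img's hash equals target_bits."""
    out = img.astype(float).copy()
    bh, bw = out.shape[0] // 8, out.shape[1] // 8
    for _ in range(passes):
        bits = average_hash(out)
        if np.array_equal(bits, target_bits):
            break
        for k in np.flatnonzero(bits != target_bits):
            i, j = divmod(int(k), 8)
            delta = step if target_bits[k] else -step   # push block over/under the mean
            out[i*bh:(i+1)*bh, j*bw:(j+1)*bw] += delta
        out = np.clip(out, 0.0, 255.0)
    return out

rng = np.random.default_rng(0)
harmless = rng.uniform(100.0, 160.0, (64, 64))  # stand-in "harmless" image
database = rng.uniform(0.0, 255.0, (64, 64))    # stand-in "database" image
forged = force_collision(harmless, average_hash(database))
print(np.array_equal(average_hash(forged), average_hash(database)))  # True
```
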
Exactly. Rather than going after the bad guys with regular, constitutional law enforcement tools, they are turning YOUR devices into snitch tools. The police could find every drug dealer, too, if they didn't have to get warrants and just invade and search every house. This is wrong and evil and an incredible betrayal of Apple customers. They will rue the day they did this.
 
According to FotoForensics, a private online photo-research service, the share of CSAM it discovers among the more than a million images it processes per year is 0.056%.
 
With the greatest respect, worrying only about what is clear and present now, and not about where the currently active path may lead, will almost certainly mean that by the time you arrive at a new point on the path there is little or nothing left for you to control, because you are always behind the curve.

With that said, I do understand your point about not wanting to tie your thoughts up with uncertain things. But without some forethought, you may as well resign yourself to not having control over anything, because, almost inevitably, by the time you become aware of a change in policy or direction it will already be ingrained and there will be little to nothing you can do. By being aware of what is coming down the pipe you MAY at least, along with enough others, have a chance of changing the direction of either yourself or the incoming "thing"... just a thought...

You have to study this subject a lot, which you apparently have and I have not. I simply don’t fully understand it; it’s all new to me. The tech has just been announced and is far from a rollout, so I’ll just wait and see. If there are better solutions and they change course, or if the EU won’t allow it at all, I will have worried about nothing. I will keep an eye on it, but it won’t affect my sleep. Not completely oblivious, but also not burning my iPhone.
 
I'm personally not defending either side, but I will say a member has a right to defend Apple if they want, and others here don't have a right or just cause to question them. It's a discussion forum, not us against the evil giant called Apple. 🙄
Sure, and your privacy is still being exposed, so we're OK now then?

Why do you get outraged whenever the internet gets outraged?

Read my post history using the search function. I've bashed Apple many, MANY times on this forum. This isn't me taking sides.

And people complaining about how the technology works is literally the point. Unless you just want to bash Apple because it makes you feel good, then maybe how the technology works isn't the point *to you*. In that case, I have nothing more to say to you.
Outraged? :D?

If you want to believe Timmey the almighty squeezer & his team are our saviors, by all means go for it; I'm not trying to convince you. After all, you keep bringing up that people don't understand how the technology works, when it has nothing to do with how, but why.

What would be the point of how it works?

Why did we go from "a fundamental human right" to "if you didn't do anything illegal you shouldn't have to worry"?

Are you OK with tech giants being this hypocritical? Don't get me wrong, they all do it, but since Apple, who apparently were "standing in favor of privacy," have now flipped like a pancake, we should all check how it works and be fine with it?

You are paying "premium" for a device which will use its hardware/software to scan private content to see if you are a pedo? Why not just release the iPhone CSI edition, or maybe have Tim & his minions apply to the FBI altogether?

You can spin it however you want, but this shouldn't even exist coming from Apple. Google and Microsoft may data-mine you to death, but at least they don't fake it at this level.
 
We get it, you don't have a technical background, let alone a PhD, but you have to understand this couldn't be simpler. The user-to-Apple connection is secured; no one else is involved or can see the data. Upload the data to Apple (it can go to a temporary buffer), then scan it there. If the result is OK, Apple can proceed as normal and put the file in the user's iCloud account. If there is a hit, don't put it in the user's iCloud account; instead, delete the content or store it as evidence and submit a report.

This isn't rocket science. One of my former colleagues, a security and network expert, regularly consults on CP cases and is often tasked with securing forensic evidence.
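
For what it's worth, here is a minimal Python sketch of the server-side flow described above. Every name in it (the digest set, the storage and reporting stubs) is a hypothetical stand-in, not Apple's actual API, and a real scanner would use a perceptual hash rather than exact SHA-256, since re-encoding a photo changes its bytes.

```python
# Minimal sketch: upload first, scan server-side, then either commit the
# file to the user's library or preserve it as evidence and report.
import hashlib

KNOWN_BAD_DIGESTS: set[str] = set()  # stand-in for the CSAM hash database

def store_in_library(user_id: str, photo: bytes) -> None:
    print(f"stored photo for {user_id}")                  # storage stand-in

def preserve_and_report(user_id: str, digest: str) -> None:
    print(f"evidence preserved, report filed: {digest}")  # reporting stand-in

def handle_upload(user_id: str, photo: bytes) -> None:
    digest = hashlib.sha256(photo).hexdigest()
    if digest in KNOWN_BAD_DIGESTS:
        preserve_and_report(user_id, digest)  # hit: do not commit the file
    else:
        store_in_library(user_id, photo)      # clean: proceed as normal

handle_upload("user-1", b"\x89PNG...not a real photo")
```
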
And what happens with false positives?
 
“I don’t like being scanned for …”, “Oh, my privacy, my privacy!”…
Sorry guys, but the moment you use technology and the internet, most of your privacy is gone, especially for those here complaining about Apple while using TikTok, Facebook, Messenger, WhatsApp, Google apps, and anything else that connects to the internet.
Want absolute privacy? Get rid of your internet and cell phone.
Privacy is one thing. But Apple is taking it a step further and censoring.

Today it is Apple scanning for child pornography and reporting you to the government. Tomorrow it’s Apple scanning for and removing pirated music and books, or scanning messages and email in an attempt to thwart an attack. Apple is the new Xiaomi.
 
There are three choices:

- The device scans the image and should prevent it from being transmitted. The user has control, and Apple doesn’t have illegal content on their servers.

- The cloud scans the image and that means no user control. That means Apple gets the illegal content and the user is surveilled.

- No protections at all for children.

First of all, this system does nothing to prevent transmission of CSAM. Assuming that iCloud Photo Library is turned on—a big "if," considering the attention this is getting—illicit photos are still uploaded to iCloud, they just include a "voucher" that could lead Apple to eventually investigate. If Photo Library is turned off, child pornographers can transmit—via iMessage, AirDrop, WhatsApp, Telegram, Signal, Dropbox, etc.—all the CSAM they want.

Users also have no control under this system. Other than the "choice" of disabling iCloud Photo Library—really a Hobson's choice for innocent iOS users—we'll have no idea if or when one of our photos is tagged with a voucher, no idea if or when new databases are added to our phones, and no idea if or when Apple decides it's kosher to peruse our private photos.

Your third option is such a ridiculous straw man that it hardly warrants a response. "No protections at all for children" other than billions in law-enforcement resources, multiple high-profile advocacy organizations, and universal social opprobrium.

And there's another option you're missing—Apple can gtfo of all our photo libraries and continue complying with lawfully-obtained warrants for suspected child pornographers as they do now.
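
For anyone who wants the mechanics spelled out, here is a rough Python sketch of the client-side flow as Apple's public summary describes it. All names are illustrative; the real design uses NeuralHash plus private set intersection rather than a plain set lookup, and hides the match result from the device. The point it illustrates stands either way: the photo is uploaded regardless, and a match only changes what rides along in the voucher.

```python
# Rough sketch of the on-device "safety voucher" flow. Illustrative only.
from dataclasses import dataclass

DATABASE_HASHES: set[int] = set()  # stand-in for the blinded CSAM hash list

@dataclass
class SafetyVoucher:
    matched: bool   # hidden from the user (and, below threshold, from Apple)
    payload: bytes  # encrypted image derivative, unreadable until threshold

def perceptual_hash(photo: bytes) -> int:
    return hash(photo)  # stand-in for the unpublished NeuralHash

def prepare_upload(photo: bytes) -> tuple[bytes, SafetyVoucher]:
    voucher = SafetyVoucher(
        matched=perceptual_hash(photo) in DATABASE_HASHES,
        payload=photo,  # the real system encrypts a derivative, not the raw photo
    )
    return photo, voucher  # the photo itself is transmitted regardless
```
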
 
Is Tim Cook about to retire, and like a disgruntled employee walking away from a dumpster fire they started, leaving this mess for someone else to inherit?

This isn’t the same Apple I’ve bought from for almost three decades.
As said above, they are up to something or their hand is being forced. Caring for children? Then why do album covers with near nudity get prominence on the Music app browse page?
I think that there has been a struggle between the new "woke" hires and the Jobsian old guard, and the wokesters won. Too often, the younger generation views the purpose of corporations as social activism on issues just like this, rather than making insanely great products for customers. I am shocked that more people at Apple don't see this, but apparently they don't, or are just outnumbered and outflanked.
 
 
Excellent point about the existing law-enforcement and advocacy resources. It's not like there aren't already powerful and influential forces combating this evil.
 
The only thing Apple is doing with this is using all of us as beta testers for a more sinister system. It's not really about "the children." It's about fine-tuning a system that is already deployed and adaptable to suit the needs of whoever asks to use it, as long as they pay Apple their asking price.

Don't fool yourself. I am a die-hard Apple user and loved them for their privacy stance, but this is a full 180 from all their videos and ads about privacy. It's a joke to think child abuse is rampant enough to justify invading the privacy of MILLIONS of users. Say whatever you want about "the children"; this is not good on any level.
 
...and how long before the "scanning iMessages" isn't just for kids under parental controls?

I can't believe people are blind to the bad precedents and paths we are going down here.
But isn’t this optional? I must be missing something, but if it’s disabled by default I don’t see the problem.

Even more so considering Apple has refused in the past to unlock devices at police request in the name of privacy.
 
But the stakes are much higher. This feature has the potential to ruin your whole life even if you did nothing wrong. There are several ways it could fail:

No.

- The hash algorithm could be attacked, and harmless images could be constructed with the same hash as an image in the library. The algorithm is not published and cannot be checked.
- Someone could send you images from the library that get automatically uploaded to your iCloud library; the Threema messenger has this functionality, for example, and I bet others do as well.
- Apple could simply make mistakes when reporting users.
- There could be software bugs at all levels; we all know how bad Apple software can be (after all, we are talking about the company that wrote the damned photoslibraryd daemon).

1. You'd need to find or generate images that actually create a collision. You have a better chance of finding a UUID collision, and finding just ONE of those would take around 80 years on today's computers.
2. This is why Apple set an extremely high threshold on the number of matches before an Apple employee can even decrypt the images. Meaning you'd potentially need to find hundreds of collisions (see the threshold sketch after this list).
3. Apple stated that the chance of a mistake is one in a trillion.
4. Assuming you're unlucky enough to be that one-in-a-trillion mistake, Apple will manually review the images in question and correct the mistake.
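
The threshold in point 2 can be illustrated with textbook Shamir secret sharing, which is the idea behind the threshold scheme Apple's summary describes: each match releases one share of a decryption key, and fewer than the threshold number of shares reveal nothing. A minimal Python sketch of the concept, not Apple's actual construction:

```python
# Threshold secret sharing (Shamir): T shares reconstruct the key,
# T-1 shares are useless. One share is released per hash match.
import random

P = 2**127 - 1  # prime field modulus
T = 5           # threshold: shares needed to reconstruct

def make_shares(secret: int, n: int) -> list[tuple[int, int]]:
    coeffs = [secret] + [random.randrange(P) for _ in range(T - 1)]
    def f(x):  # evaluate the degree-(T-1) polynomial at x
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    # Lagrange interpolation at x = 0 recovers the secret
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

key = random.randrange(P)
shares = make_shares(key, n=10)            # one share per matching image
assert reconstruct(shares[:T]) == key      # T matches: key recovered
assert reconstruct(shares[:T - 1]) != key  # T-1 matches: still locked
```
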

And then there is the huge potential for this feature to be abused by countries like China and plenty of others.

Chinese iCloud data centers are already reviewed by the government. This is why, if you fly to China with your iPhone and switch your country setting to China, Apple will warn you that you're on Chinese servers, which are treated differently.

Even if the chance is smaller than winning the lottery (whatever that chance is), I do not want to bet my reputation on this “feature“ working without a flaw. And always remember: even if it is unlikely that you win the lottery, it is very likely that someone does…

Chances of winning the Powerball: 1 in 292 million
Chances of being erroneously flagged by Apple: 1 in a trillion

You have a MUCH higher chance of winning the Powerball. And even after being flagged, Apple will review and re-activate your account if it was in error.

Other perspectives:
- Odds of dying in a car crash: 1 in 107
- Odds of being struck by lightning: 1 in 1.2 million
- Odds of dying from a shark attack: 1 in 3.7 million
- Odds of dying on a plane ride: 1 in 29.4 million


Think about it. 1 in a trillion. What other event in the world happens with 1-in-a-trillion odds?
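
For what it's worth, the quoted lottery point can be made concrete with the post's own numbers: the chance that at least one account out of n is wrongly flagged is 1 - (1 - p)^n. A quick Python check, using a round one-billion-account figure purely for illustration:

```python
# Per-account risk vs. "someone always wins the lottery", in one line.
p = 1e-12           # claimed per-account chance of an erroneous flag
n = 1_000_000_000   # rough number of iCloud accounts (illustrative figure)

p_any = 1 - (1 - p) ** n
print(f"chance someone is wrongly flagged: {p_any:.4%}")  # ~0.1%
```

So both intuitions can hold at once: any individual's risk is negligible, while the chance that somebody, somewhere is flagged in error is small but real.
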
 