Every tool can and will be used for nefarious purposes. No need for a step-by-step guide, and no need to nitpick wording and argue otherwise.

Also, if a private company like Apple goes all in on mass surveillance, the government would more than likely offload its legal burden onto private companies and change the law to take full advantage of that.

It’s just unfortunate that there is no way for Apple or other companies to roll back this level of surveillance, and all Android manufacturers will follow very quickly, ramping up the surveillance war while every single customer becomes a victim sooner or later.

And I fear no amount of PR damage will cause Apple to roll back this surveillance software. They have released multiple articles showing commitment, with no sign of backing down. And any sales drop will be very minor (if it happens at all) compared to last year, given that most parents would rejoice anyway.

The next thing to look into is how bad this can get once the balance is already lost. Apple will be fine either way, and most Apple users (compulsory or voluntary) will have no choice but to offer unconditional surrender.

Sad state of affairs, isn't it?
 
You don't know that. Apple told you that. Even if they are encrypted, they are encrypted with a key generated by Apple. They are also decrypted on-device, which is exactly where the scanning in spyOS takes place.

So, yeah, amusing indeed.

This makes no sense. If Apple is lying to you, why would they even announce they're doing CSAM detection? Why not just stay quiet and lie about how they're not doing CSAM detection?

If your response is "well someone might find out and it'll be a huge PR mess", well then someone might find out "spyOS" is secretly taking your iMessage encryption keys too.

Huge hole in your logic there.

This is no longer amusing but rather facepalm worthy. I'm done talking to you. Have a good one.
 
I don't really understand the point of being paranoid about potential actions in the future.

Because once the infrastructure is in place, on device, it's far far more likely a jurisdiction will force Apple to use it.

This is common sense.

Once they see you've built "hammers" into everyone's phones, they are going to come knocking and force you to hit "nails" (of their choosing and for their purposes).

The reason to be worried about potential issues in the future is that it's incredibly hard to roll things like this back and out of the OS. You have to stop stuff like this from ever getting started.
 
"For now"

The tool being installed on the device is the problem.

All that's stopping it from being used for other things (or for all photos, not just iCloud ones) is a simple policy change by Apple (perhaps simply being compelled by a local authority, agency, or government).

Creating the tool and installing it on users' devices - at all - is the issue.
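
To make that concrete: the gate really is just one conditional. A hypothetical Swift sketch (all names invented; this is not Apple's actual code):

```swift
// Hypothetical sketch: the matcher and the hash database already ship in
// the OS. The only thing scoping the scan to iCloud Photos is a policy
// check like this one.
struct ScanPolicy {
    var limitToICloudPhotos: Bool   // today: true
}

func shouldScan(photoIsBoundForICloud: Bool, policy: ScanPolicy) -> Bool {
    // Flipping one flag (or being compelled to flip it) extends the scan
    // to every photo on the device, with no new code required.
    return photoIsBoundForICloud || !policy.limitToICloudPhotos
}
```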

The issue is trust.

Do you or do you not trust Apple?

I (still) trust Apple, and I hope that in time, more people will see the light and come to appreciate the elegance of what Apple is trying to do here.

Right now, what I am seeing from critics is that they somehow expect Apple to stay on a privacy high horse and flat-out refuse to do anything that would require extensive work on Apple’s part to rethink the existing status quo, which to me is neither attractive nor viable.

What I see here is Apple innovating by coming up with privacy-minded solutions for existing products and services that people long assumed required giving up their privacy.

People thought that a company had to collect as much data as possible to build AI or maps; Apple proved them wrong by showing that it is very much possible to keep developing in these areas without collecting anywhere near as much data as those companies would have you believe.

Apple also launched ATT to show that advertising is still very much viable without the degree of invasive tracking that Facebook is now infamous for.

It’s no different with CSAM detection. Right now, the assumption is that everything you upload to the cloud is by default no longer private. What if Apple had a way to change that? What if they could offer users fully encrypted iCloud storage while still being able to identify whether users have child pornography on their devices, without using AI to scan the actual images the way companies like Facebook do with uploads?
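
As a toy illustration of that flow (every name here is invented, and SHA-256 is only a stand-in: the real system uses NeuralHash, a perceptual hash, plus private set intersection, so unlike this toy the device never even learns whether a hash matched):

```swift
import Foundation
import CryptoKit

// Stand-in for a perceptual hash; the real system uses NeuralHash.
func perceptualHashStandIn(_ imageData: Data) -> Data {
    Data(SHA256.hash(data: imageData))
}

struct SafetyVoucher {
    // In this toy it's just the raw bytes; the real voucher is encrypted
    // and only becomes decryptable server-side after the match threshold.
    let payload: Data
}

func voucher(for imageData: Data, knownCSAMHashes: Set<Data>) -> SafetyVoucher? {
    guard knownCSAMHashes.contains(perceptualHashStandIn(imageData)) else {
        return nil   // non-matching photos upload with no readable voucher
    }
    // A matching photo gets a voucher attached to its iCloud upload.
    return SafetyVoucher(payload: imageData)
}
```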

And honestly speaking, there are likely far easier and more practical ways for a government to spy on its people than relying on CSAM detection (or some alteration of it).

I think a lot of the criticism is coming from organisations and companies who don’t want to see Apple succeed, as well as from people who are still unable to wrap their heads around this paradigm shift that Apple is delivering. Your phone isn’t scanned, your photos in the cloud aren’t scanned, and throughout this entire process, your photos are subject to scrutiny only once, and even then, in the least invasive manner possible.

Compared to Google Drive actively scanning every image I upload, I find this a much more palatable tradeoff.

Apple is upending the existing status quo even as we speak. And I want to be a part of that new world order.
 
It won't run unless you turn on iCloud Photos.

"Apple is also reinforcing that if a user does not use iCloud Photos, then no part of the CSAM detection process runs. This means that if a user wants to opt out of the CSAM detection process, they can disable iCloud Photos."
The problem is not that such a process exists. No one would be this outraged if Apple ran the hashes on their servers. The problem is that the hashes themselves are included in spyOS. The scan still runs on-device when iCloud Photos is enabled, even though the results are only decrypted in iCloud. But in a future update, Apple could easily make the scans run when iCloud is off. Apple is essentially saying, "We made a backdoor in the OS, but don't worry because we would never use it."
 
This is actually a really good breakdown of the various arguments in this situation (from this afternoon):


Good link
The part in bold was curious to me...


Apple has legitimate reasons to step up its child protection efforts. In late 2019, The New York Times published reports of an “epidemic” in online child sexual abuse. It blasted American tech companies for failing to address the spread of CSAM, and in a later article, NCMEC singled out Apple for its low reporting rates compared to peers like Facebook, something the Times attributed partly to *the company not scanning iCloud files*.


So has Apple *not* been scanning iCloud photos on their own servers for CSAM?

If not - the proper step here is to scan the photos on their own servers - not install a way to do it on our devices.
Users fully expect things that are uploaded to be scanned. No objection there.
 
I'm sure you're right - though other cloud companies already do this, so I don't know how much cost it would add.

I don't think it will affect battery life much at all on the device.

I know it might seem silly, but it's just the principle of them using my device for it and not giving me a choice in the matter, other than disabling a feature I highly value as my family has 200+ GB of photo/video history on iCloud Photos between the 5 of us.

Other cloud companies monetize the data which would pay for the processing costs. At the very least the metadata is highly valuable as Google and Amazon can track your location and spending habits. I don't think Apple has a way to monetize every single photo taken.
 
The issue is trust.

Do you or do you not trust Apple?
Considering they just introduced what essentially amounts to a backdoor into Photos after spending years sitting on the privacy high horse, no. What would make me trust them is not releasing this easily abused feature in the first place, rather than telling me to have faith that they will "only" use it (without any supporting evidence) to check that I'm not a pedophile.
 
Because once the infrastructure is in place, on device, it's far far more likely a jurisdiction will force Apple to use it.

The infrastructure has been in place since 2011, with on-device facial detection in iPhoto.

This is common sense.

Once they see you've built "hammers" into everyone's phones, they are going to come knocking and force you to hit "nails" (of their choosing and for their purposes).

"their purposes" of losing customers? Doesn't sound like a win for Apple. They want to sell more products, not less. Don't know how is that common sense.
 
For the CSAM stuff, why don’t they do this BEFORE an image is saved to the device? Check the hash when someone goes to save a photo and warn them that it’s a match, instead of this horrible “safety voucher” idea where the user has no idea what’s happening. If they try to screenshot it, do the same hash check and throw up a warning. Much simpler and more transparent.
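
Something like this rough Swift sketch is what I have in mind (hypothetical names; not how Apple's shipped design behaves, and SHA-256 is just standing in for a perceptual hash):

```swift
import Foundation
import CryptoKit

// Rough sketch of a save-time warning instead of a hidden voucher.
enum SaveOutcome {
    case saved
    case blockedWithWarning
}

func savePhoto(_ imageData: Data, knownCSAMHashes: Set<Data>) -> SaveOutcome {
    let hash = Data(SHA256.hash(data: imageData))  // perceptual-hash stand-in
    if knownCSAMHashes.contains(hash) {
        // Warn the user up front, before the image ever lands on the
        // device, instead of silently attaching a voucher later.
        return .blockedWithWarning
    }
    return .saved
}
```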

Goal should be to PREVENT distribution. The iCloud Photo Library and hidden safety voucher idea doesn’t do that. And nothing is solved if that feature is disabled.

Not well thought out.
 
If Apple is lying to you, why would they even announce they're doing CSAM detection? Why not just stay quiet and lie about how they're not doing CSAM detection?

Because Apple is marketing this as a feature. I would think that if they wanted to catch the six or seven idiots putting their child porn up on iCloud, they would stay silent about it.

Enjoy your time at "Camp Apple Does No Evil". You can be "done" talking to me all you want. The responses from other users to your posts speak for themselves. The crowd is not with you on this one, son. It's a hell of a hill to die on, but at least you'll be dead.


 
For the CSAM stuff, why don’t they do this BEFORE an image is saved to the device? Check the hash when someone goes to save a photo and warn them that it’s a match, instead of this horrible “safety voucher” idea where the user has no idea what’s happening. If they try to screenshot it, do the same hash check and throw up a warning. Much simpler and more transparent.

Goal should be to PREVENT distribution. The iCloud Photo Library and hidden safety voucher idea doesn’t do that. And nothing is solved if that feature is disabled.

Not well thought out.

Better, I'd agree there.

That said, I'm still not interested in having my device be part of a society-wide dragnet in this fashion.

I don't even know where this sort of thing would "stop".
Think of all the things you might come across and be interested in looking into, reading about, watching...

Do we really want devices that have a capability that amounts to being State controlled "thought police"?
 
In many places the age of consent is less than 18, including in many American states (IIRC the majority). I bet you went to school with a few people who had sex before 18, even if you didn’t.
When the word CHILD is mentioned, I hardly think it pertains to someone just under 18. I live in the States, and the age of consent is actually 17 in select states, so someone that age is not a CHILD. And if you think it's acceptable for minors to send nude pics, then Apple is definitely targeting you. As for my personal life, that's none of your concern or business.
 
No one would be this outraged if Apple ran the hashes on their servers.

Nope. I can see the headlines: "Apple now scans every single photo you upload to iCloud."

But in a future update, Apple could easily make the scans run when iCloud is off. Apple is essentially saying, "We made a backdoor in the OS, but don't worry because we would never use it."

On the subject of what Apple might do: John Gruber of Daring Fireball suggests this is Apple's way of enabling end-to-end encryption for photos and backups too, by giving law enforcement only the data it can extract after a high threshold of detections. In return, customers get end-to-end encryption. This is far better, IMO, than the government being able to come in and decrypt all of my iCloud data, which is what happens today.
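
If Gruber is right, the shape of it would be something like this toy sketch (all names invented; the real design would use threshold secret sharing rather than simple counting):

```swift
import Foundation

// Toy illustration of the threshold idea: below the threshold the server
// can read nothing; at or above it, only the matched images open up.
struct VoucherShare {
    let payload: Data   // one share per matched image
}

func lawEnforcementVisibleData(shares: [VoucherShare], threshold: Int) -> [Data]? {
    // Below the threshold, nothing is decryptable and the server learns nothing.
    guard shares.count >= threshold else { return nil }
    // At or above it, only the matched images become readable; the rest of
    // the (hypothetically end-to-end encrypted) library stays sealed.
    return shares.map { $0.payload }
}
```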

No point in guessing what Apple might do. It could be good or bad. You could be wrong or John Gruber could be wrong.
 
John Gruber of Daring Fireball suggests this is Apple's way of enabling end-to-end encryption

As of right now, that is just gas pulled from John's butt.
Keep in mind that Gruber is so far up Apple's rear end he needs a flashlight.

He's made an entire career, nearly two decades of it, out of pontificating about Apple.
To say he's biased here is incredibly generous.
 
On the subject of what Apple might do: John Gruber of Daring Fireball suggests this is Apple's way of enabling end-to-end encryption for photos and backups too, by giving law enforcement only the data it can extract after a high threshold of detections. In return, customers get end-to-end encryption. This is far better, IMO, than the government being able to come in and decrypt all of my iCloud data, which is what happens today.

From that Verge article I posted above: "CSAM is illegal and abhorrent. But as the open letter to Apple notes, many countries have pushed to compromise encryption in the name of fighting terrorism, misinformation, and other objectionable content. Now that Apple has set this precedent, it will almost certainly face calls to expand it. And if Apple later rolls out end-to-end encryption for iCloud — something it’s reportedly considered doing, albeit never implemented — it’s laid out a possible roadmap for getting around E2EE’s protections."

It's that very last line that I take issue with when someone makes the argument that the way they're doing this gives us more E2EE. That's technically true. But it's penny wise and pound foolish. They're potentially giving us E2EE while providing a way (albeit difficult with many steps) to get through that E2EE if need be.
 
From that Verge article I posted above: "CSAM is illegal and abhorrent. But as the open letter to Apple notes, many countries have pushed to compromise encryption in the name of fighting terrorism, misinformation, and other objectionable content. Now that Apple has set this precedent, it will almost certainly face calls to expand it. And if Apple later rolls out end-to-end encryption for iCloud — something it’s reportedly considered doing, albeit never implemented — it’s laid out a possible roadmap for getting around E2EE’s protections."

It's that very last line that I take issue with when someone makes the argument that the way they're doing this gives us more E2EE. That's technically true. But it's penny wise and pound foolish. They're potentially giving us E2EE while providing a way (albeit difficult with many steps) to get through that E2EE if need be.

Apparently we need to CC: Gruber.

This would be the 180-degree opposite of what he's speculating.
 