It's a bit interesting how everyone who is saying, "Well, I have nothing to hide but..." is listing what they are going to do as if they *do* have something to hide.

Most of this stuff is not new; it's at least a decade old. But if it *is* new to you, great, you can take this info and make an appropriate decision for yourself. But really, a lot of these responses have an amusing reactionary edge to them.
 
Aaaand I'm spending the rest of the day removing everything I have from iCloud. Not because I am doing anything wrong but because any moderation system is prone to abuse and false positives and I literally cannot afford to get all my **** shut down one day.

I actually want to entirely exist offline at this point. Back like 2010 again. Files on disk. Physical backups on disks off site.
Sounds like you really need a cabin in a remote woods area with no TV, internet, power, etc.
 
Most of this stuff is not new; it's at least a decade old.

I mean, all of this language can be chalked up to Tim reducing exposure to risk on their end. We saw what happened to the Telegram guy for not taking action about that kind of stuff.

There is nothing new under the sun
 
I did not know that. But then it sounds like they could always scan our photos and never really needed to ask us for permission.
 
It’s in their support documents here: https://support.apple.com/en-us/102651

It says they treat MD5 hash sums like normal data by encrypting them with their own keys. Presumably it's been used for duplicate removal, but seriously, that's the easiest thing our devices can do locally. Mind you, this hash is for unshared, non-public photos.
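
To make the "easiest thing our devices can do locally" point concrete, here's a rough sketch of how duplicate removal via MD5 could work on-device. This is just my own toy example of the generic technique, not Apple's actual code; the function name is made up.

```swift
import Foundation
import CryptoKit

// Toy duplicate finder: hash each file's bytes with MD5 and group
// files that produce an identical digest. (Illustrative only.)
func duplicateGroups(among urls: [URL]) -> [String: [URL]] {
    var groups: [String: [URL]] = [:]
    for url in urls {
        guard let data = try? Data(contentsOf: url) else { continue }
        let hex = Insecure.MD5.hash(data: data)
            .map { String(format: "%02x", $0) }
            .joined()
        groups[hex, default: []].append(url)
    }
    // Hashes that more than one file maps to are the duplicates.
    return groups.filter { $0.value.count > 1 }
}
```

Identical bytes always give an identical digest, so spotting exact duplicates needs nothing beyond the device itself.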
 
Most criminals are not that savvy. Plus, even if they are, all it takes is one slip-up to get them flagged.
 
CSAM isn't something you trust - it's something you should not be in possession of.
We are specifically referring to the system that identifies CSAM, and how such a system could be used to scan for other “unhealthy, dangerous” material as determined by the government, even if that material is harmless and not CSAM by any definition. In short, this system opens the door to defining practically anything as dangerous, for no reason.
 
In reality this is so Project *** 2025 *** can take effect 😬 and they will scan for OTHER things THEY don't like.
And the most extreme case would be that they don't like “everything,” thus allowing them to prosecute “anyone” at will.
 
We are specifically referring to the system that identifies CSAM, and how such a system could be used to scan for other “unhealthy, dangerous” material as determined by the government, even if that material is harmless and not CSAM by any definition. In short, this system opens the door to defining practically anything as dangerous, for no reason.

Aren't these scans based on hashes generated by NCMEC for the specific purpose of identifying CSAM? Moreover, NCMEC isn't part of any government.

Also, the way this scanning works isn't image recognition or anything like that. It compares hashes, mathematical representations of the files. It only flags files whose hashes match the hash of known CSAM.
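
Just to show what "comparing hashes against a list" means in its simplest form, here's a toy sketch. It is not the NeuralHash/NCMEC pipeline Apple described, the hash values are placeholders, and the function name is mine:

```swift
import Foundation
import CryptoKit

// Placeholder set standing in for a database of known-bad hashes.
// (Made-up values for illustration, not real data.)
let knownBadHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c8",
    "9e107d9d372bb6826bd81d3542a419d6"
]

// A file is flagged only when its digest exactly matches an entry in the set.
func isFlagged(_ fileURL: URL) -> Bool {
    guard let data = try? Data(contentsOf: fileURL) else { return false }
    let hex = Insecure.MD5.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
    return knownBadHashes.contains(hex)
}
```

With an exact checksum like this, changing a single byte gives a completely different digest, which is why Apple's proposal used a perceptual hash (NeuralHash) rather than a plain checksum.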

Edit to add: Don't take my word for it. Here's Apple's whitepaper on how their on-device scanning was going to work before they scrapped it.

 
Aren't these scans based on hashes generated by NCMEC for the specific purpose of identifying CSAM? Moreover, NCMEC isn't part of any government.

Also, the way this scanning works isn't image recognition or anything like that. It compares hashes, mathematical representations of the files. It only flags files whose hashes match the hash of known CSAM.

Edit to add: Don't take my word for it. Here's Apple's whitepaper on how their on-device scanning was going to work before they scrapped it.

Still, the government can scan for pictures it doesn't like (which could be any pictures), and use the same technology used in the CSAM scan to flag people who stored such images. After that, the government can prosecute those individuals. Whether it is efficient or not is another matter, but it can be used in that manner.

In short, our concern is that the tool will likely be misused for nefarious purposes.
 
Yeah, so the other issue is that it scans photos and uses AI to figure out whether your images have the same features as a photo in the CSAM database, and whether those features are organized in the same way. The reason is that pixel-to-pixel comparisons often fail due to cropping, resizing, and compression variability. The problem with that has been flagged by other researchers in the space, who say that false positives can occur and that using the system will lead to routine checks of people's private albums. And for anyone using ADP this could mean outright bans, since Apple wouldn't be able to perform a manual check.
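
For anyone curious what a "feature-based" comparison looks like in the simplest case, here's a bare-bones difference-hash (dHash) sketch. This is my own illustration of the general perceptual-hashing idea, not NeuralHash: the image is reduced to an 8x9 grayscale grid, each bit records whether a pixel is brighter than its right-hand neighbor, and two images count as a match when the Hamming distance between their hashes is small. That tolerance to resizing and recompression is also exactly where the false-positive worry comes from.

```swift
// Toy difference-hash (dHash), not NeuralHash: expects an image already
// downscaled to an 8-row by 9-column grayscale grid (0-255 per pixel).
func dHash(_ gray: [[UInt8]]) -> UInt64 {
    var hash: UInt64 = 0
    for row in 0..<8 {
        for col in 0..<8 {
            hash <<= 1
            // One bit per pixel pair: is this pixel brighter than its right neighbor?
            if gray[row][col] > gray[row][col + 1] { hash |= 1 }
        }
    }
    return hash
}

// Two images are treated as "the same" when only a few of the 64 bits differ.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// e.g. hammingDistance(dHash(photoA), dHash(photoB)) <= 5 might count as a match,
// which survives cropping/compression noise but can also collide on unrelated images.
```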


I think, though, that this was dropped by Apple, but looking at the ToS after this update makes me think there's really no reason they couldn't just use it.
 
Still, the government can scan for pictures it doesn't like (which could be any pictures), and use the same technology used in the CSAM scan to flag people who stored such images. After that, the government can prosecute those individuals. Whether it is efficient or not is another matter, but it can be used in that manner.

Except they'd have to convince Apple et al to implement that scanning. Easy sell for an NGO, incredibly tough for Uncle Sam.

Yes, the technology can probably be used for something like that. But good luck making it happen and getting other people to agree to it.
 

How did they nail it? I couldn't find a decent image from the Human CentiPad episode. Everyone just clicked agree and then Apple came and started collecting them for a human centipede, which they agreed to in the terms and conditions.

Butters is the only one they didn't come for, because he actually read it and clicked disagree.

How could someone actually disagree with the terms in real life? They'd have to immediately stop using all Apple products, without even a chance to get back in to pull their data out without agreeing.

Both scenarios are completely ridiculous and should be illegal. Which is (was?) South Park's genius: demonstrating that fact with an over-the-top, technically plausible scenario.
 
Why is it so hard for people to just turn off “Photos” and put their photos onto an external hard drive as a backup? People care so much about Apple, and others, looking at their pictures, but don't even think twice about keeping their photos off-premises. Makes zero sense at all. If your pictures, or any data at all, are not on a non-connected physical device within your reach, you can pretty much assume someone else is already looking at them.
 
Why is it so hard for people to just turn off “Photos” and put their photos onto an external hard drive as a backup?

Apple didn't make it as simple as using iCloud. If I could back up my phone to my NAS instead of iCloud I would love to.

I suppose I could back up to a PC, then store that on the NAS, but that's still nowhere near as easy as using iCloud. Plus, I'd need to dig up a USB-A to C cable.
 
How did they nail it? I couldn't find a decent image from the Human CentiPad episode. Everyone just clicked agree and then Apple came and started collecting them for a human centipede, which they agreed to in the terms and conditions.

Butters is the only one they didn't come for, because he actually read it and clicked disagree.

How could someone actually disagree with the terms in real life? They'd have to immediately stop using all Apple products, without even a chance to get back in to pull their data out without agreeing.

Both scenarios are completely ridiculous and should be illegal. Which is (was?) South Park's genius: demonstrating that fact with an over-the-top, technically plausible scenario.
Kind of seems like you are giving South Park too much credit.

The terms and conditions are called a "contract of adhesion." We agree to these all the time.

Airline ticket? Check.

Automobile financing? Check.

Doctor visit? Check.

It's impossible to negotiate these; it's take it or leave it. So they are not illegal, because you have the option to decline them. We accept them out of convenience.
 