If you put something on the cloud you are transferring it to someone else's servers, whether Apple, Google, your web hosting account, and so on. Someone else's hardware and responsibility, so yes, a company scanning what you put on their servers is expected and should not be a surprise. Put something on the cloud you do not want to be scanned? Encrypt it.

Quite different from coming into my private space uninvited and looking through my content.

In the world of computer repair, if I fix your computer and find child pornography on it, I must report it or I'm an accomplice to the crime. My teacher told a story about a machine that was brought in for service. When they booted up the PC, I believe they saw images immediately. His boss said to stop work immediately and call the police.

Same thing would apply to any cloud service holding onto these images and videos. Whether public or private, they would be aiding the criminal.
 
Unfortunately while most people talk about privacy those same people are willing to give up their privacy if the government promises them some form of safety.

This comes from the mouths of the same people who tap "Agree" on Facebook's terms when they sign up, then scream "you don't have the right" without reading the fine print of a free service. You're bartering your information in exchange for use of their service.

Private businesses are no better than the government since they own it.
 
Who took this photo of this pamphlet and why would anyone else have that photo in their library? Same deal with this picture of a banner?
Nobody. The system identifies similar photos by neural hash. It seems a hard point to get across here that the matching is not done by an exact image comparison. It's a "perceptual hash". Computers don't really perceive anything as is commonly understood, but they're reasonably good at finding similar things, whatever they happen to be. Until things go wrong.

Edited with example:

I'll give you an example. Here is a perceptual hash code of an image:

0x8F22D679CB113AE7406BB276C2F994

What's the image of? A can of Diet Coke? A banana? A porn image? A picture of a Taiwanese protest poster?

I'm telling you it's a porn image. And I'm telling the owners of the CSAM database that it's a porn image, because I'm a very good hacker.

Now, when you take a snap on your iPhone, it will produce a perceptual hash code.

Let's say it produces a value of

0x8F22D679CB113AE7406BB276C2F994

Is that a porn picture? Yes. Because it matches a perceptual hash code in the CSAM database. The phone is good at figuring out that your picture is visually similar to mine.

But I lied. It's not really a porn picture at all. It's a photo of a can of Diet Coke and a banana. And your clever iPhone can produce the same hash code if it sees a similar composition.
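The idea above can be sketched in a few lines. This is a toy "difference hash" (dHash), not Apple's NeuralHash, and the hash values quoted in this thread are made up; but it shows the key property of any perceptual hash: an edited copy of an image (here, uniformly brightened) still produces the same compact code, while a genuinely different image produces a different one.

```python
def dhash(pixels):
    """Hash a 9x8 grayscale grid: each bit records whether a pixel is
    brighter than its right-hand neighbour (8 rows x 8 comparisons = 64 bits)."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

# A toy 9-wide, 8-tall "image" (brightness values 0-255).
image = [[(x * 28 + y * 11) % 256 for x in range(9)] for y in range(8)]

# The same scene, uniformly brightened: every pixel +10, clipped at 255.
brighter = [[min(p + 10, 255) for p in row] for row in image]

print(dhash(image) == dhash(brighter))  # prints True: the edit didn't change the hash
```

Because dHash only compares adjacent pixels, a uniform brightness change mostly cancels out. NeuralHash uses a trained neural network to get the same kind of robustness against crops, rotations, and recompression.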
 
We have already seen a lower adoption rate for iOS 15, guess why
The average iPhone customer knew/knows nothing about their CSAM initiative.

Most people haven't upgraded to iOS 15 because the phone isn't currently spamming them to do it. More iPhone users are like my mother than anyone here.
 
Nobody. The system identifies similar photos by neural hash. It seems a hard point to get across here that the matching is not done by an exact image comparison. It's a "perceptual hash". Computers don't really perceive anything as is commonly understood, but they're reasonably good at finding similar things, whatever they happen to be. Until things go wrong.
“Similar” meaning photos that have been cropped, rotated, grayscaled, resized, compressed, etc.

It does not search for arbitrary subject matter.
 
Apple should let people upload any photos they want to iCloud Photos, but just require the client (phone) to attach some sort of signature (voucher) that can verify whether the provided photo has a match in the CSAM database or not. That way no one is being spied on without their knowledge, it's opt-in, and Apple can still identify users who are attempting to upload CSAM to Apple's iCloud Photos servers.

This way Apple can be as effective at reporting service abuse as FB/Reddit/Dropbox/MS/etc. currently are. Can you imagine the uproar if Apple said they were just going to have their iCloud Photos servers continuously read and analyze every photo ever uploaded across all of their users? /s
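The voucher flow proposed above can be sketched roughly as follows. This is a heavy, hypothetical simplification: in Apple's published design the voucher is cryptographically blinded (private set intersection plus threshold secret sharing), so the server learns nothing from any single upload. Here the "voucher" is just the bare hash, using the made-up hash value from earlier in the thread, purely to show the basic client/server flow.

```python
# Server-side database of known CSAM perceptual hashes (a made-up value).
KNOWN_HASHES = {0x8F22D679CB113AE7406BB276C2F994}

def make_voucher(image_hash):
    """Client side: attach the photo's perceptual hash to the upload."""
    return {"photo": "<encrypted blob>", "voucher": image_hash}

def check_upload(upload):
    """Server side: flag the upload if its voucher matches the database."""
    return upload["voucher"] in KNOWN_HASHES

print(check_upload(make_voucher(0x8F22D679CB113AE7406BB276C2F994)))  # True
print(check_upload(make_voucher(0x123456)))                          # False
```

In the real design the matching is hidden behind cryptography precisely so that Apple cannot learn which photos matched until an account crosses a match threshold; the plain set lookup here is only for illustration.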

 
That is why I will not upgrade. We have already seen a lower adoption rate for iOS 15, guess why. And I hope it's not because the button has been moved or because people are not forced to upgrade. Of course, they could also inject this into iOS 14 at any time, but this is about making a statement to Apple.
I love my Apple devices, but Apple has grown seriously unappealing to me in recent months. Their culture of ignoring customers and their handling of mistakes is very bad for the brand. Why are they doing this?
LOL, of course the slower adoption is 100% because the phone barely tells users iOS 15 is even out. You have to go looking for it.
 
Can Apple even put this toothpaste back in the tube? Would Tim saying "we heard you and disabled this specific application of local device scanning and reporting, for now" regain trust? There is no possible way they would make a blanket statement of not scanning and reporting local device content because that covers telemetry, diagnostics and security (usage data, crash reports, malware scans).
 
I think Apple will skip this one. The risk for them is too great. It makes no sense. Why use the photo libraries of nearly a billion devices to catch a few criminals? Let the people who are paid by our taxes do their job without using our devices.
They’re using “protect the children” as a Trojan horse to get their foot in the door with this tech. It’s very intentional, and once they succeed, it will be used in countries in a number of ways to suppress freedom of speech and thought.
 
I consider both equally bad, honestly.
It's not even close for me: on-device scanning to report to the authorities is not acceptable, period. Cloud scanning, it's their servers and their responsibility, so I don't care, even if it's my photos they're scanning (or whatever else I put in the cloud). I can always encrypt something I don't want them to see, but I'd hardly bother -- if something's private, it never gets to the cloud. (Private being driver's license, passport, bank accounts, things like that.)
 
Companies don't have an obligation to look for CSAM (only to report it if found), because then they would be considered government agents and run afoul of the 4th Amendment. But no company wants to be the place for CSAM. That's why all the big cloud providers (other than Apple's iCloud Photos) already scan for CSAM.

Keep in mind that if you own a server, and there is CSAM on it, you are in possession of CSAM. They may be liable regardless.
 
Apple should let people upload any photos they want to iCloud Photos, but just require the client (phone) to attach some sort of signature (voucher) that can verify whether the provided photo has a match in the CSAM database or not. That way no one is being spied on without their knowledge, it's opt-in, and Apple can still identify users who are attempting to upload CSAM to Apple's iCloud Photos servers.

This way Apple can be as effective at reporting service abuse as FB/Reddit/Dropbox/MS/etc. currently are. Can you imagine the uproar if Apple said they were just going to have their iCloud Photos servers continuously read and analyze every photo ever uploaded across all of their users? /s

That's not opt-in; iCloud photo sharing defaults to on, so at best it's opt-out. Neither would be acceptable: it's still doing the comparison on my phone.
 
It can scan for whatever it wants. It's not that smart an AI. Let's say the Chinese Communist Party adds a photo hash of a pamphlet to their CSAM database, and Apple acquiesces by referencing that database. Or they sneakily manage to add an entry to a foreign power's CSAM hash database. Your lovely picture of a protest banner then identifies you as a pedophile.
Apple doesn’t “reference” any database. They distribute the reference hashes in the iOS image. If Apple did this, every security researcher in the world would instantly know about it, and it would happen everywhere in the world, not just in China.
 
We’ve all given up so much data already to tech corporations. Which the United States government kindly uses anytime they please. It’s funny that now people are suddenly questioning privacy. I’m sorry folks that ship sailed a long time ago.
We are taking it back! Who's with me!?
 
What part of this is NOT a good idea!!! You do realise, don't you, that your own precious cat, dog or family photo will not actually be looked at!!! It simply compares the hashes of your photos against already-identified hashes of indecent images of children!! It doesn't actually look for what it thinks are indecent images, just the hash!
The snooping part is NOT a good idea!!! It simply snoops on over a billion phones to catch a few perps. That's where it STARTS... Open your eyes; this only leads to less privacy and more snooping. This isn't about being worried about your dog or cat photos.
 
So cloud scanning is still OK according to the EFF?

You'd think they wouldn't like that either!

Scanning is scanning.

Whether your photos are on your device... or on a server somewhere... it's still scanning.

Device Scanning: "Don't violate my privacy!"
Cloud Scanning: "Come on in! Look at all my photos!"

:oops:

Scanning on YOUR device is the issue. You don’t own the datacenter where the cloud service is hosted. Don’t expect privacy on it.

If you store your stuff in someone else’s house, don’t be surprised if they go through it or even throw some of it away.
 
Scanning on YOUR device is the issue. You don’t own the datacenter where the cloud service is hosted. Don’t expect privacy on it.

If you store your stuff in someone else’s house, don’t be surprised if they go through it or even throw some of it away.

Gotcha. Thanks!

I was just wondering what the EFF thought about cloud services.
 
Strangely, this does make sense for the EFF. They are all about the ownership and control of the software that runs on the things we own. So software running on our phones that we can't control fits very much within the EFF's interests.

Cloud services are not really part of their interest - but there are plenty of other orgs who would be interested in that!
I think that cloud scanning is just too pervasive in tech. Every single company/device/service relies on cloud scanning, so the EFF knows that it cannot call for all cloud services to stop in the name of privacy. They have to draw the line at on-device scanning, and Apple seems to be the biggest target these days.
 
I think we should allow Apple to implement this software! What could go wrong? I believe in them. They will never be pressured by any government to scan our images. Just like they won't be pressured to create software that displays our personal medical information (inoculation) for strangers (businesses) to view. Or to comply when a judge rules they must unlock a person's cell phone due to a shooting. In case anyone disputes this: although Apple "drew the line" at assisting with unlocking the shooter's cell phone, Apple still provided "gigabytes of information” to investigators, including “iCloud backups, account information and transactional data for multiple accounts.”

Hopefully, if you've made it this far, you've recognized the sarcasm. Call it conspiracy, but the Government has been known to create a crisis just to take away our freedoms. All the Government has to do, and will do, is create a crisis once Apple has implemented this feature, to surveil us further. Do your research and look up "The Lawful Access to Encrypted Data Act". They WANT and NEED to get into our phones, and into our lives. It has nothing to do with safety, and everything to do with compliance and control.

Safety is for you and me to control, not a business, not the Government. All our safeties are located in the Constitution.

Continue to fight against this surveillance state!
I agree with you. Below is one of my favorite cartoons...

[attached cartoon: IMG_3475.JPG]
 