But in the release notes I read on this website yesterday, one of the new features Apple has introduced in 15.2 is:

In iOS 15.2, Apple is enabling Communication Safety in Messages for children. The feature is designed to scan incoming message images on children's devices for nudity and warn them that such photos might be harmful.

so it looks to me like they are still going to scan people's photos to look for content that Apple deems unacceptable.

To me, that is unacceptable. Stop scanning people's private data. Period.
That’s not the same thing and works entirely differently.
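
Communication Safety runs an on-device machine-learning classifier over a new incoming image and, if it looks like nudity, blurs it and warns the child locally; nothing is compared against a database and nothing is sent to Apple. The shelved CSAM detection feature instead checked a perceptual hash of each photo queued for iCloud upload against hashes of known, already-catalogued abuse imagery. A minimal sketch of that conceptual difference, with invented function names and thresholds rather than Apple's actual APIs:

```swift
import Foundation

// Toy sketch only: the classifier and hash function are passed in as plain
// closures because Apple's real implementations are not public APIs.

/// Communication Safety: a local ML classifier scores a *new* incoming image.
/// If it looks like nudity, Messages blurs it and warns the child.
/// No database lookup, and nothing leaves the device.
func shouldBlurAndWarn(_ image: Data, nudityScore: (Data) -> Double) -> Bool {
    nudityScore(image) > 0.9          // hypothetical confidence threshold
}

/// CSAM detection (the shelved feature): a perceptual hash of a photo queued
/// for iCloud upload is checked for membership in a fixed set of hashes of
/// known, previously catalogued abuse images. The content itself is never
/// classified; only near-duplicates of catalogued images can match.
func matchesKnownImage(_ image: Data,
                       knownHashes: Set<String>,
                       perceptualHash: (Data) -> String) -> Bool {
    knownHashes.contains(perceptualHash(image))
}
```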
 
I think the concerns from cybersecurity specialists over the possibility that state-actor hackers could override the limits Apple imposed on the hash analysis and look for things far beyond what the system was originally intended for are why Apple may just drop the whole idea. Especially now that both China and Russia possess very advanced supercomputing facilities.
 
Apple probably saw the need for this feature based on direct threats from Republican senators in congressional hearings.
I strongly doubt that. Apple likely saw a marketing opportunity to make themselves out as the "good guys" acting against universally despised people. They were not expecting the backlash they got and are now trying to fix the PR issues. There is no actual legal need for any of this, and Apple will find any threats made by Republicans risible.
 
They don't have to scan things in the cloud. This was discussed to death when this came up. It's a choice, and a bad one.

True, but Republican senator(s) indirectly threatened Apple with harsh requirements to do it in a Congressional hearing.
 
I strongly doubt that. Apple likely saw a marketing opportunity to make themselves out as the "good guys" acting against universally despised people. They were not expecting the backlash they got and are now trying to fix the PR issues. There is no actual legal need for any of this, and Apple will find any threats made by Republicans risible.

U.S. senators threaten Facebook, Apple with encryption regulation​

“You’re going to find a way to do this or we’re going to go do it for you,” said Senator Lindsey Graham. “We’re not going to live in a world where a bunch of child abusers have a safe haven to practice their craft. Period. End of discussion.”


This was in December 2019 and the timeline fits very well with Apple starting on the CSAM Detection system in early 2020.
 
Honest question, though.
Regarding this on-device CSAM detection scanning: will they alert the FBI when they find a parent with a picture of their child taking a bath?
 
Honest question, though.
Regarding this on-device CSAM detection scanning: will they alert the FBI when they find a parent with a picture of their child taking a bath?
A picture of a parent giving their kid a bath isn’t going to be in the database.
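
To make that concrete, here's a toy illustration of why a freshly taken family photo can't trip the match: detection is set membership against hashes of specific images already in the catalogue, not a judgment about what the photo depicts. Every value and helper below is made up.

```swift
import Foundation

// All hash strings below are fabricated placeholders, and hashValue stands in
// for a real perceptual hash; the point is only the set-membership logic.
let knownCSAMHashes: Set<String> = ["a1f93c2d", "7be4029f"]   // fixed, finite catalogue

func toyHash(_ photo: Data) -> String {
    String(photo.hashValue, radix: 16)
}

let bathPhoto = Data("brand-new photo taken at home".utf8)
let flagged = knownCSAMHashes.contains(toyHash(bathPhoto))
print(flagged)   // false: a never-before-seen photo has no catalogued hash to match
```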
 
All I hear are people who have photos they shouldn’t have, complaining that they shouldn’t be caught with illegal images. This thread would make Josh Duggar happy!

Hope Apple gets this enabled soon!
I know, right? If they implemented a system to perform a content match on known “Yummy Egg Salad Sandwiches” images (call it the YESS system), then I would ABSOLUTELY have a need to be concerned. Because I’ve got several of those on my phone, likely enough to trigger a flag, and I’d definitely be reported.

The content matching they’re talking about here? No worries. :) Next up, I don’t want the government having my number in their Amber alert system. I DIDN’T TAKE NO KIDS!!
 
Better: this was bound to fail from the start. You'd only need one bad actor feeding Apple's system the wrong hashes and everyone becomes a potential suspect over whatever that bad actor wants to silence: criticism, dissent, protestors in Hong Kong, LGBT minorities in certain regions, you name it. Also, as an EU citizen, I'm glad, as the system Apple proposed wouldn't have been allowed here anyway because of the strong protections in our GDPR privacy laws.
You clearly do not understand how the system works if you think that attack would succeed. I really wish folks would read the public material Apple has already published on how it works and the checks it uses to prevent exactly what you are mindlessly spouting.

Don’t just read the headlines. Really read the whole article and if it has holes then speak to those.
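
For reference, Apple's published threat-model review described, among other safeguards, an initial match threshold of around 30 images before any human review, and a rule that a hash only ships in the on-device database if it was provided by at least two child-safety organizations in separate jurisdictions. A rough sketch of those two checks, with names and data shapes of my own invention rather than Apple's code:

```swift
import Foundation

let matchThreshold = 30   // the threat-model review cited an initial threshold of about 30 matches

/// Only hashes vouched for by at least two child-safety organizations in
/// separate jurisdictions ship in the on-device database, so a single
/// compromised or coerced source can't inject arbitrary targets.
func buildShippedDatabase(from sources: [Set<String>]) -> Set<String> {
    var counts: [String: Int] = [:]
    for source in sources {
        for hash in source { counts[hash, default: 0] += 1 }
    }
    return Set(counts.filter { $0.value >= 2 }.keys)
}

/// Nothing is surfaced for human review until an account's match count
/// crosses the threshold; isolated matches reveal nothing on their own.
func shouldEscalateForReview(matchCount: Int) -> Bool {
    matchCount >= matchThreshold
}
```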
 

Apple has already admitted to scanning email, and stated they don't want CSAM on their servers. It's not a big leap of logic that, if client-side CSAM detection isn't implemented, they'll just do what every other cloud provider has done for years and scan iPhoto images on the server (if they aren't already).
An interesting article, and possibly/likely true. It is important to remember that this is an article in which someone states that Apple told them this; it is not a statement from Apple directly. Do you have anything from Apple that states this? Apple will sometimes dispute claims in the press that are false, but often they do not address them either.
 
You really think people who are doing something illegal are going to use a cloud service? Some crooks are stupid, but this wouldn't help.

Would you be OK if the government put a camera in your house? For the sake of the children…
Putting a camera in my home and what Apple wants to do are 100% different things. Either you are dumbing down your analogy to try and win an argument or you really don't understand how Apple's system is supposed to work.
 
Putting a camera in my home and what Apple wants to do are 100% different things. Either you are dumbing down your analogy to try and win an argument or you really don't understand how Apple's system is supposed to work.
It isn't as far off an analogy as you imagine. Client-side scanning is the equivalent of them looking at something on your property instead of waiting until it is on their property, as with server-side scanning.
 
I think the concerns from cybersecurity specialists over the possibility that state-actor hackers could override the limits Apple imposed on the hash analysis and look for things far beyond what the system was originally intended for are why Apple may just drop the whole idea. Especially now that both China and Russia possess very advanced supercomputing facilities.
State hackers can just build their own spyware. They are doing that right now, by the way. This fear you guys have about CSAM detection is laughable.
 
Why are you making such an "enlightened" claim that Apple is already betraying its customers? Your personal gut feelings are of no interest to anyone here; feel free to bring facts.
If you end up like Julian Assange, you have my sympathy.

Courageous freedom lovers don't always have to win, but courage is something exhausting, strange, rare and great in mankind.

He said they scan server-side rather than client-side. This is already well known, and they aren't the only ones who do so. Google, Dropbox, et al. have been scanning server-side for years.
 
I hope Apple brings it back. We need to stop pedophiles and child sex abuse. Children need to be protected.
Then, by all means, protect your own kids. Don't subject everyone else to government overreach and intrusion simply because you can't be bothered to watch over your flesh and blood effectively. That's literally half the job description of a parent.
 