EXACTLY!

This has ZERO relation to what corrupt governments or those with dubious human-rights/censorship tendencies (such as China) may WANT to do. This is something APPLE (who, as far as I am aware, for all its faults, is trying to do something genuinely GOOD here with good intentions) is instigating for its own systems. They wouldn't even unlock phones for the FBI, for Pete's sake, so they're not going to just open up their systems for China (or other nefarious governments) to do 'evil' things. People in the USA in particular seem to live in a perpetual state of fear and mistrust. It's what helped Trump rise to power and continue to hold sway with millions of delusional followers.
The government is the curator of the “naughty image” database…

This most certainly will become corrupt and politicized.
 
  • Like
Reactions: peanuts_of_pathos
It's not doing any machine learning on your photos. It's checking photo "hashes" against a database of known "hashes" that are provided by the agencies.

For example, if they come across an image, they will add its hash to the database; it will then be cross-checked against people's phones to see if they also have the exact same image.

The problem is it’s not a 1:1 hash. It can find a match on a manipulated picture. So how are false positives guaranteed to be so low with that amount of fuzzy matching?
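The fuzzy matching being described is perceptual hashing. As a toy sketch (this is a simple "average hash", nothing like Apple's actual NeuralHash), an edited copy of an image keeps the same perceptual hash even though its cryptographic hash changes completely — which is exactly what makes the matching robust and the false-positive question fair:

```python
# Toy illustration of perceptual vs cryptographic hashing.
# The 8x8 "images" and the average-hash scheme are made up for
# demonstration; real systems are far more sophisticated.
import hashlib

def average_hash(pixels):
    """64-bit hash: each bit records whether a pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# An 8x8 grayscale "image" and a slightly brightened copy of it.
img = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
edited = [[min(255, p + 10) for p in row] for row in img]

# Cryptographic hashes of the raw pixels differ completely after the edit...
print(hashlib.sha256(bytes(sum(img, []))).hexdigest()[:12])
print(hashlib.sha256(bytes(sum(edited, []))).hexdigest()[:12])

# ...but the perceptual hashes stay close (Hamming distance 0 here),
# so the edited copy still "matches" the original.
print(hamming(average_hash(img), average_hash(edited)))
```

The matching question then becomes "how many bits may differ before we call it a match", and that threshold is where the false-positive trade-off lives.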
 
I don't follow this slippery slope thought process. He is essentially saying he doesn't trust Apple not to abuse this functionality. You could literally apply this to anything if you believe Apple has ill-intentions.

If we believe Apple has some master plan to abuse this, then this opens up every part of their ecosystem to the same "it could be abused if they want to" statement.
Read the 4th amendment, then apply the concept of having the ability to do this type of activity. It isn't just Apple (the collective) that may or may not have ill-intentions, but others as well. This is a specific type of warrantless inspection capability that does not apply to other parts of their ecosystem. Slippery slope is a very fair and valid argument here.
 
  • Love
Reactions: peanuts_of_pathos
This is where privacy and free speech die. They have the right (initial) reason but are doing the absolute wrong thing. Apple has demonstrated its willingness to bow to governments before, and they will eventually bow to governments on this one. They will expand their filter to search for anything the government wants, and I see this being especially used in China and other communist states that have a heavy-handed approach to censorship. This kind of backward move in privacy may be the reason I abandon Apple.
 
I own my device. Not Apple. The pictures/metadata/hash whatever it may be, is mine. Not Apple's. I own it. What gives them the right to install something on the device I own and scan and upload the data I own without my consent?

I applaud Apple for going after child predators, but this isn't the way to do it. I bought into Apple products because they sold me on privacy and security. How can you sell privacy and security now by taking data that doesn't belong to you in the first place and scanning it for a possible match in a database somewhere, where it could possibly be flagged and looked at by someone else?

Like everyone else on here, I have nothing to hide. However the pictures/data on my device are just that, mine. I choose who I want to share information with, not Apple. What will be next, scanning texts/emails for phrases that could be deemed hurtful?

It would be different if Apple said we have this feature aimed at fighting child exploitation, etc.. and give us the option to help combat that.
 
What exactly is the big picture? NOTHING mechanics-wise is changing. iCloud photos were unencrypted before this and remain unencrypted after it.
Apple scanned your iCloud data, on upload and on its servers, both before and after this.
Apple shared iCloud data with police before and after this.
Actually it's a massive sea change. It's the first time a major device manufacturer is installing software on users' devices that has the sole purpose of searching for blacklisted material and reporting you if it thinks it finds any. It negates all forms of end-to-end encryption since they can simply scan your information on the device before encryption. There is no accountability and no insight into what exactly is included in the hashed database of blacklisted material that is downloaded to your phone. It's designed so that you cannot know when the algorithm has decided that you are a criminal and reports you.

It is not at all comparable to companies scanning content on their servers that users have voluntarily uploaded after agreeing to terms of service. They are now reaching into *your* personal device whether you like it or not.
 
That's a rather big stretch.

I'll have a stab at this line of thinking.

"If Apple can scan my photos using AI to find cats, dogs and faces, others can do it too, that means no privacy"
This is exactly what Snowden is saying too. And it's true, whether you agree with his previous actions or not.

First of all, Apple shouldn't do this without some form of clear user permission. And let's not forget this extreme policy comes from a company that was promoting its strong user privacy policies, even refusing to unlock a suspected criminal's phone for the FBI. So what happened now? Privacy stunts don't sell anymore? A new marketing stunt needed?

Who knows, but for many people this privacy protection was the main reason for buying iPhones, and Apple is now simply taking it away, in a very aggressive and extreme form, like no other!
 
Wrong. The photos lose their encryption when iCloud is enabled. They don't open a back door and scan your photos; YOU as a user remove the encryption from those photos when you choose iCloud Photos. Why is this so hard to understand??
Our arguments are not dependent on encryption…. Why are yours?
 
  • Like
Reactions: peanuts_of_pathos
Like everyone else on here, I have nothing to hide. However the pictures/data on my device are just that, mine. I choose who I want to share information with, not Apple. What will be next, scanning texts/emails for phrases that could be deemed hurtful?
Anything that may piss off JJ Abrams should be a red flag.
 
Do you have any idea how the database is updated, or how often? What would stop law enforcement from adding photos to it immediately after a case? Sure, if those individuals are stupid enough, they deserve to get caught. That still doesn't invalidate the huge potential for overreach.

As for the second point, I would not believe 1 in a trillion is the real number. It's got to be much smaller. Also, what trillion are they referring to? Photos? Users? Either way, it's vague, it's unclear how it is designed, and it leaves plenty of room for closed-door manipulation.

What you should realise is that this will be an ongoing issue going forward, and the situation will change and evolve rather than standing still. What might not happen NOW might happen LATER.

The point is the accuracy is extremely high, no matter how you slice it. You and others are acting like there will be false positives left and right and on top of that, people will be thrown in jail for innocent images, LOL! Come on, let's get real please.

Again, I understand people's concerns about the bigger picture, but I DON'T understand why people feel the need to twist the facts regarding the details of this "smaller" picture being discussed now. Let's be fair and not act like every detail of what Apple is doing here is somehow suspect and sinister. You can admit that without supporting the general principle.
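For what it's worth, the "1 in a trillion" figure is stated per account per year and depends on requiring a threshold of matches before anything is reviewed. A rough sketch of why a threshold drives the account-level rate down so quickly, using made-up per-image numbers (not Apple's actual parameters):

```python
# Rough sketch: how requiring a threshold of matches collapses the
# per-account false-positive rate. n and p are made-up illustrative
# numbers, NOT Apple's actual parameters.
from math import comb

def prob_at_least(n, t, p):
    """P(at least t false matches among n photos), assuming independence."""
    return 1 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t))

n = 10_000   # photos uploaded per year (assumed)
p = 1e-6     # per-image false-match probability (assumed)

for t in (1, 5, 10):
    print(f"threshold {t:>2}: {prob_at_least(n, t, p):.3g}")
# With these numbers the rate falls from roughly 1% at threshold 1 to
# effectively zero (below double precision) by threshold 10.
```

The point of the sketch is only that the account-level number is a design choice driven by the threshold, not a raw property of the hash — which is why "what trillion?" is a fair question.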
 
I own my device. Not Apple. The pictures/metadata/hash whatever it may be, is mine. Not Apple's. I own it. What gives them the right to install something on the device I own and scan and upload the data I own without my consent?

They scan them just a second before you bring those pics into their home.
And for sure they will ask for your consent when you update to iOS 15.
Don’t upload your pics to their servers and they won’t scan them.

I’m not saying there’s nothing concerning about this, but these are the facts.
It’s a bouncer guarding Apple’s club in the cloud.
Just don’t go there.
 
The point is the accuracy is extremely high, no matter how you slice it. You and others are acting like there will be false positives left and right and on top of that, people will be thrown in jail for innocent images, LOL! Come on, let's get real please.

Again, I understand people's concerns about the bigger picture, but I DON'T understand why people feel the need to twist the facts regarding the details of this "smaller" picture being discussed now. Let's be fair and not act like every detail of what Apple is doing here is somehow suspect and sinister. You can admit that without supporting the general principle.

I’m not at all worried about false positives.

I am worried about my cartoon frog memes, election anomaly pics, and hunter Biden scandal explainers.
 
  • Love
Reactions: peanuts_of_pathos
This deserves a re-post:

[attached image]


Things are different now.
(Not for me. If they really do this, no more iPhones for me after 12 years, simple as that.)
 
They scan them just a second before you bring those pics into their home.
And for sure they will ask for your consent when you update to iOS 15.
Don’t upload your pics to their servers and they won’t scan them.

I’m not saying there’s nothing concerning about this, but these are the facts.
It’s a bouncer guarding Apple’s club in the cloud.
Just don’t go there.
How about they just don’t scan them?
 
  • Like
Reactions: peanuts_of_pathos
If I was an entertainment industry lawyer, I’d be salivating over the potential to pressure the government to expand this technology to sniffing out copyright offenders….

this prompted me to google “is it safe to upload pirated stuff to dropbox”

results from 2014


looks like nothing new under the sun
 
this prompted me to google “is it safe to upload pirated stuff to dropbox”

results from 2014


looks like nothing new under the sun
Apple isn’t Google. This is new ground being turned. We expect Apple to tell the feds to get a damn warrant.
 
So every family photo of the kids at the beach/pool or running around the house in only their diaper are going to get flagged and the feds knock on your door?
 
I think people really haven't bothered to read much around this. Just the headline, then cracked their knuckles and went full tinfoil hat keyboard warrior.

This image is on the main page of MacRumors.

[Image: apple-csam-flow-chart.jpg]

It has been stated that it’s not a true 1:1 hash. People aren’t thinking about that. It’s been stated it will detect crops, color adjustments, warps, transforms, rotations and more; therefore, it’s not a true 1:1 hash comparison. So how can they guarantee a similar photo with an adult won’t get flagged? What if that 24-year-old looks 15? What happens then? I know many people who look like they’re in their teens. Even under manual review that could be an issue. This isn’t as simple as a hash match if they can also tell you manipulated the picture, which would produce a different cryptographic hash but will still match.
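To make the worry concrete: with a toy perceptual hash (a simple "average hash", far less discriminating than anything Apple would actually use), two images with different pixel values but the same brightness pattern collide exactly — a "match" between images that are not the same picture:

```python
# Toy perceptual-hash collision. The images and hashing scheme are
# made up for illustration; real perceptual hashes are much harder
# to collide, but the same class of collision drives the concern.
def average_hash(pixels):
    """64-bit hash: each bit records whether a pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

img_a = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
img_b = [[p // 2 for p in row] for row in img_a]  # a different image

print(img_a == img_b)                              # False: different pixels
print(average_hash(img_a) == average_hash(img_b))  # True: identical hash
```

Halving every pixel changes the image but preserves which pixels sit above the mean, so the hashes are bit-for-bit identical. How often that kind of coincidence happens with a real hash is exactly the false-positive question.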
 
So every family photo of the kids at the beach/pool or running around the house in only their diaper are going to get flagged and the feds knock on your door?
Actually, all of your pictures.
They are just selling it as CP
 
Always a curious thing when people worry about what "could" happen, not what is happening.
 