That is scanning the user's own photos, for the user's own benefit.

What they are proposing now is, for the first time ever, going through users' content on users' devices, looking for things that external third parties want.

It couldn't be more different from your example.

But a lot of people on MacRumors are against scanning on the device under any circumstances, which must include scanning for their own benefit.


So your view is "scanning on the device is OK if it benefits me". I have no problem with this point of view.
 
Federighi also revealed a number of new details about the system's safeguards, such as the fact that a user will need to hit around 30 matches for CSAM content in their Photos library before Apple is alerted, whereupon it will confirm whether those images appear to be genuine instances of CSAM. He also pointed out the security advantage of placing the matching process on the iPhone directly, rather than having it occur on iCloud's servers.
So what I'm hearing is that once I have 30 pictures of my baby, Apple will start looking at them taking baths? No thanks.
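For readers who want a concrete picture of the threshold mechanism described in the quote above, here is a minimal Swift sketch of threshold-based hash matching. The type and function names are illustrative only, not Apple's actual API, and a plain set of hash strings stands in for the encrypted voucher scheme Apple described; only the threshold idea is shown.

```swift
// Illustrative sketch only: escalate only once the number of matched
// hashes reaches a threshold, mirroring the ~30-match figure quoted above.
struct ThresholdMatcher {
    let knownHashes: Set<String>   // hashes of known CSAM images (the database)
    let threshold: Int             // e.g. around 30

    /// Counts how many of the library's image hashes appear in the database
    /// and returns true only when that count reaches the threshold.
    func shouldEscalate(_ imageHashes: [String]) -> Bool {
        let matches = imageHashes.filter { knownHashes.contains($0) }.count
        return matches >= threshold
    }
}

let matcher = ThresholdMatcher(knownHashes: ["aaa111", "bbb222"], threshold: 30)
print(matcher.shouldEscalate(["aaa111"]))   // false: a single match stays below the threshold
```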
 
No, you lack a basic understanding of logic. With absolutely ZERO evidence, you're just assuming this technology will be used for nefarious purposes. Why would Apple even announce anything and create this huge snow job if they intended to use this technology for "evil" purposes? They'd just sneak it in there without telling you.
I love this new age of "Listen to the science"-people who think that zero evidence proves anything. Apple is testing the waters here. If people were fine with this, maybe they'd also be fine with the next step etc. Don't be naive about this. If there is any business case in massive scale image analysis, this will go to the next step, provided Apple judges the negative consequences of this as manageable. People freaking out about this in unison here will hopefully do a small part in showing Apple what their customers think about this. You may think it's over the top, but it's like maintaining your weight: better start when you are 5 lbs over what you want to be rather than waiting until people make comments.
 
Slippery slope arguments are logical fallacies. You can deal with these issues separately. Apple should do what it can to stop perverts from hurting kids.
Wouldn't it be better if people stopped, and Apple concentrated on making iPhones instead of spyPhones?
And is there really anyone in the world stupid enough to upload kiddyporn to iCloud?
 
What I did not like in the Federighi statement is the lack of understanding for the concerns. His tone is only "you don't understand," not "I respect your different view and I see your point" or similar. Bottom line, it did not help Apple's position, in my view.

They insist on doing it; I insist that nothing happens on my phone. We will see how this ends. I'd say they will have to back down. This is escalating. Heads might roll.
 
The more you have to "explain" over and over some new controversial "feature", the more it begins to stink.

(And I don't even use iCloud to store anything important -- especially photos!)
Or, perhaps, the more you just have to explain to misinformed people what the reality is. Kind of like so much else in life right now.
 
Wouldn't it be better if people stopped, and Apple concentrated on making iPhones instead of spyPhones?
And is there really anyone in the world stupid enough to upload kiddyporn to iCloud?
No. You cannot expect or demand privacy when using a third-party server. Same as it has always been. This is not really a new concept. If you want privacy, use encrypted local storage. Simple, really.
 
For the life of me, I cannot understand why Tim Cook thought putting this capability on ANY iPhone was a good idea!

It would have been a much, much smarter decision to include it (OR support it) ONLY on their servers!
 
I love this new age of "Listen to the science"-people who think that zero evidence proves anything. Apple is testing the waters here. If people were fine with this, maybe they'd also be fine with the next step etc. Don't be naive about this. If there is any business case in massive scale image analysis, this will go to the next step, provided Apple judges the negative consequences of this as manageable. People freaking out about this in unison here will hopefully do a small part in showing Apple what their customers think about this. You may think it's over the top, but it's like maintaining your weight: better start when you are 5 lbs over what you want to be rather than waiting until people make comments.

You're not getting it. I'm not saying I can PROVE that Apple could not or would not ever abuse this technology (or any others in iOS, macOS, etc.). Obviously no one could ever prove something like that unless they are all-knowing. I'm saying you have no rational basis/evidence to assume or even suspect that they will abuse/misuse this technology. Just about any technology COULD be abused, but that's not a rational basis to promote the elimination of that technology.

Your weight analogy doesn't work, because it assumes there's something off/wrong with what Apple's doing here (like being 5lb. overweight is) without proving so. Now, if some confidential document between Apple and a government agency is leaked and verified as genuine that details plans on how they will use this technology to spy on citizens, THEN you'd have something. Until then, it's all in your imagination.
 
And case in point of why some people don't understand the system in place here and tend to oversimplify things. I'm not defending or justifying it, but this right here is the problem in understanding, if anyone cares to.

"Naughty" and "nice" isn't even close to what's going on here. I have plenty of the "naughty" variety, and you know why I won't get reported to the cops? Because they aren't illegal pictures.

C’mon man, stop with this juvenile misrepresentation of what’s going on here. Talk about how misinformation gets spread. Could be disinformation, who knows with some of you people here.

You haven't the slightest idea of what it's like to be oppressed, truly, as some people have, so, please, just give the melodramatics a rest.
Sorry, what part am I wrong about:

apple is going to use a software method to review my images

if my images = illegal notify government sanctioned organization

therefore my phone becomes a government surveillance device
 
It most certainly is a fallacy. You may pick and choose what counts as logic if you want. You're still wrong.
Breaking the law and child abuse are not "emotional issues" or appeals to emotion. This is illegal activity we are talking about, perpetrated by criminals. We live in a civil society, and Apple is part of that society. So, sorry, I'm not wrong. The perverts should just use local encrypted storage devices and not infect Apple's servers with such materials.
 
Clearly I'm not up to date with "scanning hash". What exactly does that mean? I'm assuming not hashtags?
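For context on the question above: a hash here is a short fingerprint derived from a file's contents, and matching is a lookup of that fingerprint against a list of fingerprints of already-known images, not an inspection of what a photo depicts. Below is a minimal Swift sketch; it uses an ordinary SHA-256 digest from CryptoKit purely for illustration, whereas Apple described a perceptual "NeuralHash" that tolerates resizing and re-encoding, which SHA-256 does not.

```swift
import CryptoKit
import Foundation

// Illustration only: SHA-256 stands in for the perceptual hash Apple described.
func fingerprint(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Matching is a set lookup against fingerprints of known images.
let knownFingerprints: Set<String> = ["<fingerprint of a known image>"]
let photoBytes = Data("example photo bytes".utf8)
print(knownFingerprints.contains(fingerprint(of: photoBytes)))   // false here
```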
 
This isn't new; it was explained on the day the story broke that it was on children's accounts, etc. But the first reaction of Americans is "But Mah Freedumbs!"
 
Sorry, what part am I wrong about:

apple is going to use a software method to review my images

if my images = illegal notify government sanctioned organization

therefore my phone becomes a government surveillance device
What you missed is the key part where you uploaded them to Apple's servers, which makes Apple legally responsible to report the images.
 
I just don't see how Apple will be able to resist complying with laws that governments will inevitably pass requiring them to expand this scanning to other things.

This pretty much says it all:
By making our phones run an algorithm that isn’t meant to serve us, but surveils us, it has crossed a line.
 
Are you just making this up as you go along? The bidding of third parties...doing this because someone told them to?

You must be a high level Apple employee and in the room when this was discussed.

OR...PERHAPS...Apple is 100% liable under US law for their servers hosting or transmitting child pornography and MUST report such activity. They are meeting this requirement by providing a more secure and private way of identifying it with this method, versus just scanning every single photo you have taken and uploaded to iCloud.

Every internet service is required to do this, either by scanning all photos or by forwarding user-submitted complaints/reports.

The fact that Apple has figured out a way to do this while still protecting their customers versus a mass grab of every image you upload is pretty impressive IMHO.

Clearly you've never heard of Section 230.
 
No. You cannot expect or demand privacy when using a third-party server. Same as it has always been. This is not really a new concept. If you want privacy, use encrypted local storage. Simple, really.
No one was demanding it. Apple pushed it as their business model, and it's a business model people paid for and that increased Apple's bottom line. Stop being foolish, please.
 