That is scanning the user's own photos, for the user's own benefit. What they are proposing now is, for the first time ever, going through users' content on users' devices, looking for things that external third parties want. It couldn't be more different from your example.
> Federighi also revealed a number of new details around the system's safeguards, such as the fact that a user will need to meet around 30 matches for CSAM content in their Photos library before Apple is alerted, whereupon it will confirm if those images appear to be genuine instances of CSAM. He also pointed out the security advantage of placing the matching process on the iPhone directly, rather than it occurring on iCloud's servers.

So what I'm hearing is that once I have 30 pictures of my baby, Apple will start looking at them taking baths? No thanks.
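For anyone trying to picture the threshold mechanic described in that quote, here is a minimal sketch of the idea, and only the idea: photos are hashed on-device, compared against a database of known hashes, and nothing is surfaced for review until roughly 30 matches accumulate. None of this is Apple's actual code; `imageHash`, `knownCSAMHashes`, and `matchThreshold` are invented names, and the real system reportedly uses a perceptual hash rather than the SHA-256 stand-in below.

```swift
import Foundation
import CryptoKit

// Illustrative only -- not Apple's code. The names here are invented.
let matchThreshold = 30  // "around 30 matches" per the article

// Stand-in hash. Apple's system reportedly uses a perceptual hash that
// also matches near-duplicates; SHA-256 only matches byte-identical files.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Count how many photos match the known-hash database.
func matchCount(photos: [Data], knownCSAMHashes: Set<String>) -> Int {
    photos.filter { knownCSAMHashes.contains(imageHash($0)) }.count
}

// Only at or above the threshold would anything be surfaced for review;
// below it, per the article, Apple is never alerted at all.
func shouldAlertForReview(photos: [Data], knownCSAMHashes: Set<String>) -> Bool {
    matchCount(photos: photos, knownCSAMHashes: knownCSAMHashes) >= matchThreshold
}
```

The point the sketch makes concrete: in this design, nothing inspects what a photo depicts. A photo that isn't in the known-hash database can never contribute a match, no matter how many of them you have.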
> No, you lack a basic understanding of logic. With absolutely ZERO evidence, you're just assuming this technology will be used for nefarious purposes. Why would Apple even announce anything and create this huge snow job if they intended to use this technology for "evil" purposes? They'd just sneak it in there without telling you.

I love this new age of "Listen to the science" people who think that zero evidence proves anything. Apple is testing the waters here. If people were fine with this, maybe they'd also be fine with the next step, and so on. Don't be naive about this. If there is any business case in massive-scale image analysis, this will go to the next step, provided Apple judges the negative consequences as manageable. People freaking out about this in unison here will hopefully do a small part in showing Apple what their customers think. You may think it's over the top, but it's like maintaining your weight: better to start when you are 5 lbs over what you want to be rather than waiting until people make comments.
> Slippery slope arguments are logical fallacies. You can deal with these issues separately. Apple should do what it can to stop perverts from hurting kids.

Wouldn't it be better if people stopped, and Apple concentrated on making iPhones instead of spyPhones?
"Won't somebody think of the children?" is an Appeal to Emotion logical fallacy.Slippery slope arguments are logical fallacies. You can deal with these issues separately. Apple should do what it can to stop perverts from hurting kids.
> The more you have to "explain" over and over some new controversial "feature", the more it begins to stink.

Or, perhaps, the more you just have to explain to misinformed people what the reality is. Kind of like so much else in life right now.
(And I don't even use iCloud to store anything important -- especially photos!)
> "Won't somebody think of the children?" is an Appeal to Emotion logical fallacy.

It isn't, but you believe what you want.
> Wouldn't it be better if people stopped, and Apple concentrated on making iPhones instead of spyPhones?

No. You cannot expect or demand privacy when using a third-party server. Same as it has always been. This is not really a new concept. If you want privacy, use encrypted local storage. Simple, really.
And is there really anyone in the world stupid enough to upload kiddyporn to iCloud?
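Since "use encrypted local storage" is the recurring advice in this thread, here is a minimal sketch of what that can look like on Apple platforms with CryptoKit's AES-GCM. It's an illustration, not a hardened recipe: a real app would keep the key in the Keychain rather than in a variable.

```swift
import Foundation
import CryptoKit

// Minimal local-encryption sketch using AES-GCM from CryptoKit.
// Illustrative only: in real use the key must be stored securely
// (e.g., in the Keychain), not held in a variable.
let key = SymmetricKey(size: .bits256)

func encryptLocally(_ plaintext: Data, with key: SymmetricKey) throws -> Data {
    let sealed = try AES.GCM.seal(plaintext, using: key)
    // `combined` packs nonce + ciphertext + auth tag into one blob;
    // it is always non-nil with the default nonce size.
    return sealed.combined!
}

func decryptLocally(_ blob: Data, with key: SymmetricKey) throws -> Data {
    let sealed = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(sealed, using: key)
}

// Usage: encrypt a photo's bytes before writing them to disk.
do {
    let photoBytes = Data("example image bytes".utf8)
    let encrypted = try encryptLocally(photoBytes, with: key)
    let roundTrip = try decryptLocally(encrypted, with: key)
    assert(roundTrip == photoBytes)
} catch {
    print("Crypto error: \(error)")
}
```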
Slippery slope arguments are logical fallacies. You can deal with these issues separately. Apple should do what it can to stop perverts from hurting kids.
> It isn't, but you believe what you want.

It most certainly is a fallacy. You may pick and choose what counts as logic if you want. You're still wrong.
> And case in point of why some people don't understand the system in place here and tend to oversimplify things. I'm not defending or justifying it, but this right here is the problem in understanding, if anyone cares to.
>
> "Naughty" and "nice" isn't even close to what's going on here. I have plenty of the "naughty" variety, and you know why I won't get reported to the cops? Because they aren't illegal pictures.
>
> C'mon man, stop with this juvenile misrepresentation of what's going on here. Talk about how misinformation gets spread. Could be disinformation, who knows with some of you people here.
>
> You haven't the slightest idea of what it's like to be oppressed, truly, as some people have, so please just give the melodramatics a rest.

Sorry, what part am I wrong about?
> It most certainly is a fallacy. You may pick and choose what counts as logic if you want. You're still wrong.

Breaking the law and child abuse are not "emotional issues" or appeals to emotion. This is illegal activity we are talking about, perpetrated by criminals. We live in a civil society, and Apple is part of that society. So, sorry, I'm not wrong. The perverts should just use local encrypted storage devices and not infect Apple's servers with such materials.
> So what I'm hearing is that once I have 30 pictures of my baby, Apple will start looking at them taking baths? No thanks.

...and there it is... 🤣 🤣 🤣
> Sorry, what part am I wrong about?

What you missed is the key section where you uploaded it to Apple's servers, which makes them legally responsible to report the images.
Apple is going to use a software method to review my images.
If my images = illegal, notify a government-sanctioned organization.
Therefore, my phone becomes a government surveillance device.

By making our phones run an algorithm that isn't meant to serve us but to surveil us, Apple has crossed a line.
Are you just making this up as you go along? The bidding of third parties...doing this because someone told them to?
You must be a high level Apple employee and in the room when this was discussed.
OR...PERHAPS...Apple is 100% liable under US law for their servers hosting or transmitting child pornography and MUST report such activity. They are meeting this requirement by providing a more secure and private way of identifying it with this method, versus just scanning every single photo you have taken and uploaded to iCloud.
Every internet service is required to do this, either by scanning all photos or by forwarding user-submitted complaints/reports.
The fact that Apple has figured out a way to do this while still protecting their customers versus a mass grab of every image you upload is pretty impressive IMHO.
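A rough sketch of the contrast being drawn in the last two comments may help. Everything below is invented pseudo-API (`SafetyVoucher`, `serverThresholdCheck`, and so on), not Apple's protocol; in the real design the vouchers are reportedly cryptographic, so the server cannot even see which ones are matches until the threshold is crossed, something plain code can only gesture at with a comment.

```swift
import Foundation

// Two ways a provider could meet the reporting requirement. Both are
// sketches with invented names, not Apple's actual code or protocol.

// (a) Server-side scanning: the provider inspects every uploaded photo.
//     Simple, but the server sees all content, match or not.
func serverSideScan(uploadedPhotos: [Data],
                    isKnownCSAM: (Data) -> Bool) -> [Data] {
    uploadedPhotos.filter(isKnownCSAM)  // the server reads everything
}

// (b) On-device matching: each photo leaves the device only as an opaque
//     voucher. In the reported design the vouchers are cryptographic and
//     undecryptable by the server until ~30 match vouchers exist.
struct SafetyVoucher {
    let encryptedPayload: Data  // opaque to the server below the threshold
    let isMatch: Bool           // hidden in the real protocol; exposed here
                                // only to keep the sketch short
}

func serverThresholdCheck(vouchers: [SafetyVoucher], threshold: Int = 30) -> Bool {
    // The server learns at most "threshold crossed or not," never the
    // photos themselves -- the point made in the comment above.
    vouchers.filter(\.isMatch).count >= threshold
}
```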
> No. You cannot expect or demand privacy when using a third-party server. Same as it has always been. This is not really a new concept. If you want privacy, use encrypted local storage. Simple, really.

No one was demanding it. Apple pushed it as their business model, and it's a business model people paid for, one that increased Apple's bottom line. Stop being foolish, please.