Every time MR posts one of these articles, it seems to get buried quickly, where you have to scroll through quite a few articles to find it. Can we pin one to the top? This is a really big deal and the conversation should continue with high visibility.
What the hell are you talking about?

Just remember, it is partial content scanning. What does that mean? It means in this case that the code is running an algorithm that looks for specific similarities in a subset of the total picture, matching against a code or set of codes.
Now why does that sound just like facial recognition? Think before you react. I did not say it WAS facial recognition; I said it is nearly the same algorithm. Now think about how well Apple's facial recognition does NOT work.
1) This is so near facial recognition that there is no way to stop it from being applied to that end without users ever being aware. In fact, I'll bet the government is forcing Apple to do this and Apple is trying to make a positive out of it before they get caught.
Next it will be, "we are only looking for terrorists." And there is the problem, a terrorist is anyone the government does not currently like. Why? Because the government has discovered (thanks to President Bush and the Patriot Act) they don't have to follow any laws as long as the person in question is classified as a terrorist. A terrorist can now be held without due process and without human rights.
2) Remember how well Apple's facial recognition does NOT work. Sure, it's fine for family photos, where it doesn't matter if it's wrong.
But do you really want SWAT showing up at people's homes based on Apple's picture recognition AI? (And I know someone is supposed to review it, so don't get me started on the problems with that.)
Come on, get real. Apple's spell checker is awful. Apple's Siri AI is awful. Apple is great at providing tools, but not in the actual end implementation.
To get that right Apple will have to port over code from the CIA, FBI, NSA, etc. If they have not already.
The software scans your photos on-device, but doesn't call home to Apple.
Now, if the software notices a unique identifier (or enough unique identifiers) of known child porn, it sends a message to Apple "hey, better have a real person take a look at these."
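The match-then-flag flow described above can be sketched in a few lines. This is NOT Apple's NeuralHash or its actual protocol; it is a toy average-hash over a tiny grayscale image, used only to illustrate the general idea of comparing an image's hash against a set of known identifiers and flagging near-matches for human review. All names and thresholds here are made up for illustration.

```python
# Toy sketch of "match against known identifiers, then flag for review".
# NOT Apple's NeuralHash; a simple average-hash for illustration only.

def average_hash(pixels):
    """Return a tuple of bits: 1 where a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

def flag_for_review(image_hash, known_hashes, threshold=2):
    """Flag when the hash is within `threshold` bits of any known hash."""
    return any(hamming(image_hash, k) <= threshold for k in known_hashes)

# 4x4 grayscale "images" (0-255). `similar` is a slightly altered copy
# of `known`; `unrelated` is a different picture entirely.
known = [[200, 10, 200, 10], [10, 200, 10, 200],
         [200, 10, 200, 10], [10, 200, 10, 200]]
similar = [[190, 20, 200, 10], [10, 200, 10, 200],
           [200, 10, 200, 10], [10, 200, 15, 200]]
unrelated = [[50, 60, 55, 52], [200, 210, 205, 220],
             [48, 51, 49, 53], [199, 202, 215, 208]]

known_hashes = {average_hash(known)}
print(flag_for_review(average_hash(similar), known_hashes))    # True
print(flag_for_review(average_hash(unrelated), known_hashes))  # False
```

The point of a perceptual hash (as opposed to a cryptographic one) is exactly this tolerance: a lightly edited copy still lands within the threshold, while an unrelated image does not.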
Yup, and the majority of those doing so have an established history of railing against Samsung and Google for "invading their privacy," but magically when Apple does it, it's apparently not just OK but actually a good thing.

The shilling being done by some here is absolutely pathetic.
On device scanning is an invasion of privacy. Period.
Thing is, as sick as CP is, it's just another thing that is illegal. There are lots of things in this world that are illegal. Animal abuse is illegal, dealing drugs is illegal, trespassing is illegal... Why aren't we scanning for that type of stuff too, if we're just looking at doing the morally right thing here? I can understand the suspicion.
I'm not convinced by your two arguments: you should be able to decide which applications download images directly to your photo roll, and you'll need to bring me evidence about how an iPhone could be more easily hacked than iCloud.
And your first point would be equally bad whether it's client- or server-side anyway.
My real issue with this outrage is: if client-side scanning is not OK, server-side scanning is not OK either (a lesser evil is still evil). Both are fugly if they can't be peer reviewed in any way.
I understand people being skeptical, to say the least, about Apple's approach, but I have yet to find voices getting angry about the status quo that is server-side scanning.
As a former 13yo gay kid, I would be horrified that my parents would be notified of anything I wanted to research in regards to my developing sexuality in private, especially considering such a disclosure would have easily resulted in me being sent to a correctional camp.

As a member of the LGBTQ community, I am dismayed by opponents of the parental controls in iMessage because of a presumed disparate impact on homosexual youth. The parental notification of potentially explicit imagery viewed on kids' devices is only for kids under 13. Parents of all kids, including LGBT kids, deserve to protect these young kids from potentially being targeted by child predators.
Further, LGBT kids in repressive households or communities are MORE susceptible to grooming by child predators. We shouldn't limit parental controls because there are some parents with unhealthy parenting skills. Those parents could just take their kids' devices and check all the messages regardless.
The photos are going onto iCloud anyway… so it won't be only just on your device.

Because it's on your device. Get off my device.
It does seem hypocritical.
A web hosting company wants to be immune from what their customers post or host, and now Apple wants to 'police' their iCloud members' data. It does seem like an odd thing to want to do. So *should* hosting companies be held liable for their content? How can a hosting company 'police' all of their customers and eliminate 'objectionable content', and who decides what's objectionable? Seeing multiple angles on this doesn't help...
And what nonsense have you provided? Just numbers of reports. Doing client-side scans helps the integrity of encryption for non-CSAM images…

Why do people keep repeating this nonsense?
Take that up with your non-understanding parents. Also, it'll only flag to your parents if you received it in iMessage. Do your research in Safari and don't save the pictures…

As a former 13yo gay kid, I would be horrified that my parents would be notified of anything I wanted to research in regards to my developing sexuality in private, especially considering such a disclosure would have easily resulted in me being sent to a correctional camp.
At this point, someone at Apple must be regretting the decision to announce this feature, which understandably has created such a big wave of backlash.
Next time Apple does something like this, I am not sure they won't just keep their mouth shut and deny any allegations, like they always have regarding major hardware failures (antenna, battery, bending, etc.).
That's something I never thought of, and it's a very good point against reporting texts to parents. A lot of parents would understand; a lot wouldn't.

As a former 13yo gay kid, I would be horrified that my parents would be notified of anything I wanted to research in regards to my developing sexuality in private, especially considering such a disclosure would have easily resulted in me being sent to a correctional camp.
Which is why the scans only occur when uploading to iCloud Photos… to protect Apple's property.

It's on airport (their) property.
Ditto, though I would hope a beaten wife would be reported by anyone. It's not something you have to scan for!
Again, their property, their responsibility to follow the law.
Their property, again.
My phone, *my* property.
Exactly! My phone, my property. Their property, their responsibility.
And it only does the scans when uploading to iCloud… so what's the difference?

That doesn't even make sense. I have said I'm okay with server-side scanning already, multiple times.
So you're saying that iCloud is easier to hack than an iPhone, the exact opposite of what the other poster was talking about. Quote them, not me.

You do understand that iCloud's web interface can be hacked into and photos placed there which automatically propagate to all iCloud devices synced to it?
Regardless of the outcome, I will never feel the same about security with Apple products as I have previously. They can do anything they want in the cloud, but not on my phone.
For now... (until a govt asks them to change that.)

Which is why the scans only occur when uploading to iCloud Photos… to protect Apple's property.
It sounds like Apple has been planning on releasing this feature for a while now, since the CSAM code has been in their software since iOS 14.3. It sounds like Apple does not want to have to backtrack on this project.

So do people think Apple has something to gain by implementing this, or do they think Apple is being forced to do this by some government agency? Because I'm genuinely puzzled by why they would go to all this trouble if they didn't think it was the right thing to do.
But the same can be said for server-side scans; they could be amended to scan whatever they wish.

For now... (until a govt asks them to change that.)
Part of looking out for future problems is to not get blindsided when it happens, and given this subject, it will happen.
They could protect their property on their property...
They’ve confirmed it’s a different hashing mechanism to what’s in iOS 14 right now… not quite the same thing.

It sounds like Apple has been planning on releasing this feature for a while now, since the CSAM code has been in their software since iOS 14.3. It sounds like Apple does not want to have to backtrack on this project.
The code will be modified in the latest version, but this code shows Apple has been planning on releasing this type of CSAM scanning software for a while.

They’ve confirmed it’s a different hashing mechanism to what’s in iOS 14 right now… not quite the same thing.
This feature does not limit young people searching for information online. It is specifically targeted at images being exchanged in private communications. The issue here is about the potential of child predators grooming children. Parents are notified only AFTER a child ignores the warning and consents to viewing the content.

As a former 13yo gay kid, I would be horrified that my parents would be notified of anything I wanted to research in regards to my developing sexuality in private, especially considering such a disclosure would have easily resulted in me being sent to a correctional camp.