The shilling being done by some here is absolutely pathetic.
On device scanning is an invasion of privacy. Period.
This has really gotten a lot of people very nervous. I get the privacy implications, but at least Apple was public about their plans, I guess...
I'm not convinced by your two arguments: you should be able to decide which applications download images directly to your photo roll, and you'll need to bring me evidence of how an iPhone could be more easily hacked than iCloud.

First big difference: by just knowing your phone number, someone could send matching images to your device. There are apps that would put those images into Photos, which then syncs to iCloud. Not much fun...

Second big difference: your iPhone may be more easily hacked than iCloud.
Not nervous. Angry.
As a member of the LGBTQ community, I am dismayed by opponents of the parental controls in iMessage who object because of a presumed disparate impact on homosexual youth. The parental notification of potentially explicit images viewed on kids' devices applies only to kids under 13. Parents of all kids, including LGBT kids, deserve to protect these young kids from potentially being targeted by child predators.

I'm glad I am not the only one who thinks the blur feature could be used by homophobic parents trying to block everything a kid would see. And they could still falsify the age registered in the system to keep controlling their teenagers.
Prior to this, I trusted Apple's privacy stance, so I was not bothered by activities that could potentially turn into surveillance. This breaches that trust. Look at all of the core features that rely on a balance between privacy and surveillance: HomePod, Siri, AirTags using my phone's connection, Find My, Apple's contact tracing, photo identification, HomeKit Secure Video, Maps, iCloud Mail, password storage, Health. I'm sure you can figure out more.

My point was that you trust that the scans of your photos will remain for your benefit. Don't all the same "slippery slope" arguments apply to the scanning Apple is already doing? And, further, it does that for all photos, not just those being uploaded to iCloud Photos.
I'm surprised to see Apple doing this, because they seem to be the front-runner of this whole privacy mantra. It contradicts everything Apple stands for.
Privacy also does NOT mean a private company gets to invade it.

Why make a huge drama out of something that's very simple? Nobody is looking at your photos.
Privacy doesn't give anyone the right to commit a crime.
People do deserve their privacy. This is not like having a camera watching their every move. It's just an AI filter that checks for illegal images, images used to commit a crime. If it finds any, it will raise a flag.
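To make that flag-raising concrete, here is a minimal sketch. It assumes the check is a comparison of image hashes against a database of known illegal images (per Apple's published design) rather than an AI that looks at photo content; all names are hypothetical, and SHA-256 stands in for the perceptual hash (NeuralHash) the real system uses.

```swift
import CryptoKit
import Foundation

// Hypothetical sketch: flagging is hash-set membership, not someone viewing photos.
// SHA256 is a stand-in; the real system uses a perceptual hash (NeuralHash).

let knownBadHashes: Set<SHA256.Digest> = []  // hashes of known illegal images (empty placeholder)

func shouldFlag(_ photoBytes: Data) -> Bool {
    // Only the hash is compared; no human or model "looks at" the image here.
    let digest = SHA256.hash(data: photoBytes)
    return knownBadHashes.contains(digest)
}
```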
Apple is trying to protect itself from being accused of protecting criminals and getting dragged into a lawsuit.
There are people who think that by using an Apple device they can take and store illegal pictures with impunity.
Governments want Apple to create a backdoor, something that would affect the security and privacy of the devices. What Apple is trying to do is implement a non-invasive way to help combat crime, so certain governments and haters don't complain that Apple protects criminals.
Maybe, as you suggest, Apple should put this to rest and let people continue with their business, causing pain and suffering to helpless children. It may not bother some people at all, until it happens to someone close to them. Then they will be actively in favor of what Apple is trying to do here.
Just like with the use of face masks. People are complaining, saying "it's their body and their right not to wear a mask," without any concern for the people around them. The mask is there to protect those around you as well as yourself.
Same thing in schools now: mothers are against having their children wear masks "because it's their right not to do so." This is ridiculous!
Just wait until they or their children get sick with COVID-19; their minds will change and they will rally in favor of wearing the mask. It's already happening, just search YouTube. Many of those against it, or those who used to say that COVID-19 doesn't exist, are now pleading with people to take it seriously.
This is exactly how we know that Apple's "privacy" initiatives are marketing, not real. Internally and privately, they scan and give all governments what they want.
So the marketing department knows they are already doing it and are just trying to get some good PR out of it. Unfortunately, in the process they revealed the reality of what happens behind the scenes.
Not much else makes sense. Unless, of course, they are really that ignorant of their user base.
By making our phones run an algorithm that isn't meant to serve us but to surveil us, Apple has crossed a line.
...the problem with human review is that it can be swamped if, as may be the case, Apple has badly underestimated the number of flagged cases. Partly because there are just more of those images going around, but also because that would be the *point* of being able to produce innocent images that trigger the algorithm: to swamp the human reviewers with false positives, a denial-of-service attack. Then shortcuts get into the process and mistakes are made, both ways. And as for the human reviewers, see Facebook's experience: there's a human cost to *having* to view the kind of images this system is meant to trap, and a shockingly high burnout rate, especially if they're swamped and can't even keep up.
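For a sense of the arithmetic behind the swamping worry, here is a back-of-the-envelope sketch; every number in it is invented purely for illustration.

```swift
// Invented numbers, purely to illustrate the queue dynamics described above.
let reviewCapacityPerDay = 1_000            // cases a review team can clear daily (assumed)
let legitimateFlagsPerDay = 200             // genuine positives (assumed)
let adversarialFalsePositivesPerDay = 5_000 // crafted collision images (assumed)

var backlog = 0
for day in 1...7 {
    // Arrivals minus capacity; the backlog can never go negative.
    backlog = max(0, backlog + legitimateFlagsPerDay
                             + adversarialFalsePositivesPerDay
                             - reviewCapacityPerDay)
    print("Day \(day): \(backlog) unreviewed cases")
}
// With these numbers the backlog grows by 4,200 cases a day and never drains:
// that is the denial-of-service on human review described above.
```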
2 words: butterfly keyboard.

I would like Apple to backtrack on this decision. Hope Apple listens and does not go ahead with it. 🤞
But it's still surveillance... which appears to be what many are trying to ignore.

About the "Messages" thing: it doesn't break end-to-end encryption, because the processing is done before the photo is sent or after it is received, and it stays on the device, so Apple doesn't have any access to that material.
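To see why the encrypted channel is untouched, here is a minimal sketch of the on-device flow, with hypothetical names throughout; it assumes the explicit-image check runs only on the plaintext that already exists on the child's device after normal decryption.

```swift
import Foundation

// Hypothetical sketch of the Messages child-safety flow described above.

enum DeliveredPhoto {
    case shown(Data)
    case blurredPendingChoice(Data)  // child sees a blur and a warning first
}

/// Runs entirely on the recipient's device, after iMessage has already
/// decrypted the photo locally. The ciphertext in transit is never read,
/// so end-to-end encryption is unchanged; only local presentation differs.
func presentIncomingPhoto(_ decryptedPhoto: Data,
                          looksExplicit: (Data) -> Bool,  // hypothetical on-device ML check
                          safetyFeatureEnabledByParent: Bool) -> DeliveredPhoto {
    if safetyFeatureEnabledByParent && looksExplicit(decryptedPhoto) {
        return .blurredPendingChoice(decryptedPhoto)
    }
    return .shown(decryptedPhoto)
}
```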
That's the thing, though. Human reviewers don't see your photos until two things happen. First, you have to accumulate 30 CSAM matches. Then, once those 30 vouchers have been generated during image upload, a second, separate process on Apple's servers perceptually scans the flagged photos and generates new hashes; those hashes are compared against the CSAM database, and only then are the matches sent for human review. That's a lot of steps before anyone sees your pictures, and in my opinion you deserve it anyway for harboring those disgusting images.
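As a sketch of that gating logic: the names below are hypothetical, and the voucher's match status, which in the real design is cryptographically hidden below the threshold, is simplified here to a plain Bool just to show the shape of the process.

```swift
import Foundation

/// Hypothetical stand-in for a "safety voucher": one per uploaded photo.
/// In the real design the match bit is unreadable until the threshold is met;
/// a plain Bool keeps the sketch short.
struct SafetyVoucher {
    let photoID: UUID
    let matchedKnownHash: Bool
}

/// Server-side gate: nothing proceeds to the second perceptual-hash check
/// and human review until an account accumulates 30 matching vouchers.
struct ReviewGate {
    let threshold = 30
    private(set) var vouchers: [SafetyVoucher] = []

    mutating func receive(_ voucher: SafetyVoucher) -> [UUID] {
        vouchers.append(voucher)
        let matches = vouchers.filter { $0.matchedKnownHash }
        // Below the threshold the server learns nothing and no one reviews anything.
        guard matches.count >= threshold else { return [] }
        // At or past the threshold, only the matched photos move on to the
        // independent server-side hash comparison and then human review.
        return matches.map(\.photoID)
    }
}
```

The point of the sketch is that review is gated twice: first by the 30-match count, then by the independent server-side hash check, before any human is involved.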
The problem with human reviewers is... private photos weren't meant for anyone else's eyes. It's like Polaroid getting to review your instant photos that you never intended to take someplace to have developed.
Surveillance by whom? Not Apple. Also, the Messages thing is a child safety feature that is opt-in by the parent. It's no different than blocking certain channels on your cable TV service.

But it's still surveillance... which appears to be what many are trying to ignore.