> I go to a ballgame and they search me. That's ok. I don't let them come to my house and search me before the game.

What you missed is the key section where you uploaded it to Apple's servers, which makes them legally responsible to report the images.
> Yeah, how on earth could Apple take a stance against child pornography? Pictures of abused 5-year-olds are great /s

How does this change the fact at all that there’s now essentially a new backdoor, ripe for abuse, installed in iOS 15?
Stop defending and get rid of this BS, Apple.
> Yep, you're still wrong. Where you're going wrong is using an appeal to emotion to justify the insertion of the capability to scan and report on all of the data on anyone's phone prior to any of it being uploaded to a cloud service. Until now the capability has not existed. Apple could refuse authority on the basis that it was impossible.

Breaking the law and child abuse are not "emotional issues" or appeals to emotion. This is illegal activity we are talking about, perpetrated by criminals. We live in a civil society and Apple is part of that society. So, sorry, I'm not wrong. The perverts should just use local encrypted storage devices and not infect Apple's servers with such material.
> You act like Apple can refuse to obey a law. They can’t.

Why do this ‘great feature’ of image analysis at all, even if it is local? I noticed the other day that even grain varieties are hashed. Or pets.
Who is pushing this idea? We customers are not interested in it, so we have no stake in this game.
It would be good if Apple asked its customers who is actually excited about this functionality. Technically, I'm impressed by how well facial recognition works, but suddenly I realize that a hash has already been prepared for every face in my library. This can end badly, because it arouses the curiosity of people who have no interest in privacy. It could then easily happen that hashes of, say, terrorists get piled on top of the CSAM material on every private computer, and with that precedent, the dam breaks quickly.
Edward Snowden is not a crank, and he warned about exactly such scenarios a few days ago, because he knows this world pretty well.
I don't think Apple will be strong enough to put this demon in its place either.
Never play with fire. They should know that in Cupertino.
Unfortunately, Apple has proven to be too weak at the moment to protect its future fields of development (which depend heavily on customer trust). This stupidity is really hard to understand. In my company, anyone who instigated such scanning would have been warned off as damaging to future business.
It's high time to turn back, and please don't forget an apologetic genuflection to us customers, Mr. Craig, Erik Neuenschwander, 中華人民共和國國家安全部 / 中华人民共和国国家安全部 (the Ministry of State Security of the People's Republic of China), Федеральная служба безопасности Российской Федерации (ФСБ) (the Federal Security Service of the Russian Federation, FSB), etc.
I would have been impressed by a straight statement of this kind:
"We at Apple have been forced by law to perform image scans, which we are supposed to justify with the protection of children. Since we are too weak to enforce consumer interests against the institutions by pushing for a general ban on image analysis, we want to be a little better than Google and the like, and have created an instrument that is difficult to explain."
This would put the ball back in the court of those responsible; Apple would not have lost any trust and would not be attempting crude and funky interviews…
> You don’t have a right to privacy with that material.

Yep, you're still wrong. Where you're going wrong is using an appeal to emotion to justify the insertion of the capability to scan and report on all of the data on anyone's phone prior to any of it being uploaded to a cloud service. Until now the capability has not existed. Apple could refuse authority on the basis that it was impossible.
Now they can't refuse on that basis. Apple has already capitulated to government demands in at least a couple of countries. Some of our own less-enlightened politicians have already demanded backdoors to iCloud encryption so the will is definitely there. Apple, and other tech companies can already be prohibited from alerting a user that their data has been subject to a warrant. Now the capability exists for an authority to not only demand more data, but also prohibit Apple from telling anyone that the data is being gathered. You may not hear about the next step down the "slippery slope" you intend to deal with on a case-by-case basis.
"Won't somebody think of the children" is not an excuse to wreck privacy for everyone no matter what kind of warm fuzzy feeling it gives you.
> What you missed is the key section where you uploaded it to Apple's servers which makes them legally responsible to report the images.
> There is no liability.

Has anyone considered that for Apple to maintain immunity from liability they have to actively work to stop child pornography… as that is one of the exceptions to the law?
> This is what happens when you decide the next growth driver of the company isn't going to be another insanely great product but s**t-tier services. Hardware becomes a means to an end, not an end in itself. Today it's spying on us to "protect the children", tomorrow it can be targeted ads (but we run our machine-learning algorithms to track you on-device, so your privacy is protected!!). San Bernardino happened in 2015, peak iPhone for Apple, the year of iPhone 6 sales. The services push happened around 2016. Services companies don't care about privacy.

The phone isn't the real product anymore. The user is. spiPhone.
You and I have very different definitions of what it means to have control over systems and data that are owned by me. As long as you believe that a ToS agreement change and disabling a feature I paid for when I bought my devices is a legitimate contract which can revoke, in Tim’s own words, a fundamental human right, we have no space for agreement. I believe you will be on the wrong side of history here. Good day.
> Wrong.

False. There is no such law. In fact, Section 230 exempts them from liability.
> Wrong.

There is no liability.
The postal service doesn't have to open every letter and parcel to check whether something bad is inside.
It's Apple that decided to slightly shift their business model away from "what happens on your iPhone stays on your iPhone".
Tech experts are expressing concerns, and it's tech experts who asked Apple to release the code. Apple employees are expressing concerns too, as are many serious media outlets. We are not used to Apple issuing so many PR releases so frequently, so not only is Apple changing, they are doing things they did not do in the past. So once again, why not publish the code? Is there a reason you are against it? At least that is the feeling I get from your reply.
> So countless examples of exploited mass surveillance systems are no rational basis? Only considering things for which there is concrete evidence is precisely being naive. This isn't about proving anything, it's about ethics.

You're not getting it. I'm not saying I can PROVE that Apple could not or would not ever abuse this technology (or any others in iOS, macOS, etc.). Obviously no one could ever prove something like that unless they were all-knowing. I'm saying you have no rational basis/evidence to assume or even suspect that they will abuse/misuse this technology. Just about any technology COULD be abused, but that's not a rational basis to promote the elimination of that technology.
Your weight analogy doesn't work, because it assumes there's something off/wrong with what Apple's doing here (like being 5lb. overweight is) without proving so. Now, if some confidential document between Apple and a government agency is leaked and verified as genuine that details plans on how they will use this technology to spy on citizens, THEN you'd have something. Until then, it's all in your imagination.
> Security researchers will see how it compares the hash in real time, no need for open source. They’ll be able to see if it’s being used for other purposes or compared to other databases.

I'm not against it, but it wouldn't help those people who are principled against it.
Apple could reveal the source, then just change it, and start using a different version of it and it would be very hard to know.
The first stage of NeuralHash probably runs on the Neural Engine hardware. So you have to trust the hardware to do what the designer says. And who designs the hardware? Apple.
Also, iCloud Backup is a much better feature for governments to misuse. Why not demand they open-source that part as well?
What I am reacting to is the demand to open-source a part of the OS that is ill-suited to surveillance, while not asking the same of the parts that are truly well suited to it.
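For what it's worth, here is a minimal sketch of what "comparing a hash against a database" boils down to. This is not Apple's NeuralHash or its blinded-database protocol; the names (toyImageHash, knownHashes) and the use of a cryptographic hash in place of a perceptual one are my own simplifications, just to keep the example self-contained.

```swift
import Foundation
import CryptoKit

// Toy stand-in for a perceptual hash -- this is NOT Apple's NeuralHash.
// A real perceptual hash is derived from image features so that visually
// similar images collide; a cryptographic hash of the raw bytes is used
// here only to keep the sketch self-contained.
func toyImageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// In the real system the database ships as blinded hashes the device cannot
// read; a plain Set is used here purely to show where the comparison happens.
let knownImage = Data("bytes of a known flagged image".utf8)   // hypothetical
let knownHashes: Set<String> = [toyImageHash(knownImage)]

// The only question answered per photo: is its hash in the set?
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownHashes.contains(toyImageHash(imageData))
}

print(matchesKnownDatabase(knownImage))                        // true
print(matchesKnownDatabase(Data("a holiday photo".utf8)))      // false
```

The comparison itself is trivial; the whole argument is about who controls what goes into that set and who learns about the matches.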
> There is no expectation of privacy for data that resides on a third party server. People using iCloud never should have had that expectation with regard to illicit materials ... child abuse materials or otherwise. I think you really do misunderstand the service Apple is providing to users ... in some cases a free service. As you state, Apple is required to respond to warrants and subpoenas. Apple cannot decline to respond to such lawful requests because you think it is technologically impossible to do ... especially when it is quite technologically possible in fact. IMHO.

Yep, you're still wrong. Where you're going wrong is using an appeal to emotion to justify the insertion of the capability to scan and report on all of the data on anyone's phone prior to any of it being uploaded to a cloud service. Until now the capability has not existed. Apple could refuse authority on the basis that it was impossible.
Now they can't refuse on that basis. Apple has already capitulated to government demands in at least a couple of countries. Some of our own less-enlightened politicians have already demanded backdoors to iCloud encryption so the will is definitely there. Apple, and other tech companies can already be prohibited from alerting a user that their data has been subject to a warrant. Now the capability exists for an authority to not only demand more data, but also prohibit Apple from telling anyone that the data is being gathered. You may not hear about the next step down the "slippery slope" you intend to deal with on a case-by-case basis.
"Won't somebody think of the children" is not an excuse to wreck privacy for everyone no matter what kind of warm fuzzy feeling it gives you.
> This makes no sense whatsoever.

What Apple is doing is announcing that every iPhone user is a suspected child molester. Is that a good image?
I mean, just think about how people will soon look at you for using an Apple product. Everybody will assume you are one of "those". It probably won't take long before people attack anyone who takes out their iPhone when children are around.
I will use a different phone in public. Showing the iPerv ain't cool!
> What concerns me is the high number. That could indicate their system has a high false-positive rate if it has to be set at 30.

They have come up with a reasonable way to keep illegal content off of their platform without Apple violating privacy. It’s all done on device, and Apple isn’t notified until there are 30 positive hits. That’s a reasonable compromise between privacy and the right of children to be safe from exploitation.
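To make that 30-hit mechanism concrete, here is a toy sketch of threshold gating. It is not Apple's actual design (the published system uses threshold secret sharing, so the server learns nothing about matches until the threshold is crossed); MatchThresholdGate and its names are invented for illustration.

```swift
import Foundation

// Toy sketch of a match-counting threshold. In Apple's published design the
// server holds encrypted "safety vouchers" it cannot open until enough
// matches exist; a plain counter stands in for that property here, only to
// show the shape of the idea.
struct MatchThresholdGate {
    let threshold: Int                  // e.g. 30 positive hits
    private(set) var matchCount = 0

    // Called once for each uploaded photo whose hash matched the database.
    // Returns true only once the threshold is reached, i.e. the point at
    // which human review would even become possible.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= threshold
    }
}

var gate = MatchThresholdGate(threshold: 30)
for photoNumber in 1...30 {
    if gate.recordMatch() {
        print("Threshold reached at match \(photoNumber); only now is anyone notified.")
    }
}
```

Whether 30 is the right number is a separate question; the gate only controls when matches become visible, not how accurate each individual match is.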