 
Has anyone considered that for Apple to maintain immunity from liability they have to actively work to stop child pornography… as that is one of the exceptions to the law?
 
How does any of this change the fact that there's now essentially a new backdoor installed in iOS 15, just waiting to be abused?

Stop defending and get rid of this BS, Apple.
Yeah, how on earth could Apple take a stance against child pornography? Pictures of abused 5-year-olds are great /s
 
Breaking the law and child abuse are not "emotional issues" or appeals to emotion. This is illegal activity we are talking about, perpetrated by criminals. We live in a civil society and Apple is part of that society. So, sorry, I'm not wrong. The perverts should just use local encrypted storage devices and not infect Apple's servers with such materials.
Yep, you're still wrong. Where you're going wrong is using an appeal to emotion to justify the insertion of the capability to scan and report on all of the data on anyone's phone prior to any of it being uploaded to a cloud service. Until now the capability has not existed. Apple could refuse authority on the basis that it was impossible.

Now they can't refuse on that basis. Apple has already capitulated to government demands in at least a couple of countries. Some of our own less-enlightened politicians have already demanded backdoors to iCloud encryption so the will is definitely there. Apple, and other tech companies can already be prohibited from alerting a user that their data has been subject to a warrant. Now the capability exists for an authority to not only demand more data, but also prohibit Apple from telling anyone that the data is being gathered. You may not hear about the next step down the "slippery slope" you intend to deal with on a case-by-case basis.

"Won't somebody think of the children" is not an excuse to wreck privacy for everyone no matter what kind of warm fuzzy feeling it gives you.
 
Everyone who would get caught by this will just switch to Android, rendering this useless anyway.
 
Why do this ‘great feature’ image analysis at all, even if it is local? I realized the other day that even grain varieties are hashed. Or pets.
Who is pushing this idea? We customers are not interested in it, so we are not stakeholders in this game.
It would be good if Apple asked its customers who is actually excited about this functionality. Technically, I'm impressed with how well facial recognition works, but suddenly I realize that a hash has already been prepared for every face in my library. This can end badly, because it arouses the curiosity of people who are not interested in privacy. Then it can easily happen that hashes of, say, terrorists get bolted onto the CSAM mechanism in every private computer, and once that example is set, the dam breaks quickly.

Edward Snowden is not a crank, and he warned about exactly such scenarios a few days ago, because he knows this world pretty well.

I don't think Apple will be strong enough to put this demon in its place either.
Never play with fire. They should know that in Cupertino.

Unfortunately, Apple has proven to be too weak right now to protect its future lines of business (which depend heavily on customer trust). This stupidity is really hard to understand. In my company, anyone who instigated such scanning would have been warned off as damaging to future business.

It's high time to turn back, and please don't forget an apologetic genuflection to us customers, Mr. Craig, Erik Neuenschwander, the Ministry of State Security of the People's Republic of China, the Federal Security Service of the Russian Federation (FSB), etc.
I would have been impressed by a straight statement like this:

"We at Apple have been forced by law to perform image scans, which we are supposed to justify with the protection of children. Since we are too weak to enforce consumer interests against the institutions by pushing for a general ban on image analysis, we want to be a little better than Google and the like, and have created an instrument that is difficult to explain."

This would have put the ball back in the court of those responsible; Apple would not have lost any trust and would not have to attempt crude and clumsy interviews…
You act like Apple can refuse to obey a law. They can’t.
 
Yep, you're still wrong. Where you're going wrong is using an appeal to emotion to justify the insertion of the capability to scan and report on all of the data on anyone's phone prior to any of it being uploaded to a cloud service. Until now the capability has not existed. Apple could refuse authority on the basis that it was impossible.

Now they can't refuse on that basis. Apple has already capitulated to government demands in at least a couple of countries. Some of our own less-enlightened politicians have already demanded backdoors to iCloud encryption so the will is definitely there. Apple, and other tech companies can already be prohibited from alerting a user that their data has been subject to a warrant. Now the capability exists for an authority to not only demand more data, but also prohibit Apple from telling anyone that the data is being gathered. You may not hear about the next step down the "slippery slope" you intend to deal with on a case-by-case basis.

"Won't somebody think of the children" is not an excuse to wreck privacy for everyone no matter what kind of warm fuzzy feeling it gives you.
You don’t have a right to privacy with that material
 
What you missed is the key section where you uploaded it to Apple's servers, which makes them legally responsible for reporting the images.

False. There is no such law. In fact, Section 230 exempts them from liability.
 
The phone isn't the real product anymore. The user is. spiPhone
This is what happens when you decide the next growth driver of the company isn't going to be another insanely great product but s**t-tier services. Hardware becomes a means to an end, not an end in itself. Today it's spying on us to "protect the children"; tomorrow it can be targeted ads (but we run our machine-learning algorithms on-device to track you, so your privacy is protected!!). San Bernardino happened in 2015, peak iPhone for Apple, the year of iPhone 6 sales. The services push happened around 2016. Services companies don't care about privacy.
 
You and I have very different definitions of what it means to have control over systems and data that I own. As long as you believe that a ToS change and disabling a feature I paid for when I bought my devices is a legitimate contract that can revoke, in Tim's own words, a fundamental human right, there is no room for agreement between us. I believe you will be on the wrong side of history here. Good day.

You seem to be hung up on the idea that privacy = secrecy from everyone. Even if privacy is a fundamental right, it needs to be balanced with other rights. I believe this is also Apple's view.

iCloud (Photo Library) is Apple's property. They have the right to control and use it the way they want to, just like you have with your property.

They have offered you the privilege of using their property under certain criteria. Among those is a privacy policy. When privacy is important to you, it is important to read the privacy policy you have agreed to and which governs the company you are dealing with.


You should read it.

"Apple uses personal data to power our services, to process your transactions, to communicate with you, for security and fraud prevention, and to comply with law. We may also use personal data for other purposes with your consent."

"We may also process your personal data where we believe it is in our or others’ legitimate interests, taking into consideration your interests, rights, and expectations."

"Others. Apple may share personal data with others at your direction or with your consent, such as when we share information with your carrier to activate your account. We may also disclose information about you if we determine that for purposes of national security, law enforcement, or other issues of public importance, disclosure is necessary or appropriate. We may also disclose information about you where there is a lawful basis for doing so, if we determine that disclosure is reasonably necessary to enforce our terms and conditions or to protect our operations or users, or in the event of a reorganization, merger, or sale.

Now, on to what you agreed to with Apple when they gave you the privilege of using iCloud:

"Changing the Service. Apple reserves the right at any time to modify this Agreement and to impose new or additional terms or conditions on your use of the Service, provided that Apple will give you 30 days’ advance notice of any material adverse change to the Service or applicable terms of service"

"You understand and agree that your use of the Service and any Content is solely at your own risk."

"However, Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable."

These are legal contracts in the US, and you should not enter into them if you don't agree to them. But even more importantly, they tell you how Apple looks at privacy.
 
It's Apple that decided to slightly change its business model from "what happens on your iPhone stays on your iPhone."
Tech experts have expressed concerns, and it's tech experts who asked Apple to release the code. Apple employees have expressed concerns too, as have many serious publications. We are not used to Apple issuing so many PR releases so frequently; not only has Apple changed, it is doing things it was not doing in the past. So once again, why not publish the code? Is there a reason you are against it? (At least that is the feeling I get from your reply.)

I'm not against it, but it wouldn't help those people who are principled against it.

Apple could reveal the source, then just change it, and start using a different version of it and it would be very hard to know.

The first part of NeuralHash probably uses the Neural Engine hardware. So you have to trust the hardware to do what the designer says. And who is the designer of the hardware? Apple.

Also, iCloud Backup is a much better feature to misuse for governments. Why not demand they open source that part also?

What I am reacting to is the demand to open-source a part of the OS that is ill-suited to surveillance, while not asking the same for the parts that are truly well-suited for such surveillance.
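
For anyone unsure what "comparing the hash" even means here: Apple has not published NeuralHash, so the following is only a rough sketch that uses a classic difference hash (dHash) in place of Apple's neural-network hash, purely to illustrate the shape of the on-device match-against-a-known-database step. Every function and value in it is made up for illustration.

```python
# Illustrative only: a simple perceptual hash (dHash) standing in for NeuralHash,
# which Apple has not published. The point is the shape of the check, not the algorithm.
import random

def dhash(pixels):
    """Build a 64-bit hash from an 8x9 grayscale grid by comparing adjacent pixels."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def matches_database(image_hash, known_hashes, max_bit_distance=0):
    """Flag the image if its hash is within max_bit_distance bits of any known hash."""
    return any(bin(image_hash ^ known).count("1") <= max_bit_distance
               for known in known_hashes)

random.seed(0)
# Stand-in for a photo downscaled to an 8x9 grayscale grid.
photo = [[random.randrange(256) for _ in range(9)] for _ in range(8)]
known_database = {dhash(photo)}  # pretend this hash is in the reference set

print(matches_database(dhash(photo), known_database))         # True -> would count as a match
print(matches_database(dhash(photo) ^ 0b11, known_database))  # False -> two bits differ
```

Even with NeuralHash fully open-sourced, the real question is what database of hashes the device is handed, which is exactly the part no amount of source code shows you.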
 
What Apple is doing is announcing that every iPhone user is a suspected child molester. Is that a good image?

I mean, just think about how people will soon look at you for using an Apple product. Everybody will assume you are one of "those". It probably won't take long before people attack anyone who takes out their iPhone when children are around.

I will use a different phone in public. Showing the iPerv ain't cool!
 
You're not getting it. I'm not saying I can PROVE that Apple could not or would not ever abuse this technology (or any others in iOS, macOS, etc.). Obviously no one could ever prove something like that unless they are all-knowing. I'm saying you have no rational basis/evidence to assume or even suspect that they will abuse/misuse this technology. Just about any technology COULD be abused, but that's not a rational basis to promote the elimination of that technology.

Your weight analogy doesn't work, because it assumes there's something off/wrong with what Apple's doing here (like being 5lb. overweight is) without proving so. Now, if some confidential document between Apple and a government agency is leaked and verified as genuine that details plans on how they will use this technology to spy on citizens, THEN you'd have something. Until then, it's all in your imagination.
So countless examples of exploited mass-surveillance systems are no rational basis? Considering only things for which there is concrete evidence is precisely being naive. This isn't about proving anything; it's about ethics.
 
I'm not against it, but it wouldn't help those people who are principled against it.

Apple could reveal the source, then just change it, and start using a different version of it and it would be very hard to know.

The first part of NeuralHash probably uses the Neural Engine hardware. So you have to trust the hardware to do what the designer says. And who is the designer of the hardware? Apple.

Also, iCloud Backup is a much better feature to misuse for governments. Why not demand they open source that part also?

What I am reacting to is the demand to open-source a part of the OS that is ill-suited to surveillance, while not asking the same for the parts that are truly well-suited for such surveillance.
Security researchers will be able to see how it compares the hashes in real time; there's no need for open source. They'll be able to see if it's being used for other purposes or compared against other databases.
 
Yep, you're still wrong. Where you're going wrong is using an appeal to emotion to justify the insertion of the capability to scan and report on all of the data on anyone's phone prior to any of it being uploaded to a cloud service. Until now the capability has not existed. Apple could refuse authority on the basis that it was impossible.

Now they can't refuse on that basis. Apple has already capitulated to government demands in at least a couple of countries. Some of our own less-enlightened politicians have already demanded backdoors to iCloud encryption so the will is definitely there. Apple, and other tech companies can already be prohibited from alerting a user that their data has been subject to a warrant. Now the capability exists for an authority to not only demand more data, but also prohibit Apple from telling anyone that the data is being gathered. You may not hear about the next step down the "slippery slope" you intend to deal with on a case-by-case basis.

"Won't somebody think of the children" is not an excuse to wreck privacy for everyone no matter what kind of warm fuzzy feeling it gives you.
There is no expectation of privacy for data that resides on a third-party server. People using iCloud never should have had that expectation with regard to illicit materials ... child abuse materials or otherwise. I think you really do misunderstand the service Apple is providing to users ... in some cases a free service. As you state, Apple is required to respond to warrants and subpoenas. Apple cannot decline to respond to such lawful requests because you think it is technologically impossible to do so ... especially when it is, in fact, quite technologically possible. IMHO.
 
In the video interview on the Wall Street Journal page, the Apple representative stated that it takes 30 suspected photos of CSAM material to trigger a manual review. My question is: why is that number so high? I am wondering if Apple's CSAM system has a high false-positive rate for that number to be so high. In addition, does that counter ever reset after a period of time?
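
For what it's worth, here is a rough back-of-the-envelope sketch of how a threshold like 30 interacts with a per-image false-match rate. The per-image rate and library size below are pure assumptions for illustration; Apple has only published an account-level figure (one in a trillion per year), not a per-image one.

```python
# Back-of-the-envelope only: both numbers below are assumptions, not Apple's figures.
from math import exp, factorial

per_image_false_match = 1e-6   # assumed chance one innocent photo matches by accident
library_size = 10_000          # assumed number of photos uploaded from one account
threshold = 30                 # matches required before any human review

# Poisson approximation: expected number of accidental matches in the whole library.
lam = library_size * per_image_false_match

# Probability of reaching the threshold purely by accident (terms shrink fast,
# so summing a short tail is enough).
p_threshold = sum(exp(-lam) * lam**k / factorial(k)
                  for k in range(threshold, threshold + 50))

print(f"Expected accidental matches per library: {lam:.2f}")
print(f"P(at least {threshold} accidental matches): {p_threshold:.2e}")
```

In other words, a high threshold does not have to mean the per-image rate is high; even a fairly sloppy per-image rate makes 30 accidental matches on one account astronomically unlikely. Whether the counter ever resets is a separate question.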
 
What Apple is doing is announcing that every iPhone user is a suspected child molester. Is that a good image?

I mean, just think about how people will soon look at you for using an Apple product. Everybody will assume you are one of "those". It probably won't take long before people attack anyone who takes out their iPhone when children are around.

I will use a different phone in public. Showing the iPerv ain't cool!
This makes no sense whatsoever.
 
They have come up with a reasonable way to keep illegal content off their platform without Apple violating privacy. It's all done on-device, and Apple isn't notified until there are 30 positive hits. That's a reasonable compromise between privacy and the right of children to be safe from exploitation.
 
They have come up with a reasonable way to keep illegal content off their platform without Apple violating privacy. It's all done on-device, and Apple isn't notified until there are 30 positive hits. That's a reasonable compromise between privacy and the right of children to be safe from exploitation.
What concerns me is the high number, since it could indicate their system has a high false-positive rate for the threshold to be set at 30.
 