The road to hell is paved with good intentions, whether that's procedural failures or an oppressive government mucking up the planned implementation by demanding that Apple add its own non-CSAM hash database under penalty of an iPhone import ban, other sanctions against Apple and/or its executives, etc.

After all, Apple does famously follow all local laws. It's a matter of when, not if.
Yep, CSAM scanning is coming to Apple. As we've seen, Google actively scans; hopefully Apple's process is better. And of course Apple famously does follow all local laws, as does Google.
 
The point is that Google caught this in the cloud and there were no procedures in place. Don't want scanning on your Apple device? Disable iCloud. Apple has several layers to ensure false reporting is minimized.
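To put some substance behind "several layers": Apple's published 2021 design withheld any human review until an account crossed a threshold of matches (reported as around 30 images). Here's a minimal sketch of just that threshold idea; the names and the plain counter are assumed for illustration, since the real design used encrypted safety vouchers and threshold secret sharing rather than a readable counter:

```swift
// Illustrative only: the published design used encrypted "safety vouchers"
// and threshold secret sharing, not a plaintext per-account counter.
struct AccountMatchState {
    private(set) var matchCount = 0
    let reviewThreshold = 30   // value reported in Apple's 2021 documentation

    // Record one database match; returns true only once the threshold is
    // reached, i.e. the earliest point at which human review could occur.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= reviewThreshold
    }
}
```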
No. The point is that the same legal constraints that inevitably led to google’s actions will lead to the exact same behavior at Apple. This is the procedure.

Once CSAM is ever flagged in an account by the automated system, the legal risks are too great to do anything other than to shut down the account and report the user. There is nothing in the law to indemnify google or Apple once they’ve become suspicious of CSAM in an account (documentation of a review or reporting it to law enforcement doesn’t shield the company from the risk).

Apple’s layers are meaningless because it’s not going to assume any of the huge legal risk for a user. Just like google doesn’t.
 
No. The point is that the same legal constraints that inevitably led to google’s actions will lead to the exact same behavior at Apple. This is the procedure.
The point is the method is totally different and can be disabled on iOS.
Once CSAM is ever flagged in an account by the automated system, the legal risks are too great to do anything other than to shut down the account and report the user.
That’s what google did, shut down the account.
There is nothing in the law to indemnify google or Apple once they’ve become suspicious of CSAM in an account (documentation of a review or reporting it to law enforcement doesn’t shield the company from the risk).
Yep, inherently one sided.
Apple’s layers are meaningless because it’s not going to assume any of the huge legal risk for a user. Just like google doesn’t.
Apple's hash tag scanning is much different than Google's AI scanning.
 
Yep, inherently one sided.

Apple's hash tag scanning is much different than Google's AI scanning.

On the one hand, you are agreeing with me that Apple is going to do exactly the same as Google with automatically flagged accounts because the law is "Yep, inherently one sided."

On the other hand, you sprinkle the phrase "hash tag scanning" like it is magic pixie dust that somehow makes things different for Apple than Google.

You contradicted yourself.
 
On the one hand, you are agreeing with me that Apple is going to do exactly the same as Google with automatically flagged accounts because the law is "Yep, inherently one sided."
In the end the results are the same: reported to the authorities, if that is what you are getting at.
On the other hand, you sprinkle the phrase "hash tag scanning" like it is magic pixie dust that somehow makes things different for Apple than Google.
Google's scanning is different: as was shown, any image can be flagged. Apple's scanning is against a database of hashes of known CSAM images. So unless you use your iPhone to send an image of your baby through Gmail, the end result is not the same, as the image of the baby will not (or shouldn't) trigger a CSAM hit.
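For what it's worth, the difference can be sketched in a few lines. This is a toy illustration, not Apple's actual NeuralHash/private-set-intersection pipeline (which uses a perceptual hash rather than an exact one); the function names, the SHA-256 choice, and the empty placeholder database are mine:

```swift
import Foundation
import CryptoKit

// Toy contrast between database matching and classifier scanning.
// The hash set contents and the threshold are placeholders, not real data.

// Database matching: an image is flagged only if its digest is already in
// the list of known images, so a novel family photo cannot match.
let knownImageHashes: Set<String> = []   // would hold digests of known images

func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
    return knownImageHashes.contains(digest)
}

// Classifier-style scanning (Google's reported approach): a model assigns
// every image a score, so any photo can be flagged if it crosses a threshold.
func classifierFlags(score: Double, threshold: Double = 0.9) -> Bool {
    score >= threshold
}
```

The first function only fires on material that is already in the database; the second can fire on anything, which is the distinction being argued here.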
You contradicted yourself.
Addressed above.
 
No. The point is that the same legal constraints that inevitably led to google’s actions will lead to the exact same behavior at Apple. This is the procedure.

Once CSAM is ever flagged in an account by the automated system, the legal risks are too great to do anything other than to shut down the account and report the user. There is nothing in the law to indemnify google or Apple once they’ve become suspicious of CSAM in an account (documentation of a review or reporting it to law enforcement doesn’t shield the company from the risk).

Apple’s layers are meaningless because it’s not going to assume any of the huge legal risk for a user. Just like google doesn’t.
That is the issue with this entire system, and I don't disagree. There is no legal recourse for being put through the wringer over a legal image. The reporters are probably free from liability. Be that as it may, we will all have to deal with this new world order, unless one has a custom phone and a custom cloud setup. Using a commercial cloud, no matter whose, there is a risk of a legal image being reported.
 
Why is it worrying?

If you have nothing to hide, then you have nothing to hide.

See my signature "it's not that i have something to hide, it's that I have nothing for you to see".
Having nothing to hide doesn't automatically give or provide consent to enter or access.
Just because I did NOT say the word 'No' doesn't automatically mean I've said the word 'Yes'.

Because it’s a violation of personal privacy and it sets a dangerous precedent for the future. Today we let Apple scan our photos and tomorrow what?

It’s funny, years ago Apple was firmly standing against the US Government asking for a back door into a locked iPhone but today they’re proposing passive surveillance of individuals’ photo libraries.

I’m all for the identification and prosecution of people breaking laws, but the whole “if you have nothing to hide don’t worry about it” argument is so dismissive of such a fundamental right.

I'm highlighting in BOLD for the specific effect of rebuttal.

1. The source of those pictures: if taken from the public domain (the source of the picture itself, not the subject), then it's not a matter of personal privacy. For example, if you saved a picture or a screenshot of a picture from the internet, it's already public and not personal privacy. I'm not aware of the real specifics in courts of law, so I could be wrong, but in this specific case I doubt personal privacy protections apply.

2. If a law allows for searching a device or cloud storage ... then it's likely not a breach of personal privacy.
More importantly, you'd need to think about where you are storing your pictures and what content OF those pictures has you thinking it's personal privacy.

Like a) nudes of children. If they're not your children or children you've adopted, and not simple clothed or baby pictures (I still feel the tradition of taking family photos of nude babies is creepy, period, but that's just me), and not nude pictures of your kids who have a rash or mark or health concern forwarded to health care practitioners ... then that's on you/them/they.


The example of your kids' health issues being sent to health care practitioners is something that occurred not too long ago: a father at home noticed a strange protrusion in his child's groin, took a picture and sent it to the doctors, and is now in legal hot water. His name is public and he lost his job without even an explanation. Not sure how that turned out, though.

That is where this situation can go wrong! Which photo app, OS, or cloud service he used I don't know/cannot recall for the life of me. In such a case I wonder, under Apple's implementation, whether a parent would get flagged in this scenario?!
 
Then don't trust Apple. Don't update to iOS 16. Do what you feel is best, and if that means Android, you should think about it.
Privacy is a big part of why I don't want an Android. It's also why I voice how much I don't want Apple going down this slippery slope and maybe eventually turning into Google, or another company that doesn't respect user privacy.

Also, all these hypothetical arguments are great from where we sit, but if other countries figure out how to add hashes targeting homosexuality or oppressed peoples/religions, it will be a sad day for the world.
 