"So don’t be a criminal… seems like an easy fix."
That depends on what the government views as a crime. Facebook's censoring also started with child porn; then it skipped past other crimes altogether and is now thought police avant la lettre.
What you call a crime today is someone else's fight for freedom.
No one, I say again, NO ONE is in favor of seeing children exploited, abused, or harmed. At least no one here, I would hope. But that is not the point.
The point is, this is indeed a slippery slope. Much akin to Apple saying they will unlock a phone for law enforcement via a "back door", which, as I understand it, they currently won't do because no such back door exists.
Once the mechanism exists, once the door is installed, or the code made part of the basic building blocks of how the machine operates, it's no longer a question of not being able to do it, but when it will be done. At that point, it's incumbent upon the gatekeepers to decide what is and isn't permitted, or acceptable, or legal.
These are decisions made by human beings. Just as humans are capable of horrible evil acts (like exploitation of children) for their own personal reasons, they can be capable of such evil on a political scale.
Today, child exploitation. Tomorrow, being LGBTQ or pro-democracy someplace where Apple does business. Apple has already proven they will bow to the whims of foreign governments who threaten to cut off their business (and revenue stream).
When countries like China are jailing dissidents for expressing pro democracy viewpoints (see footnote link), one can only question how long it is before this sort of invasiveness is unleashed for nefarious reasons.
This is scary stuff. Apple is wrong on this. One hundred percent wrong. People (good people, with liberal ideals, with a small "l") will suffer and die because of this. I have no doubt.
They say it could never happen here. Wherever "here" is. Well, it can and probably will happen wherever you are. This is one more big step towards a high-tech dystopia.
Footnote link: "Hong Kong: First person jailed under security law given nine years" (www.bbc.com). Tong Ying-kit rode a motorbike into police officers while flying a flag with a protest slogan.
"Isn't it even forbidden to show bare breasts on the beach in the U.S., e.g. while taking a topless sunbath? There is a reason why things like the 'Free the Nipple' movement exist there."
Public nudity is not illegal here in Seattle, so I don't think you can say it's a federal thing. It's more a matter of state or city laws in most parts of the country.
"I have a feeling this will blow up in their face!"
One can only hope it does. Massively. It's not Apple's job to be an arm of law enforcement.
"Good intent but a bad idea. I don't have any pictures that I wouldn't want scanned and I'll continue to use iCloud but I don't like the idea."
I don't have any pictures that I wouldn't want scanned either, but I will not continue to use iCloud Photos. In fact, I've already turned it off.
The other article said the analysis all happens on device, not the cloud. So are they really creating a back door?
That said, this line concerns me from the first article:
“Apple then manually reviews each report to confirm there is a match, disables the user's iCloud account, and sends a report to NCMEC. Apple is not sharing what its exact threshold is, but ensures an "extremely high level of accuracy" that accounts are not incorrectly flagged.”
I'm all for stopping child porn, predators, sex trafficking, etc. (and regular porn for that matter, but that's a rabbit trail for another discussion). But this feels like an over-reach. I just can't imagine there won't be some false positives along the way, and this will ruin those people's lives.
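The quoted article describes a threshold: an account is only flagged for human review after some undisclosed number of hash matches, which is how Apple claims to keep false positives rare. A minimal sketch of that idea, with everything invented (the hash values, the threshold, and the flagging logic are stand-ins, not Apple's actual system, which uses NeuralHash and private set intersection):

```python
# Illustrative sketch only: the hash strings and THRESHOLD below are
# invented. Apple has not disclosed its real threshold.

KNOWN_CSAM_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in for the NCMEC hash list
THRESHOLD = 2  # invented number for illustration

def count_matches(photo_hashes):
    """Count how many of a user's photo hashes appear in the known set."""
    return sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)

def should_flag_for_review(photo_hashes, threshold=THRESHOLD):
    """Only accounts at or above the threshold go to human review."""
    return count_matches(photo_hashes) >= threshold

# A single accidental collision stays below the threshold...
print(should_flag_for_review(["a1b2", "zzzz"]))  # False
# ...while multiple matches trigger manual review.
print(should_flag_for_review(["a1b2", "c3d4"]))  # True
```

The design intuition is that one false positive on a perceptual hash is plausible, but several independent false positives on the same account are far less likely, so review is deferred until the count crosses the threshold.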
"Snowden is our martyr. Our children will be lucky if they even get to read about him at this rate. He won't be vilified. He will simply be written out of the human record."
His page is on Wikipedia... so long as we can access Wikipedia, his memory will be preserved.
I don't follow this slippery slope thought process. He is essentially saying he doesn't trust Apple not to abuse this functionality. You could literally apply this to anything if you believe Apple has ill-intentions.
If we believe Apple has some master plan to abuse this, then this opens up every part of their ecosystem to the same "it could be abused if they want to" statement.
"Awful step by Apple. And the 'alert your parents if you view a nude' is some awful overreach, but typical of Apple's strangely prudish approach."
These are not "nudes"... they are known images and/or videos of CSAM (child pornography). It's not 15-17 year olds on social media, it's largely pre-pubescent young children involved in lascivious acts. You obviously don't understand how it all works, since parents with naked kids in the tub have NOTHING to worry about. Read a bit about CSAM and known hashes and I think your blood pressure will come down.
"Next the Mac and Apple TV?"
They have already confirmed that macOS Monterey will scan your photos.
"These are not 'nudes'... they are known images and/or videos of CSAM (child pornography). [...]"
Hardly. It's not about how it works. It's about Apple violating privacy. They have no right at all to view or scan private images. No right at all.
What really pisses me off about all this is the sheer audacity of Apple to present this patently obvious invasion of privacy as a feature. As if we're all idiots. If Tim had written an open letter or done interviews to the tune of, "We don't like this but we have to do it and we think our system is the best way to preserve your privacy," that would be one thing. Instead Apple patronizes us with this BS, as if we should thank them for handing over our privacy on a silver platter.
"Yea… but I'm not. Wake up."
Wide awake here. See you in camp…
I have pictures I don't want scanned: all of them.
"I don't have any pictures that I wouldn't want scanned but I will not continue to use iCloud Photos. In fact I've already turned it off."
And that is the beauty of it: you can disable iCloud Photos. Apple isn't forcing you to use the service, especially if you don't agree to the terms.
"It should be a requirement to read about hash values, the definition of CSAM, and how a comparison against a hash database works before you post here. A lot of really angry people for no reason. Not even Snowden understands it yet... he says 'secret' database. The hash database used by NCMEC and others is used by law enforcement and non-profits. In fact, Project VIC is not government at all. A ton of misinformation here and a lot of folks with heightened blood pressure for nothing."
Again: it's not about the tech.
"The hash database used by NCMEC and others is used by law enforcement and non-profits. In fact, Project VIC is not government at all. [...]"
Non-profits use it?
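For readers unfamiliar with the hash-database comparison the quoted post describes, here is a minimal sketch. It uses an ordinary cryptographic hash (SHA-256) for simplicity; Apple's NeuralHash is a perceptual hash, but the lookup step against a database of known digests is analogous. The example byte strings are invented stand-ins:

```python
import hashlib

# Sketch of comparison against a hash database. Lists like NCMEC's or
# Project VIC's distribute digests of known files; the file contents
# themselves are never shared with the party doing the checking.

def file_hash(data: bytes) -> str:
    """Digest of the file's bytes; only this digest is ever compared."""
    return hashlib.sha256(data).hexdigest()

# Invented stand-in for a database of known-bad digests.
known_bad = {file_hash(b"known-illegal-file-bytes")}

def is_known_match(data: bytes) -> bool:
    # Pure membership test: the checker learns nothing about files
    # whose digests are not already in the database.
    return file_hash(data) in known_bad

print(is_known_match(b"known-illegal-file-bytes"))           # True
print(is_known_match(b"family photo of kids in the tub"))    # False
```

One caveat the simplification hides: a cryptographic hash only matches bit-identical files, while perceptual hashes like NeuralHash are designed to survive resizing and recompression, which is also why they can produce rare false positives, unlike SHA-256.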