Who’s the genius who presented this idea to Tim Cook? Scan our library? That’s what Google, Amazon and Facebook do, not Apple!

Is it Apple's job to police our content? People buy iPhones for exactly the opposite reason.
 
What you say is a crime today, is someone else's fight for freedom.

No one, I say again, NO ONE is in favor of seeing children exploited, abused, or harmed. At least no one here, I would hope. But that is not the point.

The point is, this is indeed a slippery slope. Much akin to Apple saying they will unlock a phone for law enforcement via a "back door", which, as I understand it, they currently won't do because no such back door exists.

Once the mechanism exists, once the door is installed, or the code made part of the basic building blocks of how the machine operates, it's no longer a question of not being able to do it, but when it will be done. At that point, it's incumbent upon the gatekeepers to decide what is and isn't permitted, or acceptable, or legal.

These are decisions made by human beings. Just as humans are capable of horrible evil acts (like exploitation of children) for their own personal reasons, they can be capable of such evil on a political scale.

Today, child exploitation. Tomorrow, being LGBTQ or pro-democracy somewhere Apple does business. Apple has already proven they will bow to the whims of foreign governments who threaten to cut off their business (and revenue stream).

When countries like China are jailing dissidents for expressing pro democracy viewpoints (see footnote link), one can only question how long it is before this sort of invasiveness is unleashed for nefarious reasons.

This is scary stuff. Apple is wrong on this. One hundred percent wrong. People (good people, with liberal, small-"l" ideals) will suffer and die because of this. I have no doubt.

They say it could never happen here. Wherever "here" is. Well, it can and probably will happen wherever you are. This is one more big step towards a high-tech dystopia.


Very well said 👏🏻
 
Isn’t it even forbidden to show bare breasts on the beach in the U.S., e.g. while sunbathing topless? There is a reason why things like the "Free the Nipple" movement exist there.
Public nudity is not illegal here in Seattle. So I don’t think you can say it’s a federal thing; it’s more a matter of state or city law in most parts of the country.
 
The other article said the analysis all happens on device, not the cloud. So are they really creating a back door?

That said, this line concerns me from the first article:

“Apple then manually reviews each report to confirm there is a match, disables the user's iCloud account, and sends a report to NCMEC. Apple is not sharing what its exact threshold is, but ensures an "extremely high level of accuracy" that accounts are not incorrectly flagged.”

I’m all for stopping child porn, predators, sex trafficking, etc. (and regular porn for that matter, but that’s a rabbit trail for another discussion). But this feels like an overreach. I just can’t imagine there won’t be some false positives along the way, and this will ruin those people’s lives.

I keep hearing people say what I bolded in your post above, and I'm trying to figure out why you guys are so confused about this. Look at the very part of the article you quoted: "Apple then manually reviews each report to confirm there is a match." So unless you have confirmed child porn images on your device that you then upload to iCloud, your life will not be ruined because nothing will come of false positives. Why would Apple report an innocent image after review? Makes no sense. I don't understand your concern here at all.
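For what it's worth, the flow the article describes can be sketched in a few lines. This is an illustrative sketch only: the threshold value and function names are hypothetical (Apple has not published its actual threshold), and the real system is obviously far more involved.

```python
# Sketch of the reported review flow, based only on the quoted article.
# THRESHOLD is a made-up illustrative number; Apple has not disclosed the
# real value, only that it targets an "extremely high level of accuracy".
THRESHOLD = 30  # hypothetical

def needs_human_review(match_count: int) -> bool:
    """An account is surfaced for review only after enough matches accrue."""
    return match_count >= THRESHOLD

def account_reported(match_count: int, reviewer_confirms: bool) -> bool:
    """Per the article, a report to NCMEC requires both the threshold being
    crossed AND a human reviewer confirming the matches, so a stray false
    positive alone should never trigger a report."""
    return needs_human_review(match_count) and reviewer_confirms
```

Under that model, one or two mismatches never reach a reviewer at all, which is the point being argued above.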
 
Snowden is our martyr.


Our children will be lucky if they even get to read about him at this rate. He won’t be vilified. He will simply be written out of the human record.
His page is on Wikipedia... so long as we can access Wikipedia, his memory will be preserved.

Unfortunately, I think both China and India censor the internet among others, and I think many others are working towards it...

We need to work on making it so the internet can't be censored.
 
We are conflating two features. There is a parental control feature and a hash-database feature that checks images against known hashes. I don’t know much about the hash feature, but the parental control option is fantastic.
 
I don't follow this slippery slope thought process. He is essentially saying he doesn't trust Apple not to abuse this functionality. You could literally apply this to anything if you believe Apple has ill intentions.

If we believe Apple has some master plan to abuse this, then this opens up every part of their ecosystem to the same "it could be abused if they want to" statement.

It has nothing to do with Apple's intentions. We've seen time and again that Apple will compromise on their ideals to "comply with the law" in authoritarian and human-rights-abusing countries. This is a system that can scan every single photo on every single (iOS) device on the planet, and compare them against an unreadable and infinitely alterable black list of banned images...just imagine what this tech could be leveraged to do in China, Russia, Saudi Arabia, Nigeria, etc. without users' knowledge or consent.
 
What really pisses me off about all this is the sheer audacity of Apple to present this patently obvious invasion of privacy as a feature. As if we're all idiots. If Tim had written an open letter or done interviews to the tune of, "We don't like this but we have to do it and we think our system is the best way to preserve your privacy," that would be one thing. Instead Apple patronizes us with this BS, as if we should thank them for handing over our privacy on a silver platter.
 
An awful step by Apple. And the 'alert your parents if you view a nude' feature is some awful overreach, but typical of Apple's strangely prudish approach.
These are not "nudes"... they are known images and/or videos of CSAM (child pornography). It's not 15-17 year olds on social media, it's largely pre-pubescent young children involved in lascivious acts. You obviously don't understand how it all works, since parents with naked kids in the tub have NOTHING to worry about. Read a bit about CSAM and known hashes and I think your blood pressure will come down.
 
I don't follow this slippery slope thought process. He is essentially saying he doesn't trust Apple not to abuse this functionality. You could literally apply this to anything if you believe Apple has ill intentions.

If we believe Apple has some master plan to abuse this, then this opens up every part of their ecosystem to the same "it could be abused if they want to" statement.

Your privacy is yours to lose then…
 
These are not "nudes"... they are known images and/or videos of CSAM (child pornography). It's not 15-17 year olds on social media, it's largely pre-pubescent young children involved in lascivious acts. You obviously don't understand how it all works, since parents with naked kids in the tub have NOTHING to worry about. Read a bit about CSAM and known hashes and I think your blood pressure will come down.
Hardly. It's not about how it works. It's about Apple violating privacy. They have no right at all to view or scan private images. No right at all.
It's a warrantless search they are doing. A violation of constitutional rights, even.
 
What really pisses me off about all this is the sheer audacity of Apple to present this patently obvious invasion of privacy as a feature. As if we're all idiots. If Tim had written an open letter or done interviews to the tune of, "We don't like this but we have to do it and we think our system is the best way to preserve your privacy," that would be one thing. Instead Apple patronizes us with this BS, as if we should thank them for handing over our privacy on a silver platter.

Why not? We’ve all fallen for severe erosion over the past 18 months blaming various boogeymen.

We almost deserve it.
 
It should be a requirement to read about hash values, the definition of CSAM, and how a comparison against a hash database works before you post here. A lot of really angry people for no reason. Not even Snowden understands it yet... he says "secret" database. The hash databases maintained by NCMEC and others are used by law enforcement and non-profits alike. In fact, Project VIC is not a government effort at all. A ton of misinformation here and a lot of folks with heightened blood pressure for nothing.
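To make "comparison against a hash database" concrete, here is a minimal sketch of exact-hash matching, the kind used with traditional known-image hash lists. Note that Apple's NeuralHash is a perceptual hash (visually similar images produce matching hashes), which this simple example does not model; the hash value below is just a stand-in (SHA-256 of the bytes `b"test"`), not a real database entry.

```python
import hashlib

# Stand-in hash set; real databases hold hashes of known CSAM images.
# This entry is simply SHA-256 of b"test", used for illustration only.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(data: bytes) -> str:
    """SHA-256 hex digest of the raw file bytes (exact matching only)."""
    return hashlib.sha256(data).hexdigest()

def matches_known_hash(data: bytes) -> bool:
    """True only when the file's digest appears in the known-hash set;
    any other image, bathtub photos included, produces a non-matching
    digest and is never flagged."""
    return file_digest(data) in KNOWN_HASHES
```

The point of the technique: the system never "looks at" image content in any human sense, it only checks whether a derived fingerprint appears in a fixed list.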
 
It should be a requirement to read about hash values, the definition of CSAM, and how a comparison against a hash database works before you post here. A lot of really angry people for no reason. Not even Snowden understands it yet... he says "secret" database. The hash databases maintained by NCMEC and others are used by law enforcement and non-profits alike. In fact, Project VIC is not a government effort at all. A ton of misinformation here and a lot of folks with heightened blood pressure for nothing.
Again: it's not about the tech.

It's about scanning private images. They have no right to do that. The fact that they are not ashamed of violating people's privacy shows:

- they don't care about privacy. At all. All their bragging about how deeply they care about privacy is debunked as outright marketing lies
- they cannot be trusted
- people are angry, and for good reason. People who care about privacy and freedom should be angry
 
It should be a requirement to read about hash values, the definition of CSAM, and how a comparison against a hash database works before you post here. A lot of really angry people for no reason. Not even Snowden understands it yet... he says "secret" database. The hash databases maintained by NCMEC and others are used by law enforcement and non-profits alike. In fact, Project VIC is not a government effort at all. A ton of misinformation here and a lot of folks with heightened blood pressure for nothing.
Non-profits use it?

which ones?
 