So the victims are blaming the company that abandoned the project and not the government that allows the pictures to circulate?? Maybe sue the government? 🙄
The government does not allow the pictures to circulate; the government provides the laws meant to prevent their circulation. It is the companies that provide the devices and communications systems that allow the pictures to circulate, and it is clear they have no intention of stopping that circulation, as is evident from the lead plaintiff in the lawsuit, who keeps getting notifications from law enforcement every time an image of her child is reported or found.
 
Reactions: I7guy and 63W
Then add in the redefining of terms and the dog whistles.
Say I don't like drrich2. He orders pineapple on pizza, and I am firmly in the camp of "if you want a salad, order one, but keep it off of my pie."
So I call him a few names. He doesn't care for this and calls me a few names. Because it is the internet, one of us (let's say it's me) eventually calls the other one a "Nazi." I then go on to point out how having "ich" in your screen name is a dog whistle for being a fascist. Never mind that no one has heard of this before; I find or make up some obscure reference it supposedly relates to, and now I am telling everyone drrich2 is a known and admitted fascist based on some truly made-up garbage, and demanding people cancel him. He should never be allowed to hold a job, even as the fries guy at McDonald's, because of his hateful nature (how is he supposed to eat without committing crimes, by the way?).

This kind of nonsense needs to stop - quickly.
You clearly have no idea what CSAM is.
 
The government does not allow the pictures to circulate; the government provides the laws meant to prevent their circulation. It is the companies that provide the devices and communications systems that allow the pictures to circulate, and it is clear they have no intention of stopping that circulation, as is evident from the lead plaintiff in the lawsuit, who keeps getting notifications from law enforcement every time an image of her child is reported or found.

The government needs to enforce the laws it has created. It's not Apple's or Samsung's job to stop people from sharing child porn. By your logic, we might as well start suing every cell phone maker on the planet and every computer maker. 🙄
 
Reactions: I7guy
This kind of nonsense needs to stop - quickly.
Yes. Even in the U.S., where we love our free speech rights, we have concepts such as slander, libel, defamation of character, harassment and menacing. A key issue in some instances is whether a person is making a statement of claimed fact (e.g.: stating I'm a Nazi, when I'm not) vs. stating an opinion (e.g.: that his impression of me is that of a fascist). There are gray areas - such as doxing.
It is the companies that provide the devices and communications systems that allow the pictures to circulate, and it is clear they have no intention of stopping that circulation, as is evident from the lead plaintiff in the lawsuit, who keeps getting notifications from law enforcement every time an image of her child is reported or found.
Ironically, those pictures will keep circulating regardless of whether Apple implements such a system.

There is a key principle in play here that we need to be mindful of - the moral, ethical, and practical tradeoffs of a privacy-protecting state vs. a surveillance state.

You say companies provide devices and communications systems that 'allow' the CSAM pictures to circulate. So do the cable, fiber, and satellite systems that relay Internet traffic. If anyone is using DSL, so does the phone system.

It is true that automated systems scanning for objectionable content would catch more malefactors and likely reduce (not eliminate) some illegal behaviors. But why stop at photos? If you and I talk on the phone and one of us mentions 'drug deal,' perhaps AT&T should have an A.I. system record and flag that conversation so a staffer can listen and decide whether we're discussing an episode of Breaking Bad or arranging a narcotics transaction.

Not so long ago, I watched video of Senator Ted Cruz caustically grilling Mark Zuckerberg of Facebook. Cruz wanted to know whether, if someone got on Facebook and searched for child porn, the system would flag that for somebody to review (IIRC). That may sound needful. Thing is, let's say he ticked Zuckerberg off, and later, out of legitimate concern, Cruz decides to see for himself just how easy accessing CSAM is, and it gets flagged. A staff member tips off Zuckerberg, and before an upcoming election Zuckerberg has his people send the police a list of people trying to access 'child porn' on Facebook - including Cruz! It shouldn't be too hard to make sure a liberal media outlet that doesn't like Cruz gets wind of it. In this scenario, Cruz did no wrong and Zuckerberg broke no law, but the consequences could be serious. Disclaimer: that's a hypothetical scenario I made up to make a point.

I get that the perceived security benefits of a surveillance state appeal to some people. Public cameras with facial I.D. technology come to mind (which I'm told are used to some extent in China).

But how monitored are you willing to be?
 
Reactions: Night Spring
That's not true. The system was exceptionally bad at detecting other kinds of content, including unknown CSAM, pornography, and nudity.

If an authoritarian regime wants to detect content on phones, there are better and simpler-to-implement methods already available as open source.
The system was essentially tuned to detect CSAM material. Similar information-processing steps could be used to identify faces, flags, memes, sounds, etc. from pictures, videos, and sound recordings. All that is needed is to replace the database of hashes of CSAM material with hashes representing other stimuli. That's the thing about digital information processing - it's all 1s and 0s to the CPU, and it doesn't matter whether those sequences of bits represent CSAM or the flag of a political movement. Apple really screwed up by presenting their system as something that protects privacy, and then, to compound matters, they made the details of the surveillance system public.
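To make the 'swap the database' point concrete, here is a minimal, hypothetical sketch in Python. It is not Apple's actual NeuralHash/private-set-intersection pipeline; toy_perceptual_hash below is a stand-in for a real perceptual hash, and the hash values and filenames are made up. The point is only that the matching code is indifferent to what the hashes represent:

```python
import hashlib

def toy_perceptual_hash(image_bytes: bytes) -> str:
    """Hypothetical stand-in for a real perceptual hash; here just a
    SHA-256 digest so the sketch runs anywhere."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_library(images: dict, target_hashes: set) -> list:
    """Flag every image whose hash appears in the supplied database.
    Nothing in this function knows or cares what those hashes depict."""
    return [name for name, data in images.items()
            if toy_perceptual_hash(data) in target_hashes]

# Same scanning code, different database = different surveillance target.
csam_db = {"<hash value supplied by a child-safety organization>"}
flag_db = {toy_perceptual_hash(b"pixels of a banned political flag")}

library = {
    "vacation.jpg": b"pixels of a beach",
    "protest.png": b"pixels of a banned political flag",
}

print(scan_library(library, csam_db))  # -> []
print(scan_library(library, flag_db))  # -> ['protest.png']
```

Replace the contents of target_hashes and the exact same pipeline flags whatever a government (or company) wants flagged.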

It might be true that there are simpler ways of detecting content on mobile devices, but how many of those have the stamp of approval from the privacy-minded Apple? How long will it be before some dictator claims they are only doing what Apple proposed to do?
 
Reactions: drrich2
It was created because of direct threats from Republican senators in a senate hearing.
Please cite a source. But supposing that this is true, the sky hasn't fallen because Apple withdrew this proposal. Even Senators do not have unconstrained power, as Apple is fully aware.
 
The government needs to enforce the laws it has created. It's not Apple's or Samsung's job to stop people from sharing child porn. By your logic, we might as well start suing every cell phone maker on the planet and every computer maker. 🙄
(In bold) I disagree, but we are not going to come to a solution on this, so let's end it here.
 
(In bold) I disagree, but we are not going to come to a solution on this, so let's end it here.

You are not going to come to your solution with your logic. So yeah, this conversation is over.
 
You clearly have no idea what CSAM is.
I do know what CSAM is. I also know a lot of innocent things that could get caught up in such a system (do you have any idea how many naked baby pictures parents take?), and then you get a bunch of people trying to act like they are better than other people, claiming said innocent thing is just a cover/dog whistle/gateway drug/code because they want to win some debate on the internet, over-prosecute someone to force them into a plea deal (or look good on their resume), and so on. I couldn't easily count the examples.
Look at how willing the other side of the political spectrum is to take something out of context against your side (then try really hard to notice your side is every bit as bad).
There was a girl who was 16 or 17 and sent an indiscreet photo of herself to her boyfriend and was found out (the story was on Ars Technica years ago).
She was prosecuted - as an adult, because she should have known better than to take such photos of someone so young, someone who supposedly doesn't know better.
She was the perpetrator, the victim, an adult, and a child all in the same trial for the same crime, found guilty, and has to register as a sex offender for the rest of her life because of this. No. thank. you.
Even if the record gets cleared, it will still come up in a web search that she was involved in such crimes, putting a damper on her employment chances. Ask all the men falsely accused of sexual assault.

And CSAM scanning wouldn't stop the images from existing. Go after the people actually doing wrong. Do not go scanning people's phones wholesale. And did you think about the precedent that would set? If they can go fishing in your images, why not your texts, email, web browsing, and anything else?

Heck, for the people with the lawsuit, what action did Apple take to harm you? Apple didn't do something to you. They just didn't spend a ton of resources in a failed attempt to erase something from the internet that was never going to solve the problem anyway.
 
The whole thing reads as if Apple were the actual perpetrator.
The comments above paint a reasonable picture: by that logic, why don’t we also sue the city “for allowing” these crimes to happen inside its boundaries? Why don’t we sue the Android consortium, Samsung, and Google? Why don’t we also sue the screen manufacturers for allowing people to view said crimes?

It’s nonsense; they should be finding, imprisoning, and suing THE ACTUAL PERPETRATORS!
 
Reactions: I7guy and laptech
Your answer is 'the courts,' and it is a very bad answer for a number of reasons.

In the U.S., a common saying is that ignorance of the law is no excuse when you're caught violating it. Understandable, and if that were as far as it went, it's likely the way things must be (never mind that the laws are so extensive that virtually no one knows them all).

Thank you for spending the time to explain things. This is quite informative!
 
Reactions: drrich2