I don't see a difference here. Don't all photos generated or stored on the iPhone get uploaded to iCloud? Are users really not creating Apple IDs to avoid this?
No. You can turn off iCloud photo uploads. Mine don’t get uploaded since I don’t care to pay for additional storage just to have photos in the cloud.
But the key is that the hash was generated on your device, before the image was uploaded to iCloud. So the implementation of that hash generating scan was on-device.
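To make the on-device distinction concrete, here's a deliberately simplified sketch. Apple's actual system used a perceptual hash (NeuralHash) and private set intersection, not a plain cryptographic hash; SHA-256 and the `KNOWN_HASHES` set below are stand-ins, just to show that the comparison happens before anything leaves the device:

```python
import hashlib

# Stand-in for the on-device database of known-bad hashes.
# (The real system used perceptual NeuralHash values, not SHA-256.)
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def ok_to_upload(image_bytes: bytes) -> bool:
    """Hash the photo on-device and check it against the database.

    Only the digest is compared; the photo itself has not left the
    device at the point this check runs.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest not in KNOWN_HASHES
```

A real perceptual hash also matches slightly altered copies of an image, which exact SHA-256 matching cannot; that difference is exactly why false positives were a concern.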
 
Anyone defending this whole CSAM scan implementation is the same sort of person who would turn in whoever the Party of the 0range Charlatan labels undesirable. It is not a joke. It’s a matter of truth.
It’s similar to how Emperor Palpatine from Star Wars had both sides fighting each other when they never saw who the true enemy was until it was too late.
Luke Skywalker VS Darth Vader = Divide & Conquer so the Rich/Ruling Class people can eat a buffet in luxury while the others are fighting each other.

It's worse than that.

Hashes could be retrospectively identified. So, for example, what is policy now will change over time. You are already bagged and tagged, ready for any future policy changes.

All you have to do is reach the minimum standard of undesirable and they have the stick to hit you with.
 
These things don't work properly anyway. Here in Germany I keep reading about people getting into legal trouble all the time because some GIFs, or their own photos, were flagged incorrectly.

I always wonder how that even happens if WhatsApp is supposedly end-to-end encrypted and the content is unknown.
Honestly, if I were to store a photo that was legal but personal online, I would encrypt it using PGP or GPG first. Never trust what is happening on a server if you did not write the code yourself. And if you don't have full control of that server, don't trust it anyway.
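The encrypt-before-you-upload principle can be illustrated with nothing but the standard library. This is a one-time pad, not PGP; it is only here to show that the server ends up storing ciphertext it cannot read while the key never leaves your machine. In practice you'd use `gpg --symmetric` or similar, as the post suggests:

```python
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: random key as long as the message, used once."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key  # upload ciphertext; keep key local

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR with the same key recovers the original bytes."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))
```

The server, and any scanner running on it, sees only the ciphertext; without the key those bytes are indistinguishable from random noise.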
 
What's next? Knife manufacturers being sued for stabbings?
It happens. Gun manufacturers have been successfully sued (counting a settlement as a success) for shootings.


 
Maybe we should sue display manufacturers for having the ability to display illicit content. We could also sue towns, counties, states, and national governments for allowing people who engage in illegal activities to live within their borders. I mean, surely the government should be “scanning” your home to make sure you aren’t engaged in any activity that harms others, right?

If we are OK holding innocent people accountable for the actions of the perpetrators, it kind of seems like we could sue anyone and everyone…

Agreed. Maybe they should also sue any and all lawyers who have not already sued some big tech company or the government over this, because not suing is enabling it to happen or something like that right? /s
 
It happens. Gun manufacturers have been successfully sued (counting a settlement as a success) for shootings.


I don't believe they should be; they aren't doing anything illegal. Those cases rely heavily on anti-Second Amendment sentiment to force gun manufacturers to stop selling "assault weapons." By this logic, is the goal to stop Apple from selling devices with iCloud? If you want to ban high-capacity weapons, then why not enforce a national ban, and then the gun manufacturers wouldn't exist? And what would an equivalent ban for Apple look like? No iPhones in CA or NY?
 
NOT A GOOD IDEA WITH THE CURRENT INCOMING ADMINISTRATION.
Northern California is of the same *Party* that is returning to power next month in America 😳 they always vote that direction and that’s not a good idea at all. Apple will endanger everyone who is LGBTQ+ and P0C including migrants.
What does skin color and identity politics have to do with it? You guys love injecting these topics into everything.
 
Sued if you do, sued if you don’t.

Attorneys are the people winning either way.

This was exactly my thought. Before reading the article it was hard to tell if they were going to be sued for trying to release it, not releasing it, etc.

I don't know how any company can grow to be successful in the US anymore. As soon as a company starts getting money, the lawsuits just endlessly roll in. Have to retain a building full of lawyers just to maintain normal operations.
 
This is definitely a tricky one for Apple, considering they tout their privacy as matched by no other.
 
If they can scan for one thing, they can scan for something else. I don't like child porn. Those who create it should be punished in the most extreme way possible. Those who view it should suffer a fate almost as bad. The problem is, I don't trust what the incoming administration will do with this technology. What will they force Apple to do? Will saying nasty things about the Dear Leader be a thought crime? If not in the upcoming administration, perhaps the next.

Note:
This is the political forum and my post is directly relevant to the question being discussed.

This is the lesson that applies to everything that everyone conveniently forgets when their party is in power. It doesn't matter which party is in power. The one thing both sides can agree on is government power. They may differ slightly in their approach to getting it, but they will absolutely take advantage of what the last administration did, and it never ratchets down, only up.
 
If you say you are going to do something that can and will help victims of abuse, promising victims peace of mind that abusers will be caught and punished, and then renege on what you said you were going to do, you can be sued for it.

If the lead plaintiff can prove that images of her as an infant are still being found on Apple devices after 2021, when CSAM detection was supposed to have been introduced, images that would have been detected and the people distributing them caught and arrested, then Apple does have a case to answer.

Apple will now be forced to argue in court that its customers' right to privacy outweighs the right of abuse victims not to have their images passed around on Apple devices.
 
One can sue for anything, but I don't think they have much legal ground. It isn't Apple's legal responsibility to do this any more than it is Google's, Microsoft's, or Facebook's. It is like suing the state for building roads that allow people to be trafficked: if they built the roads, they should prevent people from using them in ways that hurt others.
 
Apple had good intentions with CSAM detection, but it was abandoned, and for good reason. iThink Apple was drunk when it had the idea to introduce this thing.
I'm not even sure Apple had this idea in the first place. I suspect they were getting pressured by the government to implement this, as it ran completely counter to their privacy messaging at the time. It's not like Apple doesn't "work with governments" in order to sell in those markets. They jump through burning hoops to sell in China.

It was a slippery-slope idea for sure. Today, it's going after really bad criminals we all agree are bad. Tomorrow, maybe it goes after wrongthink. I guess we will get to see what comes to light in this lawsuit. It looks like AI is going to breed a whole new crop of lawsuits, where the claim will be that the AI should have known something was wrong and done something about it.
 
The lead plaintiff in the lawsuit, filing under a pseudonym, said she continues to receive law enforcement notices about individuals being charged with possessing abuse images of her from when she was an infant.
That sucks, but is it not possible to ask them to stop notifying her? I also fail to see how she thinks the notifications would decrease under a system that would catch way more people with her images.
 
The underlying issue is that individuals, myself included, tend to view their relationship with Apple as a straightforward, one-to-one connection. If I adhere to the terms of use and avoid causing trouble, there should be no reason for Apple to investigate or report me. However, the implementation of scanning tools fundamentally alters this dynamic. These tools aggregate (in a legal, technical, actual or virtual sense) and proactively police the digital content of billions of users, forming a kind of surveillance programme, shifting the relationship from one of security and simplicity to something far more complex and unsettling.

What was once a direct and relatively transparent agreement (based on offer, consideration and acceptance) becomes a situation where my content is effectively hostage to the shifting tides of US and international politics, regulatory diktats, and the tidal influence of pressure groups and media conglomerates. While billions of us have nothing to hide, we also have nothing to share unless we choose to do so knowingly, in accordance with the specific terms of our relationship with Apple. Any shift towards scanning undermines the foundational trust and autonomy of that relationship. My heart breaks that people have been victims of abuse, but this doesn't justify, in any moral or ethical framework I think reasonable, the mass surveillance of billions of users. In the understandable race for justice, it is not justice to discriminate against my right to privacy.
 
I don't believe they should be; they aren't doing anything illegal. Those cases rely heavily on anti-Second Amendment sentiment to force gun manufacturers to stop selling "assault weapons." By this logic, is the goal to stop Apple from selling devices with iCloud? If you want to ban high-capacity weapons, then why not enforce a national ban, and then the gun manufacturers wouldn't exist? And what would an equivalent ban for Apple look like? No iPhones in CA or NY?
I was simply providing links to show that gun manufacturers have been sued for how their guns have been used. Whether any of the lawsuits should have been made is not a discussion I have time for today. Maybe someone else will jump in with a reply.
 
In other news, tire companies are being sued because tires are used in getaway cars for violent crimes... :rolleyes:

Apple's CSAM system was the kind of cockamamie idea that engineers and managers dream up without checking with experts in other fields. I understand that illegal child pornography is a social issue that deeply affects many people, but the CSAM scheme was ripe for abuse. Indeed, in its technical documents made available online, Apple openly published a roadmap for authoritarian regimes to detect any kind of content in pictures on mobile devices, from flags to posters to faces. That was Apple's second mistake, after conceiving of this ludicrous plan in the first place. The proposed CSAM system was invasive, potentially vulnerable to countermeasures, and prone to false positives. I am quite happy for Apple to scan images on their servers (their property). I also don't mind Apple scanning incoming images on the phones of minors, with parental consent, to alert them to inappropriate content. However, I object to Apple installing any kind of blanket surveillance software on my phone.

If there is a suspected crime, get a warrant to investigate. It's that simple. This lawsuit should be thrown out.
 
Maybe we should sue display manufacturers for having the ability to display illicit content. We could also sue towns, counties, states, and national governments for allowing people who engage in illegal activities to live within their borders. I mean, surely the government should be “scanning” your home to make sure you aren’t engaged in any activity that harms others, right?

If we are OK holding innocent people accountable for the actions of the perpetrators, it kind of seems like we could sue anyone and everyone…
*lawyers furiously taking notes*
 
The underlying issue is that individuals, myself included, tend to view their relationship with Apple as a straightforward, one-to-one connection. If I adhere to the terms of use and avoid causing trouble, there should be no reason for Apple to investigate or report me. However, the implementation of scanning tools fundamentally alters this dynamic. These tools aggregate (in a legal, technical, actual or virtual sense) and proactively police the digital content of billions of users, forming a kind of surveillance programme, shifting the relationship from one of security and simplicity to something far more complex and unsettling.

What was once a direct and relatively transparent agreement (based on offer, consideration and acceptance) becomes a situation where my content is effectively hostage to the shifting tides of US and international politics, regulatory diktats, and the tidal influence of pressure groups and media conglomerates. While billions of us have nothing to hide, we also have nothing to share unless we choose to do so knowingly, in accordance with the specific terms of our relationship with Apple. Any shift towards scanning undermines the foundational trust and autonomy of that relationship. My heart breaks that people have been victims of abuse, but this doesn't justify, in any moral or ethical framework I think reasonable, the mass surveillance of billions of users. In the understandable race for justice, it is not justice to discriminate against my right to privacy.
People in glass houses have a hard time with going to the bathroom.
 