Yeah. Just do it and don't think twice.
Yeah, I have some thinking to do.
Interesting quote -- "Preserving the privacy of child predators is absolutely inexcusable." Unfortunately, our current Administration would disagree.
Yeah, that is the worst aspect of this Administration. They promised us transparency and accountability on this, and instead they've spent the last year defending predators and acting like it's a hoax. Pfffft, what a load of BS. Seriously.
I don't think they're in bed with them because they like them. I think they've just done the cost-benefit analysis of "kiss Trump's ass, stock price remains stable." But I don't think that means they'd just abandon all their privacy beliefs; this is the company that told the FBI to f*** off, and won.
Switching from Apple to Android might be doable, but the last time I had an Android phone I hated it, so I'm definitely not going to be getting a Samsung. I have a lot of concerns about Android too, so I have to weigh it accordingly.
 
Funny, they have no problem eliminating our access to information and surveilling and censoring everyday users, but when actual criminals who are openly boasting about child abuse are around, somehow no one seems to get charged. I hope Apple squashes this case with all due haste. I mean, what if the next Epstein is using iCloud Drive? Seems like even if the DAs had access to the material, if the perpetrators are wealthy and powerful enough, they'll end up with no punishment, not even an investigation.
 
Switching from Apple to Android might be doable, but the last time I had an Android phone I hated it, so I'm definitely not going to be getting a Samsung. I have a lot of concerns about Android too, so I have to weigh it accordingly.
Last time I tried it, I felt like the apps just didn't work together in both function and design. Going back to Apple, you realize how nice it is to have apps that use a single SDK and work so well together. At the same time, I do keep an Android phone handy because Apple blocked network-level access to apps so a lot of tools I use are no longer on iOS and I have to use apps on Android (network scanning, as an example).
 
Funny, they have no problem eliminating our access to information and surveilling and censoring everyday users, but when actual criminals who are openly boasting about child abuse are around, somehow no one seems to get charged. I hope Apple squashes this case with all due haste. I mean, what if the next Epstein is using iCloud Drive? Seems like even if the DAs had access to the material, if the perpetrators are wealthy and powerful enough, they'll end up with no punishment, not even an investigation.
Especially when actual child sex abusers are CURRENTLY in the White House. Like Howard Lutnick, for example. He laughs whenever someone mentions the abuses of Epstein. Truly demonic.
 
the fact that Apple is so in bed with the government now
There is an appearance that the CEO is trying to appease the President. CEOs always have to be half politician. But that is far removed from both the rest of Apple and the rest of the government. The US still has an antitrust case against Apple for monopolization. And they just threatened Apple because Apple News is not promoting enough conservative news sources.
 
West Virginian moral outrage vs politically left, trillion dollar company!

Stay tuned!

Other related news: Tim gets Epstein files ...pass.
 
I'm on Apple's side on this one. Fully against CSAM, but not at the expense of individual privacy. In this age of E2E and data-at-rest encryption, fighting digital crime is really hard, I get it. I think LE already has some epically capable tools for policing (Stingray, Pegasus, etc.), but we can't sacrifice trust in our privacy for them. That would destroy everything. Parents have to parent, LE has to enforce, and Bad Guys have to bad... same as it ever was. Can't upend the barrel for a few bad apples (so to speak...). Unfortunately.
 
McCuskey is apparently a member of the Federalist Society, which has been pushing things like requiring ID to be online and other limits on internet activity meant to curtail anything they don't like. They want to "put parents first," though that is code for making sure kids aren't exposed to anything outside of a constricted worldview that's pushing the US to be a Christian nation ruled by white people, instead of the nation of multiple cultures and religions that's enshrined in the Constitution.

When something like this comes out of left field, it's a good idea to look at the background of the people pushing it. It makes no sense for WV to sue Apple over this unless there's another agent pushing their puppet to make the move.
I disagree with this guy's approach, but as a parent, "put parents first" can actually mean put parents first, not some bizarre conspiracy about ideology that you cooked up. It shouldn't be so hard to keep kids off social media or porn. Many countries are limiting teens' and children's access to the internet because of the overwhelming data showing it's bad for them emotionally and mentally. We have the data now... we made a mistake as a society. We can correct it.
 
West Virginia's Attorney General JB McCuskey today announced a lawsuit against Apple, accusing the company of knowingly allowing iCloud to be used to distribute and store child sexual abuse material (CSAM). McCuskey says that Apple has opted to "do nothing about it" for years.
When is McCuskey going to sue X / xAI / SpaceX?


Elon Musk’s AI chatbot generates child sexual images

Elon Musk’s artificial intelligence chatbot has generated sexualised images of children that have been shared on social media platform X, raising concerns about the safety of a model used by millions.

Over the past few days, users have been able to get Grok, the AI chatbot developed by Musk’s xAI, to create sexual images of children, which goes against the company’s user guidelines.
 
If the cloud is the modern form of a private hard drive, it should be protected the same way. Child pornography is abhorrent, but imagine Seagate or Samsung checking what data is on our private hard drives. That's just not their business. Child pornography is just an excuse to invade our privacy in general.

And as a general rule, you should never let your cloud provider do the only encryption. Encrypt your files on your own and then upload them to the cloud! Apple says it does not have a back door, but it could easily create one if it wanted.
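For illustration, here's a minimal stdlib-only sketch of that idea: encrypt locally, upload only ciphertext. The XOR one-time pad below is a toy stand-in for a real cipher like AES-GCM (in practice you'd use something like the third-party `cryptography` package); the point is just that the provider only ever sees bytes it cannot read.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the key (toy one-time pad, not for reuse)."""
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"my private document"

# The key is generated and kept locally -- it is never sent to the provider.
key = secrets.token_bytes(len(plaintext))

ciphertext = xor_bytes(plaintext, key)   # this is all the cloud ever sees

# Only someone holding the local key can recover the original file.
recovered = xor_bytes(ciphertext, key)
assert recovered == plaintext
```

Whatever scanning the provider does server-side then runs against opaque ciphertext, which is exactly the poster's point about not relying on the provider's encryption alone.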
 
Scanning for this stuff is just comparing image hashes against a database of known bad image hashes. There’s no privacy being violated. Nobody is looking at anyone’s cat pictures here.

The reason, I suspect, that Apple isn't reporting more images is because predators aren't trying to store their images in iCloud to begin with. The WV thing is political theater.
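Mechanically, that kind of scan is just set membership on digests. A minimal sketch (the "known bad" bytes are made-up placeholders; real systems like PhotoDNA or Apple's NeuralHash use perceptual hashes so resized or re-encoded copies still match, while plain SHA-256 here only shows the shape of the check):

```python
import hashlib

# Hypothetical database of hashes of already-identified images.
# (Placeholder bytes -- a real list would come from NCMEC or similar.)
known_bad_hashes = {
    hashlib.sha256(b"previously-identified-image-bytes").hexdigest(),
}

def is_flagged(photo_bytes: bytes) -> bool:
    """Flag a photo only if its digest is already on the list.

    The scanner never "looks at" the photo; it only compares digests,
    which is why nobody's cat pictures are being inspected.
    """
    return hashlib.sha256(photo_bytes).hexdigest() in known_bad_hashes

assert is_flagged(b"previously-identified-image-bytes")
assert not is_flagged(b"ordinary cat picture bytes")
```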
 
The CSAM protection is hardly protection at all, anyway. In reality, it's doing a checksum of known images of child porn that have already been discovered. By the time the photo is made and initially distributed, the damage has been done. The predator is in control of a kid and forcing him/her to take these photos or shoot these videos!

If brand new photos or videos are made, CSAM will happily let them slip on past.
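That limitation is easy to demonstrate with an exact (cryptographic) hash: change a single bit and the digest no longer matches, so anything not already catalogued sails through. This is also why real scanners prefer perceptual hashes over plain checksums, though those too can only match material that has already been discovered and listed.

```python
import hashlib

original = b"some image file bytes"
tweaked = bytes([original[0] ^ 1]) + original[1:]   # flip one bit

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(tweaked).hexdigest()

# A one-bit change produces a completely different digest, so a
# checksum list of known images can never catch brand-new material.
assert h1 != h2
```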

And maybe this one's debatable, but I've always felt it's a FAR less serious situation that some random person was found in possession of some of this material in a digital format on some cloud storage or mailbox of theirs, vs finding the people actively MAKING the material. (I personally know teenagers who got in trouble for possessing some of this stuff, when what really happened is they were playing online games with someone who convinced them to download an archive file. Maybe they lied and told them it had game cheats or cracks or who knows what? But as soon as it landed on their cloud storage, it was auto-detected and they got a visit from authorities.)
 
The CSAM protection is hardly protection at all, anyway. In reality, it's doing a checksum of known images of child porn that have already been discovered. By the time the photo is made and initially distributed, the damage has been done. The predator is in control of a kid and forcing him/her to take these photos or shoot these videos!

If brand new photos or videos are made, CSAM will happily let them slip on past.
Right, and having a classifier try to identify this stuff would cause so many false positives that it’s not worth developing. I mean, they can’t even get Siri to reliably “call mom”.
 
Tell me the West Virginia state budget is in the red without telling me the West Virginia state budget is in the red. The real question is if the politicians involved are angling for donations or just publicity.
 
Maybe the AG hopes this will go to court in 2028, when he wants to be re-elected as AG. On the other hand, a Republican being worried about CSAM is a bit weird when you read everything about the Epstein files. But this might be completely unrelated.

I'm just annoyed that with these monitoring systems, everyone is essentially being accused of sharing those images, and only if the scanner gives its green light are you declared innocent, until the next picture.
 