Based on some podcasts I've listened to, they're not even that dumb. There was a server in Europe that was incredibly well guarded. Law enforcement officials finally figured out who was running it, and they had to take it over and arrest an accomplice at exactly the same time. The site's operator had set up a sort of dead man's switch: if the accomplice didn't post at a specific regular interval, everyone would know the site had been compromised by the authorities. (The reasoning was that the cops might take over the site, but wouldn't also be able to access the account of the accomplice, who would then be unable to post.) The authorities got the all-clear posted, but only just in time.

Then the authorities spent months facilitating the distribution of CSAM to keep the sting going. So essentially Australian law enforcement was distributing child sexual abuse material for months and months after taking over the site, in the hopes of snagging the largest new-content providers.

It's a bit of a slippery slope, allowing low-level offenders to traffic this material in the hopes of catching a bigger fish. I don't know if they ever managed to catch one of those whales, and I don't think they prosecuted anyone smaller. Once it came out that the site was being run by the authorities, people just slipped away to other sites.

That is nothing. The New York Times had an article on how the FBI hacked a CSAM distribution site, has been monitoring it for over six years, and still hasn't taken any action. They just kept it going as a surveillance tool, I guess.
 
Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
To me this sounds like they're waiting for people to buy the new iPhone from September through December, then releasing it in January after everyone has already bought one. Shady af.
 
I get it that CSAM within the Apple ecosystem is probably a problem, possibly even a problem aggravated by Apple's heretofore good privacy practices. That said, on-device surveillance, even surveillance with as many privacy protections as Apple tried to build into their recently-announced-but-now-delayed system, is a non-starter for me... if they eventually go forward with this I'm out, full stop. I'll switch to LineageOS or a PinePhone, and fall back to FreeBSD. I have firsthand experience with US intelligence hacking a computer I was using because of my political views in the early 2000's; it was crude back then, but it happened.

I think Apple needs to go back and revisit what problem they're trying to solve here. Since they have said the scanning/hashing occurs prior to upload to iCloud, it seems reasonable to assume their goal is to disrupt the sharing of CSAM. It seems to me that would most likely be happening via Shared Albums in Photos; I would be fine with server-side scanning of Shared Albums, since that is what every other cloud vendor is already doing. I really do think that in this deeply polarized modern environment (red vs. blue, vaxxed vs. unvaxxed, etc.) the slippery slope argument really does apply - at some point on-device scanning for CSAM will turn into on-device scanning for evidence of wrongthink.
 
Once more this affair shows sideloading is necessary; I want apps created independently of Apple's restrictions. Backup is now an Apple-only option, and with that monopolized... freedom of choice is necessary in every part of life, including your iDevice.
Imagine Apple not being able to remove an app like hk.live from the App Store.
Because Apple monopolizes every aspect of its iDevices, its vulnerability is just that: it can be pressured to change policy by the public or other entities. If there were a different backup option besides Apple, I could simply switch to another service.
 
Good, now
Bash:
mv ~/Projects/Apple-csam/ /dev/null
the whole thing.
Erm...better

Bash:
rm -R ~/Projects/Apple-csam/

Or
Bash:
rm -rf ~/Projects/Apple-csam/
Shouldn't that be the above?

Capital R, for removing a directory/Folder?


I've said this right from the beginning...
This whole crap project should be abandoned ASAP; hell, whoever thought this was/is a good idea* should be getting the pink slip.


*I forgive you Craig Federighi ;)
 
Can someone explain to me how this "sets a precedent for future scans"? I've been following this for weeks but still don't really understand this point.

The same applies to all the other examples mentioned - persecution of political activists is one that comes up a lot. If I were such an activist (or insert any other persecuted demographic) and I had photos of me at some demonstration or rally (or insert any other compromising activity), how could a match be flagged without the actual photos first being in the hashed database?
Since political activists may collect and share known images and memes (think of all the Tiananmen Square photos that still make the rounds), there are certainly valid concerns that the same algorithms could be used to scan for these instead of CSAM, but it's a stretch. These people would also have to be storing those photos in iCloud Photo Library.
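
The matching precondition is easy to see in a toy model. Here's a minimal sketch - not Apple's actual protocol: it uses a plain SHA-256 file hash as a stand-in for NeuralHash, an ordinary set instead of the blinded on-device database, and made-up names - showing that a photo can only be flagged if that image was already catalogued into the database beforehand.

Python:
# Toy illustration only; a simplified stand-in for the real system.
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Placeholder for a perceptual hash; here just SHA-256 of the raw bytes.
    return hashlib.sha256(image_bytes).hexdigest()

# Database built only from known, previously catalogued images.
known_image_db = {image_hash(img) for img in (b"known-image-1", b"known-image-2")}

def is_flagged(photo_bytes: bytes) -> bool:
    # A photo is flagged only if its hash is already in the database.
    return image_hash(photo_bytes) in known_image_db

print(is_flagged(b"known-image-1"))          # True: this image was catalogued
print(is_flagged(b"my-unique-rally-photo"))  # False: never catalogued, cannot match

Your own unique rally photo simply has nothing to collide with unless someone put that exact image (or a near-duplicate, in the perceptual-hash case) into the database first.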

Most of the actual privacy advocates and activists are focused on the examples of what could happen, and for some of them and the work they do, that's totally fair, but it's not something that's going to directly affect most people.

Almost everybody else who is concerned about this either doesn't fully understand what's going on (e.g., "Apple is going to be scanning everything on my iPhone!"), is playing fast and loose with extreme "what if" scenarios (e.g., "What if Apple suddenly changed the algorithm and all the rules to do something completely different?"), or they're simply philosophically opposed to on-device scanning in any form.

Is it the potential to compromise the hash comparison algorithm such that less and less exact matches can be garnered? That seems to me to be the only way to "trick" the system into reporting my unique photos as matching the CSAM (or other) database; however, I don't see this articulated very clearly in any of the arguments presented. I also don't see it as being particularly efficient, since in order to ensure retrieval of any targeted photo the bar would have to be set so low that nearly all photos would most likely be reported and sent for investigation.
Due to the way the hashing algorithms work, it wouldn't be possible to get "less exact" matches. Loosening the algorithm would just result in more false positives, which would end up being completely unrelated photos.
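
To make that concrete, here's a hedged little experiment with made-up 256-bit fingerprints and a Hamming-distance threshold (not NeuralHash itself; all the numbers are arbitrary). Widening the match threshold doesn't produce "slightly less exact" matches of a targeted photo; it just starts flagging random, unrelated images by chance.

Python:
# Toy experiment: random 256-bit "fingerprints" stand in for perceptual hashes.
# Loosening the allowed Hamming distance buys false positives, not precision.
import random

random.seed(0)
BITS = 256

def fingerprint() -> int:
    return random.getrandbits(BITS)

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

database = [fingerprint() for _ in range(500)]    # "known image" fingerprints
my_photos = [fingerprint() for _ in range(500)]   # unrelated personal photos

for threshold in (0, 64, 96, 104, 112):
    hits = sum(1 for p in my_photos
               if any(hamming(p, d) <= threshold for d in database))
    print(f"threshold {threshold:3d}: {hits}/{len(my_photos)} unrelated photos flagged")

Roughly: tight thresholds flag nothing, and loose ones start flagging unrelated photos purely by chance, which is exactly the "completely unrelated false positives" point above.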

Some folks are also conflating this with the Communication Safety in Messages feature, which is fair as Apple announced both features at the same time, mixing up the messaging, but it's a completely separate feature. Communication Safety does use machine learning to identify sexually explicit photos and blur them out for users under 18 years of age, but it doesn't report anything at all back to Apple. At most, parents can get notified if their kids below the age of 13 actually view or send explicit photos. The whole thing is opt-in, however, and can only be used by those in an iCloud Family Sharing group.
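
If it helps, this is how I read that decision flow, as a hedged sketch covering only the incoming-photo case described above. The function and parameter names are mine for illustration, not Apple's.

Python:
# Rough sketch of the Communication Safety flow as described above.
# All names are made up; this is not Apple's implementation.
def handle_explicit_incoming_photo(feature_enabled: bool,
                                   in_family_sharing_group: bool,
                                   user_age: int,
                                   parent_opted_in_to_alerts: bool,
                                   child_viewed_anyway: bool) -> dict:
    outcome = {"blur_photo": False, "notify_parent": False, "report_to_apple": False}

    # The whole feature is opt-in and only available within Family Sharing.
    if not (feature_enabled and in_family_sharing_group):
        return outcome

    if user_age < 18:
        outcome["blur_photo"] = True          # on-device ML decision, stays on device

    if user_age < 13 and parent_opted_in_to_alerts and child_viewed_anyway:
        outcome["notify_parent"] = True       # the only notification anyone gets

    # report_to_apple is never set: nothing goes back to Apple in this feature.
    return outcome

print(handle_explicit_incoming_photo(True, True, 12, True, True))
# {'blur_photo': True, 'notify_parent': True, 'report_to_apple': False}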

That said, some do fear that once Apple has opened the door to this kind of on-device scanning and reporting, it could very well choose to build algorithms that would scan for a lot more, but again that's getting into extreme "What if?" scenarios. If you're willing to go down that road, you should have stopped using an iPhone years ago, as Apple "could" do just about anything it wants to behind your back.
 
The very few dislikes on your good statements always come from the same people. That was fantastic: now I can easily update my ignore list. I thank MacRumors for this option.
Hahaha, glad you caught that. You're smart! Yup, same dislikes from the same people. I'm used to it now.

Can’t please everyone, right?
 
I'm really glad this is being delayed. I listened to enough tech podcasts about this (ATP, Rocket, other Relay FM podcasts) to know this is a bad idea and a dumb backdoor. Of course everyone is against the kind of material this flags, but this could so easily be abused. Think of queer kids in the South who get caught by parents who then abuse or even kill them. What about consenting adult couples who share nudes? What if one of them was "young looking" (a late-teens couple, two 19-year-olds)? What about the government just adding terrorist hashes to the database so they can backdoor their way into anyone who posts stuff critical of the government?

This is the same garbage as shutting down Backpage a few years ago because of "trafficking." What it really did was hurt adult sex workers who had used it as a safe way of screening clients. Puritanical values aren't just outdated and bad; they actively hurt the things they claim to protect. You want to actually cut down on child trafficking? Legalize sex work, tax it, and regulate it. That would do a hell of a lot more to curb this **** than letting them have a bad backdoor into everyone's phones.

If you want to actually catch more child abusers, do surveillance on evangelical churches. That's where most of the abusers are both trained and hang out.
 
I was okay with the under-13 feature: let a parent know if your kid is receiving bad images. It was opt-in and between a parent and child. Actually, I'd argue the feature should be available to parents for kids up to 16 or 18. It is annoying that, the way the rules are set up, when a kid turns 13 it's like parents are told your kid is a mini adult and you have no control over their actions (aside from taking away the kid's computer/phone). Disclosure: I'm not a parent.

The CSAM stuff I was wholly against. There was no reason to check my stuff for kiddie porn when Apple pushes privacy so hard. Disclosure: I'm against child porn.
 
Apple should be embarrassed they even started down the road of invading the privacy of people who they promised to protect. Yes, we have given up a lot of privacy, involuntarily/voluntarily for the siren song of the Internet. But Apple was one of the last places I thought would go "Brave New World" on us!

I say it is time to re-establish our right to privacy and make it known to Apple and other corporations that this is not acceptable. I hope the big backlash to this sort of "well-meaning" privacy invasion continues. Bad actors are bad enough, but when Apple puts on the Black Hat I cannot contain my disappointment.
 
I was okay with the under-13 feature: let a parent know if your kid is receiving bad images. It was opt-in and between a parent and child. Actually, I'd argue the feature should be available to parents for kids up to 16 or 18. It is annoying that, the way the rules are set up, when a kid turns 13 it's like parents are told your kid is a mini adult and you have no control over their actions (aside from taking away the kid's computer/phone). Disclosure: I'm not a parent.

The CSAM stuff I was wholly against. There was no reason to check my stuff for kiddie porn when Apple pushes privacy so hard. Disclosure: I'm against child porn.
That's a big no, because, as I said after listening to a podcast with a tech expert, this is a great way for bad parents to target their queer children. But like everyone else, I'm of course against child abuse material too, and against grooming.
 
MacRumors, I can't wait for the day you run the headline "Apple decides to cancel CSAM detection, period."

Don’t lose your vision, Apple.

When it comes to privacy, I want to have world-class protection, Apple.

View attachment 1826746
Seeing this banner makes me realize what hypocrites Apple have become in planning to roll out this CSAM scanning system. It is a poor technological world we live in.
 
The thing is, the bulk of child porn is consumed in the US and produced in Japan and the Philippines. China culturally does not sexualize children. Neither did Japan, actually, until anime and rorikon (the lolita obsession) kicked in; that is a relatively recent social phenomenon. US soldiers stationed in the Philippines are the biggest consumers not of child porn but of child prostitution.

I researched this extensively when I vacationed in the Philippines and Thailand. I was appalled by the treatment of women and the condition of children there. Thailand was much better in comparison, but it has the issue of the "transgender freak show." Boys from poor families are sold off to be neutered and end up in circuses as ladyboys. Those ladyboys don't just show you their bits and entertain you; they do things that are truly damaging to their bodies, and as a result they live very short lives. For example, there are things they inject to keep themselves looking a certain way, and when they do their freak show, think about stainless steel ping-pong balls coming out of places that are not supposed to hold eight ping-pong balls. I will leave you to imagine the rest, and the kind of freak show they do.

Edit:

To clarify, "lolita" generally refers to girls around 14 years old. Rorikon refers to much younger, preteen girls.

WHAT?

Where are you getting this from? Disgusting people are simply that. Culture, country, faith, etc. do NOT come into this. I dislike seeing posts like this without any factual information linked ... it causes pain and the separation of cultures and people. I don't care about pedophiles ... I'm talking about normal people on these forums and in our communities. I'm NOT about hate in ANY form. Not saying your post does this, but it comes close to pushing thoughts or ideals down that path.

Example ... here in Canada: https://thewalrus.ca/how-police-cracked-canadas-largest-child-pornography-ring/
This is just an example ... it's EVERYWHERE. Hence the term human trafficking.

I think your statement applies to where it hits the mainstream and trends ... but the source, gross as it is for me to even think about, is a global issue.

That said, I'm sure you witnessed firsthand IN those countries a lot of horrible things, horrible to human beings. Yuck. And I'm super saddened for children ANYWHERE going through this.

But the bolded part quoted above ... that's not just happening there. How it happens may differ in various parts of the world, but the "bulk" claim ... without proof I cannot believe it. Maybe that's because I'm a father, but even if I weren't, this whole thing makes me so sick I want to vomit and cry for hours in a corner. I just cannot believe it's concentrated in one place more than any other on this planet.

The scanning for known records I'm all for ... but without checks and proper iron-clad assurances that completely unrelated data on upstanding citizens of good morals won't be collected, that's the issue most users are against.
 