The current system has no on-device scanning whose results are transmitted to the cloud. All metadata scans on current iPhones are local only, and the server-side scanning stays server side. This new system breaks that wall: it does local, on-device scanning capable of uploading the matches to the cloud.

And all you are citing is Apple's own policy. We've already seen how Apple can be forced to change its policy by local laws. Apple's pinky promise holds no ground when they themselves can be forced to allow third-party payments for in-app purchases, something Apple was so adamant about defending.

The local scanning already done by the Photos app is synced to iCloud Photo Library today.

Apple can be forced to change their software based on local laws in any country they're present in today, completely independent of this new system.

Russia even implemented a law which forced Apple not only to change its policy but to change the code in iOS.
 
None of that reports to the government, which is what the CSAM stuff does, and hence the difference. One is for your personal benefit; the other is to turn you in.

So you would be OK with the CSAM Detection system if it didn't report to the government and it was to your benefit?
 
I find myself in rare agreement with @Jayson A . My reasoning goes something like this:
  • Minor children are generally regarded by the law as being the responsibility of their parents or guardians.
  • Likewise: Many Constitutional protections do not apply to minor children.
  • Those opposed to the "nanny state" and big tech substituting for good parenting can hardly make that complaint while, at the same time, urging parents be deprived of tools to help them do just that. See also my first point.
  • Lastly: "My house, my rules." If parents of a minor child are paying for the mobile device(s) and service(s), those parents ought to have rights to determine when and how they're used.
Mind you: I was raised in a day and age where parents' word, as regards minor children, was law. Period. End of story.
I can see your point, just wonder if you're also thinking of the redneck down the road that will beat and kill his children because Apple sent him a notification that they were sexting… That will happen, and the gay or transgender outing thing will happen too, and probably result in suicides. People tend to judge such tools based on their own situations and mental stability, but we live in a messed-up world… I am not finding Apple's solution very appealing, just questioning if it's really Apple's job to bake this into iOS… There's an app for that, if you think it's important to spy on your kids.
 
Linux will never have embedded, hidden code. But storing your data on the public cloud, or believing your Android phone won't have similar scans, is not a reasonable assumption, imo.
GrapheneOS on a Google phone is far different from stock Android and even more so than the crap that Samsung and others overlay.

As far as data, local backups are the obvious solution, but there is an array of personal cloud solutions (Nextcloud, OwnCloud, and others).

The key is to not make assumptions. Research and find the best solution for your needs.
 
Again, correct me if I'm wrong (but only with facts if possible), but aren't the images they are scanning for known images from CSAM databases? So it's not scanning just any photos; it's matching photos against known photos. A little bit of a distinction there, I think.

It depends on your definition of scanning. I use scanning to mean read, calculate, and compare.

The system reads (part of) the image file, sends it through NeuralHash (calculate), and compares the resulting hash to a hash table on the device.

You can call it scanning, reading and comparing, or something else. Doesn't really change what's happening.
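
To make that concrete, here's a minimal sketch of the read → calculate → compare loop in Swift. Everything in it is a stand-in I've made up for illustration: SHA-256 takes the place of NeuralHash (the real hash is perceptual, so near-duplicate images map to the same value, which SHA-256 deliberately does not do), and the real on-device table is blinded so the device can't even learn whether a comparison matched.

```swift
import Foundation
import CryptoKit

// Stand-in for NeuralHash, purely so this sketch runs: SHA-256 over the
// raw bytes. The real NeuralHash is perceptual, so resized or recompressed
// copies of an image hash to the same value; SHA-256 does not do that.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical on-device table of known hashes. In the actual design the
// table ships blinded, so the device cannot read match results locally.
let knownHashes: Set<String> = []   // placeholder; empty here

// The "scan": read the file, calculate its hash, compare against the table.
func scan(fileURL: URL) throws -> Bool {
    let data = try Data(contentsOf: fileURL)    // read
    let digest = imageHash(data)                // calculate
    return knownHashes.contains(digest)         // compare
}
```

Whatever word you prefer for it, that's the whole pipeline: bytes in, digest out, set lookup.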
 
I think ian87w was correct to call you out on your previous statements.

While I have never been to the Philippines, I have been a student of Japanese language and culture for my entire life and have visited Japan, and your statements regarding Japan, to me, read as if they were misinformed at a minimum and perhaps even anti-Japanese or sinophobic, something which seems lamentably prevalent in the USA.

To be clear, the production and distribution of child pornography has been outlawed in Japan since 1999, and additional legislation was introduced in 2014 to ensure that there was no secondary market of "grandfathered in" old content allowed.

By volume, the United States has been the leading source of child pornography distribution, year after year (citation: http://www.amlc.gov.ph/images/PDFs/2020 DEC CHILD PORNOGRAPHY IN THE PHILIPPINES POST-2019 STUDY USING STR DATA.pdf)

Your mention of, or perhaps fixation on, Lolicon (aka fictional depictions; yes, still legal there, as observed in the breakdown here: https://en.wikipedia.org/wiki/Legality_of_child_pornography) seems particularly sensationalistic. The prevalence of Lolicon as a paraphilia within Japan can be directly linked to the US Occupation of Japan after WWII and its policies of censorship, which also gave rise to so-called "tentacle porn"; neither is widespread, widely distributed, nor widely consumed. Similarly, while some point to Hokusai's 蛸と海女「tako to ama」(The Dream of the Fisherman's Wife) woodblock print from 1814 as an early example of so-called tentacle porn, it was an outlier in a culture which produced a significant amount of 春画「shunga」(erotic woodblock prints), and fixation on such examples, again, from my vantage, tends to be tinged with sensationalism and, more often than not, derisive anti-Japanese sentiments.

As an undergraduate student, some of this was a subject of study in my Japanese history courses. In particular, MacArthur's Occupation after WWII resulted in any mention of the censorship of Japanese media itself being censored (more about that subject can be found in John W. Dower's book, Embracing Defeat). Subsequently, producers of erotic content began to treat the medium as a "black box," more or less throwing things at the wall until they saw what stuck, with a lot of material being confiscated by authorities in the process for having somehow violated the nebulously worded obscenity legislation in Article 175. This was true even after the Occupation ended, with occasional contestations under Article 21 of the Japanese Constitution, which is intended to allow for freedom of expression.

You'll note that centuries-old 春画「shunga」prints were explicit in their depictions of sexual activities, whereas after WWII it became commonplace for mosaic obfuscation around genitalia to be utilized in pornographic content; this is still common in Japan to this day. Again, to be clear, in historical as well as current cultural context, Lolicon and tentacle porn were methodologies erotic content creators used to depict non-mature and non-realistic genitalia in a manner which could potentially sidestep post-WWII censors, and they are not indicative of any prevailing paraphilia or pornography production relative to the whole of the adult and AV (Adult Video) industries' output in Japan, especially as contrasted with those industries' output prior to WWII.

Looking at present-day statistics around child pornography distribution, particularly given that the USA repeatedly appears at the top of such lists, why does the USA seem to be omitted from your statements? Child pornography was outlawed in the USA in 1977, yet individuals such as yourself tend to mention Japan, which isn't even in the top 5 of countries with statistical data for distribution of such content, despite Japan having outlawed it decades more recently; something seems amiss about that. From my vantage, it raises the question of why you think you are writing about how "inclusive" you are. No one prompted you to mention Japan in a thread focused on Apple's CSAM policies; you volunteered that perspective entirely based upon your own biases and lenses of information, which, again, do not seem to be rooted in present-day legal realities or statistics.

For things like this, what the law says matters very little. It's the culture, or the acceptance of such behavior in the society, that matters. The social acceptance of sexualization of minors is simply much higher in Japan than it is in the US. Minors in Tokyo, for example, are solicited to do certain things for older clients, then progressively more. Those things are in legal grey areas most of the time, and then go unreported once they become illegal. Also, this is a predominantly US forum; the fact that the US has all sorts of sick people is well understood. The data you quoted is also subject to reporting bias and enforcement rates.

Regarding "anti-Japanese or sinophobic", just read my comments in my profile... One day, people call me CCP AI troll, the next day, people call me sinophobic...
 
I can see your point, just wonder if you're also thinking of the redneck down the road that will ...
Isn't that essentially the same kind of what-about-ism some are using to justify scanning users' private spaces for CSAM? "But what about pedophiles?"

I think I have to apply the "Let us not throw out the baby with the bathwater" principle. Can some parents use the ability to monitor their minor children's behavior with their mobile devices to do ill? Certainly they can and no doubt some will. Conversely, as I wrote: You can hardly demand parents parent while at the same time depriving them of the tools to do so.
 
So I have no facts for you specifically; however, I would be suspicious of the number of CSAM images on the dark web. What exactly defines CSAM? In a broad sense it could be any image of a naked person under 18, viewed by someone 18 or older. I'm dubious about viewing these as a crime, but hey, that's the law. So who are the primary producers of these photos? Most are probably kids themselves, innocents, or doting parents, again innocents. Now obviously there are a few really despicable people in the world who create harder-core content, including parents. Those people should be locked up. But what Apple is doing won't do anything about the worst producers of CSAM. It's just a lot of technical jargon to create a backdoor using CSAM as the excuse. No children will be saved, but there will be the backdoor into iOS that the NSA has been craving for years.

The images in the NCMEC database aren't images of a 17-year-old girl who has sent a photo of her breasts to her boyfriend. The system would be overwhelmed if they included those.

It's a worst-of-the-worst database. I believe the majority are of prepubescent children.
 
No, they are scanning for manipulated versions of known CSAM; it's scanning for pictures that "seem to be" those CSAM pictures. Which means your family photos can be false positives. That's why they have the 30-picture threshold: the more pictures you have in iCloud, the more likely you'll end up getting falsely flagged.

They are scanning for exact CSAM images and close derivatives, not similar photos.

A practical test performed by Apple showed 3 false positives when 100 million images were scanned.
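
For a rough sense of how the threshold interacts with library size, here's a back-of-envelope sketch. The per-image rate comes from the 3-in-100-million figure above; the 50,000-photo library size is a number I picked purely for illustration, not anything Apple published:

```swift
import Foundation

// Assumptions: per-image false-match rate from the figure quoted above
// (~3 per 100 million scans) and a hypothetical 50,000-photo library.
let perImageRate = 3.0 / 100_000_000
let librarySize = 50_000.0
let threshold = 30.0   // matches required before human review

// Expected false matches across the whole library (grows linearly).
let expected = perImageRate * librarySize   // ≈ 0.0015

// Poisson approximation of P(at least 30 false matches), done in log
// space because the probability underflows a Double otherwise.
let log10Prob = (threshold * log(expected) - lgamma(threshold + 1)) / log(10)
print("expected false matches:", expected)   // ≈ 0.0015
print("log10 P(>=30 matches):", log10Prob)   // ≈ -117
```

So under these assumptions a bigger library does raise the expected count linearly, as the post above says, but reaching 30 independent false matches by random chance stays vanishingly unlikely.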
 
Apple not only lost its way, it went completely bonkers, seeing all their own premium customers as potential predators.

Don't think that take is quite right.
May not be how Apple thinks of it, but I can see Ian's point. May not help that some of Apple's defenders have asserted those objecting to Apple's plan and celebrating its suspension must be traffickers in child porn :rolleyes:
 
I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It's unbelievable!
What did the police force and detectives do before technology? It's much easier nowadays to catch people doing this anyway. There is no need for this to be on everyone's phone. They will continue to do their jobs, which are already way easier than the real detective work they had to do before all this technology.
 
Live in fear, obey your masters. So many people are happy to fall in line, parrot the latest propaganda from the elites, and cancel those people who dare to step out of line. So many good apparatchiks, willing to do anything to please the elite.
The same elite that have huge gatherings, without masks, but expect the masses to isolate and follow the rules.
It works even better when you tie it to children and their safety.
Ignore the fact that so many of your products are made by children, who are slaves. But that happens elsewhere, and you don't have to see it on CNN or Maddow. Out of sight... besides, the media is telling you what the latest crisis is, and how you will perish unless you engage in the mandatory two minutes of hate.
Live in fear, obey your masters.
 
And how do they know it's a false positive? They must have looked at the pictures in question.

Yes, Apple will look at a derivative of the photo for those 30+ matches.

Which is no worse, since they are already capable of looking at every image in iCloud Photo Library today.
 
I have to wonder how anybody can continue to defend this plan when every last security and privacy researcher who's commented upon it has taken the position that it's a very bad idea, and the push-back was so significant that it caused even Apple to take a step back and reconsider?
 
What did the police force and detectives do before technology?
<devil's advocacy>They'd get their search warrant, enter the property in question, and seize the evidence. Today that evidence is often protected by strong encryption. They're just trying to level the playing field. Same reason they now wear body armor and oft times have true assault rifles close to hand: The criminals have upped their firepower. Law enforcement must needs answer evolving threats.</devil's advocacy>
 
So you would be OK with the CSAM Detection system if it didn't report to the government and it was to your benefit?
That's an interesting question; I hadn't thought about it. What would be the point? But probably yes, it would be okay with me if all it did was report to me. Say it gave me a notification, and nothing more, that whatever I had was illegal. I'm a real stickler for rules and law, so I wouldn't mind knowing about it so I could get rid of it. (Not that I would have any CSAM anyway, but if someone emails me and I don't check that particular message...)

Reporting to big brother is my biggest problem with this.
 
Isn't that essentially the same kind of what-about-ism some are using to justify scanning users' private spaces for CSAM? "But what about pedophiles?"

I think I have to apply the "Let us not throw out the baby with the bathwater" principle. Can some parents use the ability to monitor their minor children's behavior with their mobile devices to do ill? Certainly they can and no doubt some will. Conversely, as I wrote: You can hardly demand parents parent while at the same time depriving them of the tools to do so.
I think I'll just stick with "there's an app for that"… In this case it's not the monitoring, it's the Apple thing. I just don't think it's a good idea to bake this into iOS. If parents want to track their kids, then have at it… Apple is normalizing that by having it preinstalled on every device.
 
<devil's advocacy>They'd get their search warrant, enter the property in question, and seize the evidence. Today that evidence is often protected by strong encryption. They're just trying to level the playing field. Same reason they now wear body armor and oft times have true assault rifles close to hand: The criminals have upped their firepower. Law enforcement must needs answer evolving threats.</devil's advocacy>
All I'm saying is catching these people nowadays is much easier without any of this. Back before technology it was a much harder job to do. We don't see many big criminal names in the modern age. Back in the '70s and '80s it was a much harder job than it is now. The playing field is tilted much further against criminals nowadays than it was before all this technology.
 
May not be how Apple thinks of it, but I can see Ian's point. May not help that some of Apple's defenders have asserted those objecting to Apple's plan and celebrating its suspension must be traffickers in child porn :rolleyes:
There must be a reason that Apple has been working on this plan to begin with. It's that its platform is being used as a distribution mechanism for CSAM.
 
GrapheneOS on a Google phone is far different from stock Android and even more so than the crap that Samsung and others overlay.

As far as data, local backups are the obvious solution, but there is an array of personal cloud solutions (Nextcloud, OwnCloud, and others).

The key is to not make assumptions. Research and find the best solution for your needs.
There is nothing mainstream out there. GrapheneOS limits your phone choices to a Pixel. It's great that it exists, and if you want to go that route, great for you, but it is niche. The bigger point is that anything in a data-center cloud can be searched, seized, and scanned. On-prem can still be seized, but then there is the risk of loss.

If my business model consisted of distributing illegal or copyrighted material, this type of setup would make sense.
 
That's an interesting question; I hadn't thought about it. What would be the point? But probably yes, it would be okay with me if all it did was report to me. Say it gave me a notification, and nothing more, that whatever I had was illegal. I'm a real stickler for rules and law, so I wouldn't mind knowing about it so I could get rid of it. (Not that I would have any CSAM anyway, but if someone emails me and I don't check that particular message...)

Reporting to big brother is my biggest problem with this.
If something were detected in iCloud, not on device, I don't have any issues with them reporting it to the police… I mean, that's the main point of looking for this stuff; it would seem kinda crazy to send out a notice asking you to please delete your child porn… We do want to catch these pervs, and you would hope the cops would be able to discern whether it was an email attachment or something beyond your control… Anyway, that's a different discussion than the pushback on device scanning.
 
You’re right, I don’t take it personally but maybe others do.
I think it has to do with the sense of "intimacy" that some people have with their phone.

It's an odd word to use, but all of the personal stuff on a phone makes it feel like a digital diary, and when Apple says "we want to look through your diary," people get upset, and some people really, really get upset. No blame on either side, but that is just the way it feels.

Child porn damages lives forever and we should do all we can to stop it… but I really reacted badly to Apple wanting to put this file on my phone.

I hope they find a way to just scan in the cloud, which I find much, much more acceptable, even though, based on the technology, in some ways it amounts to the same thing: phone or cloud, same process.

But the feeling is entirely different; once data leaves my phone and hits the cloud, I assume all kinds of people can look at it if they make up an excuse to do so.
 