Bad move, Apple. I’m unconvinced that this is “for the children.” I have zero specific evidence, but I speculate that Apple is having its arm twisted behind its back by the US intel community. Given how many times tech execs have been caught contradicting previous public statements affirming their firms were not being sniffed or backdoored by the intel community, I am near certain that this will be abused in a similar manner.

It won’t be the hashes of known kiddie-porn images they eventually hit a “match” on, but images of vaccine-denier memes, election-disbeliever memes, “racist” memes, far-left memes, etc., etc. Once in a while they will match some kid-toucher to it to proclaim its relevance. But those guys will get caught the old-fashioned way.

This is Big Brother. He doesn’t care about kiddie porn; he cares about wrongthink. And you will support it, because it’s going to be used to sniff out people who will be blamed for all of the ills of society: our ongoing pandemic, Capitol “terrorism,” racist militiamen, anti-fascist actions, green action, etc. The smart folks will be easily manipulated into believing that the above isn’t happening, and into believing that the boogeyman on the news, which this will eventually entangle, is entirely organic.
 
And Apple wasn’t going to scan the contents of your phone.
Nobody said they were.
They are scanning iCloud for forensic signatures that match the forensic signatures on a known “naughty picture list,” and then passing positive hits to the feds.

Of course, the “naughty list” is supposedly just known child-porn forensic signatures. But why are we to believe that this list of “naughty images” will be limited to just child porn? Wouldn’t it be justifiable to also hit a match on known dangerous science denial, like anti-vaccine memes that are dangerous to public health?
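
To make the mechanics concrete, here is a minimal sketch of what “matching against a naughty list” means, assuming an ordinary cryptographic hash and a placeholder list; Apple’s actual system uses a perceptual hash (NeuralHash) so that resized or re-encoded copies still match, and the real database is not public:

```python
import hashlib
from pathlib import Path

# Placeholder standing in for the real signature database, which is
# supplied by child-safety organizations and is not publicly visible.
KNOWN_BAD_HASHES = {
    "0" * 64,  # hypothetical entry
}

def matches_known_list(image_path: Path) -> bool:
    """Return True if this image's hash appears on the known list.

    A plain SHA-256 only matches byte-identical files; a perceptual
    hash like NeuralHash is designed to survive resizing/re-encoding.
    """
    digest = hashlib.sha256(image_path.read_bytes()).hexdigest()
    return digest in KNOWN_BAD_HASHES
```

Note that the worry voiced above is entirely about who controls the contents of KNOWN_BAD_HASHES, not about the matching code itself.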
 
Clearly she and I disagree...

[attached screenshot]

Emotional viewpoints like this are not anchored in reason, fact or any sort of technical understanding.

Apple designed this feature with privacy, safety and security in mind. They've clearly explained how it works. There's no privacy invasion for those who aren't storing questionable content. Zero. Zip.

One good thing comes out of this... those looking to use Apple's systems to create and store questionable content are not safely hiding. Eyes are on you, regardless of this feature being enabled or not.
Again, YOU HAVE NO IDEA what they are doing either. Stop acting like you do 🙄
 
None of that reports to the government, which is what the CSAM stuff does and hence the difference. One is for your personal benefit, the other is to turn you in.
But that’s a slippery slope. They could flip a switch, upload that data to iCloud, and even adjust the AI to look for other stuff. Isn’t this what you guys are all worried about? The technology is already there, just waiting to be abused.
 
Just use another phone then. It's just a phone, not some life-changing event.
I dunno... it seems to be for some people. I've never understood people who define themselves by their possessions, but many do.

I’m really getting tired of the “if it saves 1” argument.
The people who advance that emotionalism as an "argument" haven't figured out that most people grew tired of it long, long ago.

For me, the damage has already been done...
Same here. The Apple stuff we've turned off, disconnected, and unsubscribed from will remain so. The stuff we had planned to buy will remain un-purchased.

If you feel the same way about the slippery slope that is on device scanning, say so with your wallet and do the following…
  1. Cancel your subscriptions to iCloud and use other cloud services. I personally like Sync.com. It has E2E encryption and there are other options out there.
Done.

  2. Cancel your subscriptions to Apple TV+, iTunes, etc. Plenty of other options for music and video.
Done.

  3. Do not use Apple Pay and close your Apple Card account if you have one.
Done.

  4. Sell your Apple stock if you have any.
Yeah... not gonna do that. I leave such decisions to our retirement account guy. (But he did confirm he could rid us of our investments in AAPL if we desired.)

  5. Do not upgrade your OS on any Apple device.
Automatic update turned off. Hell, automatic download turned off.

  6. Do not purchase anything through the Apple app stores.
Have no plans to do so. Well... maybe. If I'm going to stick with the iOS platform I want to move off the default Notes and Reminders apps. I want ones that encrypt my notes, reminders, etc. on the device side. I may have to pay for that.

  7. Do not buy any new Apple hardware.
All planned purchases, around $3,000 worth, were canceled when Apple announced their plan.

3. Isn't Apple Pay one of the big points of using an iPhone? Why bother staying with Apple if you're not going to use it?
I don't know about "one of the big," but it was a very cool thing. Particularly with my Apple Watch. But I didn't have it before and now I don't have it again ¯\_(ツ)_/¯ May not even have an Apple Watch much longer. That will actually remove a big impetus to have Apple Pay.

What?

You are equating child abuse with being a foreigner?
He was badly misquoting/paraphrasing Pastor Martin Niemöller's "First They Came..." poem.

Apple is just doing what’s required by the law.
As has been noted several times in the various threads on this subject: Apple is not required by law to proactively search for CSAM.
 
I have a conspiracy theory that Apple is doing this as a public stunt, and they do not plan to implement this feature in the first place.

They did it to please the politicians and to gain favor in the Epic case about the App Store monopoly, so that Apple can say: we attempted to do this, and our customers did not like it, so we were forced not to implement it. Apple anticipated a huge backlash from customers in the first place.

Kind of like what Epic did with their public stunt in the App Store monopoly case: they pre-planned the entire scenario. Apple here is staging an entire public-stunt scenario to gain the upper hand with the courts and politicians (and possibly, a little, with customers).

Just a theory.
It will be implemented. It is just a matter of time. I am hoping Apple keeps the scanning to the cloud where it belongs and not on the device.
 
Would that law also allow your landlord into your flat to randomly check your photos?
Not sure what your point is. Apple is required to report CSAM content stored on their servers if they know about it. This is US law as far as I know. Apple proposed a way to do what’s required by said law, and shared what they intend to do with their customers.

If you have a problem with said law and you’re a US citizen, take it up with your local rep. My take is that Apple is planning E2EE for iCloud Photos. When that’s done, even Apple cannot provide any photos stored in iCloud Photos to anyone, because Apple simply does not have the keys to decrypt them.

Btw, if your country’s law hypothetically required your landlord to randomly search your flat, would you refuse and risk getting into trouble? I think we should all stop such silly hypothetical scenarios.

Lastly, Apple only said they will hash photos that are being uploaded to iCloud Photos. If a photo is not uploaded, no hashing is done. AFAIK there is no scanning of device contents. It is up to you to believe what they say. I for one believe them. What does Apple have to gain by lying?
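
For what it’s worth, here is a minimal sketch of the flow described in that last paragraph. Every function name is hypothetical; the point is only that, per Apple’s published description, the hashing step sits inside the iCloud Photos upload path:

```python
from pathlib import Path

def store_locally_only(photo: Path) -> None:
    """Stub: keep the photo on-device; no hash is ever computed."""
    print(f"{photo}: stored locally, never hashed")

def compute_safety_voucher(photo: Path) -> bytes:
    """Stub standing in for the on-device hash + 'safety voucher' step."""
    return b"opaque-voucher"

def upload_to_icloud(photo: Path, voucher: bytes) -> None:
    """Stub: upload the photo together with its safety voucher."""
    print(f"{photo}: uploaded with a {len(voucher)}-byte voucher")

def process_photo(photo: Path, icloud_photos_enabled: bool) -> None:
    # Per Apple's description, hashing is gated on the iCloud Photos
    # upload path: a photo that is never queued for upload never gets
    # hashed and never produces a voucher.
    if not icloud_photos_enabled:
        store_locally_only(photo)
        return
    voucher = compute_safety_voucher(photo)
    upload_to_icloud(photo, voucher)
```

Whether the check stays gated on the upload path is, of course, exactly what the skeptics in this thread doubt.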
 
Clearly she and I disagree...

[attached screenshot]

Emotional viewpoints like this are not anchored in reason, fact or any sort of technical understanding.

Apple designed this feature with privacy, safety and security in mind. They've clearly explained how it works. There's no privacy invasion for those who aren't storing questionable content. Zero. Zip.

One good thing comes out of this... those looking to use Apple's systems to create and store questionable content are not safely hiding. Eyes are on you, regardless of this feature being enabled or not.
Are you involved in law enforcement or any government department or agency?
 
It is not about you or Apple’s customers.
Sure it is.

This is a way for Apple to comply with US laws requiring them to report CSAM material that made its way to their servers.
Please stop repeating this half-truth. Apple is required by law to report CSAM of which they become aware. Apple is not required by law to actively seek it out.

This will likely pave the way for E2EE for iCloud Photos.
Apple has announced no intention ever to do E2EE of photos and, in fact, recently backed off plans to implement E2EE after push-back from U.S. law-enforcement agencies (and probably others).
 
Not sure what your point is. Apple is required to report CSAM content stored on their servers if they know about it. This is US law as far as I know. Apple proposed a way to do what’s required by said law, and shared what they intend to do with their customers.

If you have a problem with said law and you’re a US citizen, take it up with your local rep. My take is that Apple is planning E2EE for iCloud Photos. When that’s done, even Apple cannot provide any photos stored in iCloud Photos to anyone, because Apple simply does not have the keys to decrypt them.

Btw, if your country’s law hypothetically required your landlord to randomly search your flat, would you refuse and risk getting into trouble? I think we should all stop such silly hypothetical scenarios.

Lastly, Apple only said they will hash photos that are being uploaded to iCloud Photos. If a photo is not uploaded, no hashing is done. AFAIK there is no scanning of device contents. It is up to you to believe what they say. I for one believe them. What does Apple have to gain by lying?

Apple is required to report what they become aware of. Not required to implement a scanning system that goes through my belongings.

The fact that they are heavily backpedaling on this now is telling.

While end-to-end encryption would be nice to have on all Apple services, I don’t read from current events that this is the aim or that it’s coming, and they certainly have not made such an announcement, even to soften the negative impact they caused on the matter. I’ll believe it when I see it.
 
As has been noted several times in the various threads on this subject: Apple is not required by law to proactively search for CSAM.
Apple is required to report CSAM content if they know it’s stored on their servers. Apple doesn’t care about filth stored on users’ devices. This exercise has always been about protecting Apple. And it could pave the way for E2EE for iCloud Photos, which would protect users’ privacy for the photos stored on Apple’s servers, as Apple would no longer be able to decrypt them.
 
Counter to this one single point - if Apple's going to keep making money anyway, at least you can profit off of their stock price while not giving them a dime elsewhere. I have a friend who hates Facebook, hates that his wife is on Facebook, but bought Facebook stock a few years ago and loves the fact that he's made a killing off of them while they're not actually making anything off of him.

;)
Did your friend join the moustache-twirling club and brag about his Purdue Pharma shares too?
🤑
 
Apple is required to report CSAM contents if they knew that it’s stored in their servers.
No, Apple is required by law to report specific instances of CSAM material that comes to, or is brought to, their attention. They are explicitly not required by law to search for it. Would you like me to cite the relevant US Code again?

And it could pave the way for E2EE for iCloud Photos, ...
Simply repeating something I've already disproved isn't going to make it true. Again: Apple has made no suggestion whatsoever that it has any plans to employ E2EE for iCloud photos and has, in fact, recently backed off plans to implement E2EE for iCloud storage.
 
No, they are matching hashes of known CSAM against hashes of photos you upload to iCloud. Apple says they are accurate to one in a trillion false positives, so the naked-kids-in-the-bath pictures, or kids romping through the sprinklers, will not get flagged. And then you have to have 30 of these images before they will have a human look. If they are indeed matching with 1-in-1,000,000,000,000 accuracy, there is no way you will ever get in front of a human reviewer unless you have actual CSAM that exists in at least 2 databases in 2 different legal jurisdictions.

No, they create a neural hash of features; hence "seems like", they aren't matching 1:1. They need the 30-picture threshold to claim 1 in 1,000,000,000,000, because of hash collisions with your family pictures. Apple even admits there are false positives in their system. That's before malicious users create innocent-looking images with matching IDs, like background images.


And how do they know it's a false positive? They must have looked at the pictures in question.
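
Worth pausing on what the 30-match threshold does to the numbers. Here is a back-of-the-envelope sketch using a Poisson approximation to the binomial; the per-image false-match rate is an assumption, since Apple published only the aggregate one-in-a-trillion per-account figure:

```python
from math import exp

def poisson_tail(k: int, lam: float, terms: int = 100) -> float:
    """P(X >= k) for X ~ Poisson(lam), summing `terms` tail terms."""
    term = exp(-lam)            # P(X = 0)
    for i in range(1, k):
        term *= lam / i         # walk up to P(X = k-1)
    total = 0.0
    for i in range(k, k + terms):
        term *= lam / i         # now P(X = i)
        total += term
    return total

p_image = 1e-6                  # ASSUMED per-image false-match rate
n_photos = 100_000              # a large photo library
lam = n_photos * p_image        # expected innocent matches: 0.1

print(poisson_tail(30, lam))    # chance of crossing the 30-match threshold
```

Even with this fairly pessimistic per-image rate, the chance of an innocent library crossing the threshold comes out around 3e-63. The dispute in the posts above is over the inputs (the true collision rate, and adversarially crafted collisions), not over the arithmetic.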
 
I am hoping Apple keeps the scanning to the cloud where it belongs and not on the device.
I doubt it.
They just don't want this to affect iPhone sales in the coming months. I expect they'll reintroduce it after Xmas with some cosmetic changes (for example: not one but TWO Apple employees will now be verifying matches! And only after 40 matches instead of 30!).
By the time the iPhone 14 is released, most people will be used to on-device scanning.
 
Sure it is.


Please stop repeating this half-truth. Apple is required by law to report CSAM of which they become aware. Apple is not required by law to actively seek it out.


Apple has announced no intention ever to do E2EE of photos and, in fact, recently backed off plans to implement E2EE after push-back from U.S. law-enforcement agencies (and probably others).
This article from Reuters provides the relevant background information regarding the above.
 
I don't know about that. If you look back on all my comments, you would see that I'm pretty inclusive of different ideas. My bar is set really low here: if society as a whole does not value the well-being of its people and allows exploitative behaviour to run rampant without repercussion, that would be a crime that transcends all political differences.

Notice that I never used the words human rights, universal suffrage, etc. I only spoke of the well-being of the people. If people are having a good life and are able to live and progress, then it's good.

Selling child prostitution as a service for foreign soldiers, or making ladyboys for freak shows, clearly falls below this very low bar I set.

It's the same girls that make child porn. And I'm not against child porn just because. Child porn that is made by teens, not children, without monetary incentives (i.e., just to have fun or to be exhibitionists) may be okay, but a legal line must be drawn somewhere, and it's hard to distinguish which is which. Technically, most people have done it while in school, sexting, etc. I don't know at which point a nude picture of an underage teenager would be considered CSAM.
I think ian87w was correct to call you out on your previous statements.

While I have never been to the Philippines, I have been a student of Japanese language and culture my entire life and have visited Japan, and your statements regarding Japan read to me as misinformed at a minimum, and perhaps even anti-Japanese or xenophobic, something which is lamentably prevalent in the USA.

To be clear, the production and distribution of child pornography has been outlawed in Japan since 1999, and additional legislation was introduced in 2014 to ensure that no secondary market of "grandfathered in" old content was allowed.

By volume, the United States has been the leading source of child pornography distribution, year after year (citation: http://www.amlc.gov.ph/images/PDFs/2020 DEC CHILD PORNOGRAPHY IN THE PHILIPPINES POST-2019 STUDY USING STR DATA.pdf)

Your mention of, or perhaps fixation on, Lolicon (i.e., fictional depictions, yes, still legal there, as observed in the breakdown here: https://en.wikipedia.org/wiki/Legality_of_child_pornography) seems particularly sensationalistic. The prevalence of Lolicon as a paraphilia within Japan can be directly linked to the US Occupation of Japan after WWII and its policies of censorship, which also gave rise to so-called "tentacle porn"; neither is widespread, widely distributed, nor widely consumed. Similarly, while some point to Hokusai's 蛸と海女「tako to ama」(The Dream of the Fisherman's Wife) woodblock print from 1814 as an early example of so-called tentacle porn, that was an outlier in a culture which produced a significant amount of 春画「shunga」(erotic woodblock prints), and fixation on such examples, again, from my vantage tends to be tinged with sensationalism and, more often than not, derisive anti-Japanese sentiment.

As an undergraduate, I studied some of this in my Japanese history courses. In particular, MacArthur's Occupation after WWII resulted in any mention of the censorship of Japanese media itself being censored (more on that subject can be found in John W. Dower's book, Embracing Defeat). Subsequently, producers of erotic content began to treat the medium as a "black box," more or less throwing things at the wall until they saw what stuck, with a lot of material being confiscated by the authorities in the process. This was true even after the Occupation ended, for having somehow violated the nebulously worded obscenity legislation in Article 175, with occasional challenges under Article 21 of the Japanese Constitution, which is intended to guarantee freedom of expression.

You'll note that centuries-old 春画「shunga」prints were explicit in their depictions of sexual activity, whereas after WWII it became commonplace for mosaic obfuscation of genitalia to be used in pornographic content, as is still common in Japan to this day. Again, to be clear, in historical as well as current cultural context, Lolicon and tentacle porn were methods erotic-content creators used to depict non-mature and non-realistic genitalia in a manner which could potentially sidestep post-WWII censors; they are not indicative of any prevailing paraphilia, nor of a significant share of pornography production relative to the whole of the adult and AV (Adult Video) industries' output in Japan, especially as contrasted with those industries' output prior to WWII.

Looking at present-day statistics on child pornography distribution, particularly given that the USA repeatedly appears at the top of such lists, why is the USA omitted from your statements? Child pornography was outlawed in the USA in 1977, yet individuals such as you tend to single out Japan, which isn't even in the top five of countries with statistical data for distribution of such content, despite Japan having outlawed it decades more recently; something seems amiss about that. From my vantage, it raises the question of why you think you are writing about how "inclusive" you are. No one prompted you to mention Japan in a thread focused on Apple's CSAM policies; you volunteered that perspective entirely based upon your own biases and lenses of information, which, again, do not seem rooted in present-day legal realities or statistics.
 
Are there any good threads here on MacRumors for Mac users switching off platform?

@Dionte and others have mentioned making changes. I'm moving off Apple and switching to Linux.

edit: I started a thread here:
 
But that’s a slippery slope. They could flip a switch and upload that data to iCloud and even adjust the AI to look for other stuff? Isn’t this what you guys are all worried about? The technology is already there just waiting to be abused.
Yep, that's what I'm worried about, especially the abuse part, but just rolling over and playing dead because you think it will happen anyway is the coward's way out. We have to keep letting Apple know that this is not what we want (at least until Apple becomes totally irrelevant to us).
 
Off topic, but I feel like a lot of people who are mentioning Brave New World haven’t actually read it. It’s not about a surveillance state in the same way as 1984. It’s about a society drugged and amused into an apathetic complacency. Huxley and Orwell had wildly differing views on how a dystopia could emerge.
 
[Brave New World is] not about a surveillance state in the same way as 1984. It’s about a society drugged and amused into an apathetic complacency.
Apropos to the current discussions, nonetheless, don't you think? Some people are (were?) willing to just go along with whatever Apple wants to do, despite the fact they don't like the idea, simply because it's too much effort to do otherwise or because Apple's stuff is too cool/handy to do without.
 