From what I’ve read here: they don’t. The problem with their service was that it used an external server to scan for content. That’s not the case in Apple’s implementation at all. All communication remains completely end-to-end encrypted. Malicious users can still send offensive material to whomever they want, and no one except the receiving user will know about it. However, with the new service, if parents choose to enable it, children’s accounts will scan received images on-device, after decryption but before display, and present the minor with a content warning. If the kid is under thirteen, the parents can also choose to be warned that improper material was sent to their child. None of this is enabled by default. No external parties are alerted; neither the service (iMessage) nor its provider (Apple) gets any notification. So the E2E messaging is still safe, but children get an optional layer of protection from creeps. Also, older minors can avoid unsolicited dick pics without their parents knowing about it (just in case some moronic parents try to blame their kids simply for receiving that kind of harassment; sadly, victim blaming is not unheard of).
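To make that flow concrete, here is a minimal sketch of the behaviour described above. Every type and function name is invented for illustration (Apple has not published this API), so read it as a summary of the documentation rather than the implementation:

```swift
import Foundation

// Hypothetical sketch of the flow described above: everything happens on the
// receiving device, after decryption, before display. No server is contacted.
struct FamilySettings {
    var communicationSafetyEnabled: Bool   // off by default
    var childAge: Int
    var parentWantsAlerts: Bool            // only honored for under-13 accounts
}

enum DisplayDecision {
    case showNormally
    case blurWithWarning(notifyParents: Bool)
}

// `looksSensitive` stands in for an on-device classifier; its real name and
// behavior are not public, so treat this as illustrative only.
func handleIncomingImage(looksSensitive: Bool,
                         settings: FamilySettings) -> DisplayDecision {
    guard settings.communicationSafetyEnabled, looksSensitive else {
        return .showNormally
    }
    // The warning is always local; a parental alert is possible only for
    // younger children and only if the parent opted in.
    let notify = settings.childAge < 13 && settings.parentWantsAlerts
    return .blurWithWarning(notifyParents: notify)
}

// Example: a 15-year-old's account warns the teen but never alerts parents.
let teen = FamilySettings(communicationSafetyEnabled: true,
                          childAge: 15,
                          parentWantsAlerts: true)
print(handleIncomingImage(looksSensitive: true, settings: teen))
```

The point of the sketch is that the decision never leaves the device: the only possible outcomes are "show it" or "blur it and maybe tell a parent."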
I think most people are not objecting to the child protection features, just the scanning of pictures in iCloud Photos, and I think that is what the researchers were talking about.

EDIT: pcmxa beat me to it.
 
After reading a deluge of negative comments on Hacker News for days, I was curious how a slightly different userbase, one that generally skews more pro-Apple, would react. The fact that the MacRumors crowd thinks Apple has screwed up is rather telling, imho.
You had some hard-core people defending them in the last few articles MacRumors published, but I think most are coming to the realization that client-side software designed to tag illegal content just opens too many doors, and that even if it didn’t, it’s still flawed in the public perception of ownership alone. I.e., my device, my data: stay out unless invited in, or unless I choose to send it to you. Pretty simple concept, and it’s the way the vast majority of Americans feel.
 
If they don’t cancel this I’m seriously going to have to look at alternative products, which saddens me.
I hate to break it to you, but there are no alternative products that don't track our every move and scan our devices. Big Brother is here to stay. This is only going to get worse if we continue to allow it.
 
They get the list of hashes from somewhere. That list of matching hashes can be generated from any set of images. Apple could be forced by a government to use that government's list of hashes.
The on-device encrypted CSAM database contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions, i.e. not under the control of the same government.

The National Center for Missing and Exploited Children (NCMEC) is the only non-governmental organization legally allowed to possess CSAM material. Since Apple therefore does not have this material, it cannot generate the database of perceptual hashes itself and relies on the child safety organizations to generate it. The threat model explicitly assumes the possibility of non-CSAM images being included in the perceptual CSAM hash database provided to Apple: either inadvertently, such as through an error at a designated child safety organization, or maliciously, such as through coercion. It goes on and on; page 8 of the Security Threat Model Review of Apple's Child Safety Features is where it talks about protections against mis-inclusion. I know most aren't reading any of the actual documentation and are just parroting what they're being told to think, but there's the paper, for what it's worth.
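If it helps, the inclusion rule that paragraph describes is easy to sketch. The hash strings and type names below are made up; only the "a hash must appear in lists from at least two separate jurisdictions" logic follows the threat-model document:

```swift
import Foundation

// Illustrative only: real NeuralHash values are opaque; strings stand in here.
// Each child-safety organization supplies its own hash list, tagged with the
// jurisdiction it operates under.
struct HashSubmission {
    let jurisdiction: String
    let hashes: Set<String>
}

// Keep only hashes submitted by organizations in at least two distinct
// jurisdictions, mirroring the "intersection of two or more lists" rule
// described in the threat-model document.
func buildOnDeviceDatabase(from submissions: [HashSubmission]) -> Set<String> {
    var jurisdictionsPerHash: [String: Set<String>] = [:]
    for submission in submissions {
        for hash in submission.hashes {
            jurisdictionsPerHash[hash, default: []].insert(submission.jurisdiction)
        }
    }
    return Set(jurisdictionsPerHash.filter { $0.value.count >= 2 }.keys)
}

// A hash pushed by only one government's organization never ships on devices.
let db = buildOnDeviceDatabase(from: [
    HashSubmission(jurisdiction: "US", hashes: ["aaa", "bbb", "ccc"]),
    HashSubmission(jurisdiction: "UK", hashes: ["bbb", "ccc", "ddd"]),
])
print(db) // ["bbb", "ccc"], in some order
```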
 
I think given the recent revelations about the abuse of the NSO Pegasus system, Apple may end up temporarily shelving the idea of that CSAM scanning system. Reason: some "state actor" hacker could bypass Apple's limits and start scanning for way more image types than CSAM.
I don't think they care. They're much more worried about the antitrust cases.
 
Honestly, I think the best way to get this through Apple's thick skull is for them to see a glimpse of the advertising campaign Samsung, LG, and other competitors could bring. It's pretty easy to imagine narrowly targeted 30-second spots saying how much they value your device's security and that they would never install software on your device to spy on you... like Apple does. They could even end with a blurb that they support cloud scanning to combat child endangerment. That would be highly effective and devastating to Apple, because it would all be true. One bad apple ruins the bunch.
 
Don't like the way Apple is going; it seems creepy, like they assume you are guilty until proven innocent.

I am now playing with, and learning, my new Google-free Android phone. Lots to learn, plenty of work. Some good points: setting your own, different notification sounds etc. in all apps. Some bad: converting Pages documents to something Android can read.

The built-in app store is interesting: it shows all the permissions an app requires and any tracking the app does.

So far I prefer the iPhone's speakers though, better sound, but that's minor. Will try it for a few weeks; if happy, then no more iPhones and iPads for me. Hopefully one day a Galaxy Fold 3 will run this OS.
I agree. I won't miss the iPhone, but I will miss the iPad. There are no really good alternatives to the iPad. Not yet.
 
The on-device encrypted CSAM database contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions, i.e. not under the control of the same government.

The National Center for Missing and Exploited Children (NCMEC) is the only non-governmental organization legally allowed to possess CSAM material. Since Apple therefore does not have this material, it cannot generate the database of perceptual hashes itself and relies on the child safety organizations to generate it. The threat model explicitly assumes the possibility of non-CSAM images being included in the perceptual CSAM hash database provided to Apple: either inadvertently, such as through an error at a designated child safety organization, or maliciously, such as through coercion. It goes on and on; page 8 of the Security Threat Model Review of Apple's Child Safety Features is where it talks about protections against mis-inclusion. I know most aren't reading any of the actual documentation and are just parroting what they're being told to think, but there's the paper, for what it's worth.
If you presume the FBI has zero influence on the NCMEC, presume the FBI doesn't cooperate with similar agencies in other countries (who in turn work with their local equivalent of the NCMEC), and simply ignore the existence of national security letters..... then you have a point.
Also: if Apple really cared about CSAM on their servers, they would do something about the existing CSAM on their servers.
 
I hate to break it to you, but there are no alternative products that don't track our every move and scan our devices. Big Brother is here to stay. This is only going to get worse if we continue to allow it.

This is why I hate Silicon Valley, and I am increasingly anti-tech, anti-“tech bros,” anti-tech-workers, anti-tech-companies. The tech industry needs massive regulation, and wherever tech goes, it dictates, causes misery, and destroys. Seattle is now trapped under the thumb of Amazon. Tech *******s ruined San Francisco. Social media has done absolutely NOTHING good for society. Uber, Lyft, Airbnb, and Vrbo are just *******s. New Yorkers, at least, were smart and gave Jeff Bezos the big middle finger and told him to get lost.
 
I guess I will have to carry this one.
(image attachment)
 
The on-device encrypted CSAM database contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions, i.e. not under the control of the same government.

The National Center for Missing and Exploited Children (NCMEC) is the only non-governmental organization legally allowed to possess CSAM material. Since Apple therefore does not have this material, it cannot generate the database of perceptual hashes itself and relies on the child safety organizations to generate it. The threat model explicitly assumes the possibility of non-CSAM images being included in the perceptual CSAM hash database provided to Apple: either inadvertently, such as through an error at a designated child safety organization, or maliciously, such as through coercion. It goes on and on; page 8 of the Security Threat Model Review of Apple's Child Safety Features is where it talks about protections against mis-inclusion. I know most aren't reading any of the actual documentation and are just parroting what they're being told to think, but there's the paper, for what it's worth.
And if a government, say the PRC, requires Apple to use a specific 'child safety' organisation in the PRC as the primary source and a specific 'child safety' organisation, say in North Korea, as the secondary source? What then? After all, any material opposing an authoritarian government isn't good for children and constitutes potential child abuse. Indeed, suppose authoritarian governments use Apple's system as a blueprint for universal invasive surveillance while 'protecting privacy', bypassing Apple altogether? Hey, if Apple does it, it must be OK...
 
If you presume the FBI has zero influence on the NCMEC, presume the FBI doesn't cooperate with similar agencies in other countries (who in turn work with their local equivalent of the NCMEC), and simply ignore the existence of national security letters..... then you have a point.
Also: if Apple really cared about CSAM on their servers, they would do something about the existing CSAM on their servers.
A movie where Gene Hackman plays the character Edward Lyle comes to mind.

And if a government, say the PRC, requires Apple to use a specific 'child safety' organisation in the PRC as the primary source and a specific 'child safety' organisation, say in North Korea, as the secondary source? What then? After all, any material opposing an authoritarian government isn't good for children and constitutes potential child abuse. Indeed, suppose authoritarian governments use Apple's system as a blueprint for universal invasive surveillance while 'protecting privacy', bypassing Apple altogether? Hey, if Apple does it, it must be OK...
"if" and suppose" are doing a lot of heavy lifting here.
 
I agree. I won't miss the iPhone, but I will miss the iPad. There are no really good alternatives to the iPad. Not yet.
I'm staying with the iPad; like you said, there aren't alternatives, but things like iMessage and iCloud Photos are easy to disable. Of course I won't be buying any new iPads for now, but I won't be buying any other tablets either (for now).
 
Honestly, I think the best way to get this through Apple's thick skull is for them to see a glimpse of the advertising campaign Samsung, LG, and other competitors could bring. It's pretty easy to imagine narrowly targeted 30-second spots saying how much they value your device's security and that they would never install software on your device to spy on you... like Apple does. They could even end with a blurb that they support cloud scanning to combat child endangerment. That would be highly effective and devastating to Apple, because it would all be true. One bad apple ruins the bunch.

Apple's new users. Think Android.

(link to frame from Apple's 1984 commercial, which should be required viewing for anybody commenting here)

EDIT: Hmmmm.... It is getting harder to actually see Apple's 1984 advert. I wonder why.
 
I agree. I won't miss the iPhone, but I will miss the iPad. There are no really good alternatives to the iPad. Not yet.
I was thinking that this morning. My iPad Pro is my primary armchair device; I could live easily without the phone. I do have an HP Spectre that I use for astronomy that actually does pretty well in tablet mode, kinda heavy but with a beautiful 15” OLED screen; if I had seen this coming, I probably would have opted for a 13” one. Anyway, there are alternatives, even if you have to babysit them to keep them secure. I don’t think Microsoft is installing spyware in their OS.
 
Apple is up to something. Is this a distraction from a bigger issue?
It is, probably because the midterm elections are coming and it's looking good for Republicans. I didn't realise it before, but now I believe it. We know that big tech, including Apple, is on the left side, funds the Democratic Party, and is close friends with Democrats. It seems really fishy to come out now with technology that could censor the opposing opinion or the political opposition. If anti-social media can do it, why not Apple?
 
Honestly, I think the best way to get this through Apple's thick skull is for them to see a glimpse of the advertising campaign Samsung, LG, and other competitors could bring. It's pretty easy to imagine narrowly targeted 30-second spots saying how much they value your device's security and that they would never install software on your device to spy on you... like Apple does. They could even end with a blurb that they support cloud scanning to combat child endangerment. That would be highly effective and devastating to Apple, because it would all be true. One bad apple ruins the bunch.
The problem is that most Android phones are not privacy-oriented. They scan everything you do to turn you into a marketing profile and sell your eyes to advertisers. There's no moral high ground from which to toss these stones.

If you presume the FBI has zero influence on the NCMEC, presume the FBI doesn't cooperate with similar agencies in other countries (who in turn work with their local equivalent of the NCMEC), and simply ignore the existence of national security letters..... then you have a point.
Also: if Apple really cared about CSAM on their servers, they would do something about the existing CSAM on their servers.
Let's take that as plausible... that various governments conspire to corrupt the CSAM hash databases in the same way, resulting in an overlap of non-CSAM (but government-interesting) image hashes winding up in Apple's database in iOS (see the sketch at the end of this post).
Let's take it that someone, as a result of this influence, gets flagged and the vouchers are sent to Apple, decrypted and reviewed. The reviewer would see the images are not CSAM and mark the result invalid and Apple would know the databases were compromised. No further escalation would happen. Apple's only escalation path in this process is NCMEC for known CSAM images. What they do about the compromised data sources I can only wildly speculate.
This scenario would require all the involved governments to somehow coerce/compel Apple to turn over these results and related account information in spite of them not matching the intent of the scan.

That's a LOT of ifs. That's a lot of conspiracy. That's a lot of government effort.

I'm not saying it can't happen, just that it's so unlikely that I don't see how it feasibly could happen.
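Here is the sketch referenced above: a rough model of the escalation path as Apple's threat-model document describes it. The names are invented and the threshold (about 30 matches, per that document) is illustrative, not Apple's actual code:

```swift
import Foundation

// Hypothetical sketch of the escalation path described above. The threshold
// and names are illustrative, not Apple's implementation.
enum ReviewOutcome {
    case belowThreshold   // too few matches: Apple can decrypt nothing
    case invalidMatch     // human reviewer sees non-CSAM: no report, and
                          // evidence that a hash source was compromised
    case reportToNCMEC    // the only escalation path that exists
}

func processAccount(matchedVoucherCount: Int,
                    reviewerConfirmsCSAM: Bool,
                    threshold: Int = 30) -> ReviewOutcome {
    guard matchedVoucherCount >= threshold else { return .belowThreshold }
    return reviewerConfirmsCSAM ? .reportToNCMEC : .invalidMatch
}

// A planted, non-CSAM hash collision dead-ends at human review.
print(processAccount(matchedVoucherCount: 35, reviewerConfirmsCSAM: false))
```

Whether a government could compel Apple to act differently after that human-review step is the part being argued about in the replies.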
 
Let's take it that someone, as a result of this influence, gets flagged and the vouchers are sent to Apple, decrypted and reviewed. The reviewer would see the images are not CSAM and mark the result invalid and Apple would know the databases were compromised. No further escalation would happen. Apple's only escalation path in this process is NCMEC for known CSAM images. What they do about the compromised data sources I can only wildly speculate.
This scenario would require all the involved governments to somehow coerce/compel Apple to turn over these results and related account information in spite of them not matching the intent of the scan.
The US govt can order Apple to work with the FBI through a national security letter. An accompanying gag order would prevent Apple from talking about it.
Please inform yourself. https://en.wikipedia.org/wiki/National_security_letter#Contentious_Aspects
That's a LOT of ifs. That's a lot of conspiracy. That's a lot of government effort.

I'm not saying it can't happen, just that it's so unlikely that I don't see how it feasibly could happen.
It's common practice. Again, inform yourself.
 