
Did you read up on FOSTA-SESTA like I advised? You’ll learn something.


The case is about sex trafficking and FOSTA-SESTA deals with sex trafficking.

Nothing in that case leads me to think anyone is required to actively look for child pornography when federal law says explicitly you don't have to.

"The opinion said that Section 230 prevents companies from being held accountable for the “words or actions of their users,” but does not preclude them being liable for “their own misdeeds,” especially in cases of human trafficking. Justice Blacklock added that the court does not believe that states are “powerless to impose liability on websites that knowingly or intentionally participate in the evil of online human trafficking.”"
 
Here’s a better explanation of this whole process (but first some context):

Your iPhone has a neural engine inside it. The neural engine already scans your photos on device to index your images. This makes them searchable. That’s why when you search for the term ‘dogs’ or ‘trees’, for instance, the phone is able to bring up the shots you requested. It’s also how the memories feature works. Heck, it’s how the faces feature from iPhoto back in the day worked (but without the benefit of a dedicated neural engine). All of this ‘scanning’ that people are worried about is happening locally on device. No data is shared with Apple for any of the features described above.

So how does the neural engine do this?

The neural engine is trained by Apple with machine learning algorithms. Taking the example of a Dog, the neural engine knows what a Dog looks like because it was trained with a specific algorithm that taught it what Dogs look like. When you take future photos, if the neural engine finds that the photo matches its ‘Dog algorithm’, it gets indexed and sorted as such. The next time you search your library for the term ‘Dogs’, the photo you took is surfaced.
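If you want to see this kind of on-device categorisation for yourself, Apple’s public Vision framework exposes the same basic idea. Here’s a minimal sketch. The file path is a placeholder and Photos of course uses its own private models, so treat this purely as an illustration of local classification, not Apple’s actual indexing code.

```swift
import Foundation
import Vision

// Illustrative only: classify an image locally, the same basic idea Photos
// uses to make your library searchable. No data leaves the device.
let imageURL = URL(fileURLWithPath: "/path/to/photo.jpg") // placeholder path

let request = VNClassifyImageRequest()
let handler = VNImageRequestHandler(url: imageURL, options: [:])

do {
    try handler.perform([request])
    let observations = request.results as? [VNClassificationObservation] ?? []
    // Keep reasonably confident labels (e.g. "dog"); these are what make the
    // photo show up when you later search for that term.
    let labels = observations.filter { $0.confidence > 0.8 }.map { $0.identifier }
    print("Index this photo under:", labels)
} catch {
    print("Classification failed:", error)
}
```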

And how does this relate to CSAM?

In the simplest sense, Apple has created an on device machine learning algorithm that looks for CSAM, but one much narrower in scope than the aforementioned ‘Dog algorithm’. The algorithm has been programmed with neural hashes that represent known CSAM images from a database. This means CSAM not in the database won‘t be detected, as the algorithm doesn’t recognise a match. At this stage in the process, nothing changes for you as the user and your privacy has not been affected one bit. If you’ve been happily using photo memories, or searching your library for shots you took on that nice family trip to Disney World, then understand this: those features were achieved in the same way that Apple detects CSAM on device.

User saves photo > photo is run through neural engine on device > photo gets categorised and indexed as ‘Dog’ > user is able to search their library at will for photos of Dogs and find that photo they just saved
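To make that narrowness concrete, here’s a toy sketch of the CSAM matching step. The hash strings and the neuralHash helper below are made up purely for illustration; in the real system the hash comes from Apple’s NeuralHash model and the database ships in a blinded form the device can’t actually read. The point is that it’s a narrow ‘is this exact hash in the known list?’ lookup, not open-ended image understanding like the Dog example.

```swift
import Foundation

// Toy illustration of hash matching. The hashes below are made-up placeholder
// values; the real database is blinded so the device never sees a plain list.
let knownCSAMHashes: Set<String> = [
    "a3f1c29b0d44e7",   // placeholder
    "09be77f2ac1d55",   // placeholder
]

// Hypothetical stand-in for Apple's NeuralHash model. The real thing is a
// neural network that maps visually similar images to the same short hash;
// this placeholder just returns an arbitrary value so the sketch runs.
func neuralHash(of photo: Data) -> String {
    String(photo.hashValue, radix: 16)
}

// The entire on-device "scan": does this photo's hash appear in the known set?
func isKnownMatch(_ photo: Data) -> Bool {
    knownCSAMHashes.contains(neuralHash(of: photo))
}
```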

But what about my iCloud Photo Library?

Here is where the CSAM feature differs. If and only if the machine learning algorithm on your device has detected known CSAM photos, then those specific photos that were flagged get scanned a second time by another algorithm in iCloud as they get uploaded. Those photos and those photos alone become viewable by Apple. A human reviewer then checks to make sure the algorithm didn’t make a mistake. If it did make a mistake then nothing happens. But if confirmed CSAM content is matched, your account is reported.
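A rough sketch of that server-side gate is below. The names and structure are my own assumptions; the roughly-30-image threshold is the figure mentioned further down in this post.

```swift
// Illustrative server-side flow, not Apple's actual implementation.
struct FlaggedUpload {
    let accountID: String
    let matchedVoucherCount: Int   // vouchers whose hashes matched known CSAM
}

let reviewThreshold = 30   // approximate figure from Apple's public comments

func handle(_ upload: FlaggedUpload) {
    guard upload.matchedVoucherCount >= reviewThreshold else {
        return   // below threshold: nothing is viewable and nothing happens
    }
    // Above threshold: the matched images, and only those, become viewable
    // and go to a human reviewer before any report is made.
    queueForHumanReview(accountID: upload.accountID)
}

func queueForHumanReview(accountID: String) {
    print("Queue account \(accountID) for manual review")
}
```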


Just to summarise

1. Do you use photo memories or faces or ever search your library by location or using key words? All of that happens on device and Apple never sees your photos. And the scanning for CSAM works in exactly the same way but with a much narrower scope. If you don’t like the idea of the system scanning for CSAM then I hate to break it to you, but your phone has been doing on device machine learning and scanning of your photos for many years.

2. Apple isn’t scanning your entire iCloud Photo Library. They’re running a second scan on any images that were first detected and flagged using on device machine learning. This happens as the image gets uploaded.

Machine Learning has been happening on our devices for years for many of the features we use in the photos app. The benefit of keeping it on device is it’s private. No data is processed in the cloud.

The only real thing that has changed with this new system, is that if known CSAM photos were detected using on device machine learning (around 30 images), they would then be scanned on upload to iCloud and get reviewed by a human.

Nothing about the way your iPhone works has changed. Nothing about how your photos are stored or processed has changed. The only time any processing or scanning of images happens in iCloud is if your iPhone detected CSAM images on device using the same old neural engine that’s been running in the same old way that it has since the iPhone X.

My personal view is this. The technology being used is the most thoughtful and privacy-preserving implementation used by any company so far. My only concern is if governments try to force Apple to widen the scope of their on device machine learning algorithm. But I also appreciate that, as Craig pointed out, the security community would be able to determine quickly enough if Apple lied to us. And Apple has multiple audit levels to prevent interference by governments or other means.

I think the fact that Apple has been transparent about this and told us is very reassuring. They could have chosen not to. Which would have been morally wrong. But if it was their intent to start abusing our trust by covertly scanning for other things in the cloud, then telling us upfront about this feature would be a pretty dumb way to start.

Based on everything Apple has done for privacy so far I’m willing to continue to trust them. They didn’t do a good job of explaining this technology but I do believe they’re trying to do what is right, not what is easy. If in the future they go back on their word regarding widening the scope of the system to detect other things, I’ll be the first to call them out on it and call for them to be sued into the ground.

Until then I remain confident in Apple’s commitment to privacy.

Welcome to MacRumors, Tim.
 
The thing is that Apple came last to this fiasco... so if you are losing confidence in Apple, there is no one else... back to the Nokia 6310 era?!
Personally I think the Nokia 3110 will be the one.
Incredible battery life!
Heck, you can easily bring a spare battery if you stay longer than 2 weeks in the middle of nowhere, for the other two weeks...
 
Has anybody stated how big the database of image fingerprints stored on my iPhone is going to be? If child porn is such a huge problem, the database is likely very large and may take up a significant chunk of my available storage.

It's only a hash value per image. I would estimate the size to be 5-20 MB, but it depends on how many images there are, how large the hash is per image, and whether they have some additional data which needs to be stored.
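As a back-of-the-envelope example (both numbers below are assumptions for illustration, not published figures):

```swift
// Rough size estimate for the on-device hash table. Both numbers are guesses;
// Apple hasn't published the exact entry count or per-entry size.
let assumedHashCount = 200_000      // assumed number of known-image hashes
let assumedBytesPerEntry = 50       // assumed blinded hash plus overhead
let estimatedBytes = assumedHashCount * assumedBytesPerEntry
print("~\(estimatedBytes / 1_000_000) MB")   // ~10 MB under these assumptions
```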
 
  1. First step - Get people to accept scanning software running on their phones (only to search photos being pushed to icloud) - for the children!
  2. Second - roll this out to laptops, desktops and all devices... You know those pervs who look at images on their Apple Watch!
  3. Third - Expand to scan all Apple software (Spotlight, iPhotos, Safari etc)
  4. Fourth - Scan third party software (video players, image viewers, zip apps)
  5. Fifth - Scan local hard drive and alert authorities
  6. Sixth - Scan for harmful keywords, contacts, messages, documents
  7. Seventh - Children are now safe!
Meanwhile Law enforcement slowly chips away with court orders year after year (let me see Billy Bob's photos - to prevent an imminent child abduction or for national security)

Your phone, thoughts, writing, photos, friends, conversations, political leanings, links (read and shared) - are all under the boot being scanned. And, let's not even talk about China.

Apple - you make phones, you're not the police, you're a corporation. The only reason you exist is because people buy your phones. Making moves that cause people to not buy your phones or identify with your brand is like the Wicked Witch deciding it's a good idea to get into a water-gun fight. We may currently still like your products, but at the end of the day we don't need you.

The reason you're as powerful as you are today is because of the shared ethos you managed to build with people - to think different, be different, get people excited about technology and how to use it creatively. I don't remember voting for you and pinning a shiny sheriff star on your chest. Nor do I have any oversight of what you're doing like we 'supposedly' should have with elected officials and law enforcement.

Now introducing a spyware platform across over 700 million active phones (that we never asked for) and trying to convince us it's positive? Meh... Maybe we aren't aligned as much as I thought.

Apple = cool
Apple = evil
 
You are conflating two different programs. The 16-year-old in your scenario will only get lectured if their parents enable that feature. Parents always have the right to monitor a child’s content; also, federal law has a very clear age of majority.

Perhaps  should also allow parents to block any iMessages that are pro-vaccine?
 
The height of arrogance. Apple > Nations? Maybe, until they aren't. Another example of 'The path to hell is paved with good intentions'. And then there's the hubris of 'you're too dumb to understand what we really meant to mean' BS.

Well, many MacRumors commenters have been calling for Apple to pull out of nations that do not follow Apple's guidelines for well over two years.

Apple > Nations isn't new at all. And if you read Apple's PR response, every time they list *their* "App Economy" and how many jobs they created.

It is kind of funny that MR this time around is extremely one-sided against Apple. The usual list of Apple apologists is nowhere to be seen. Not even in the downvoting.
 
1) Where did you get your statistics, then? The idea that 70% of women and 50% of men were sexually abused as children sounds absolutely insane to me.

2) We are talking about child sexual abuse here, not whether an individual feels that their parents emotionally abused them. Apple isn't scanning for that anyways (yet). If you got your 70% statistic by including that, your statement was false. I'd also like to see your source for that.

There are many who use the shocking concept of child sexual abuse to embellish the facts for their own purposes, which does a grave disservice to those out there who were actually abused.
 
Being required to remove child pornography doesn't imply being required to actively search for it.

The federal statutes governing child pornography explicitly say providers (such as Apple) don't have to actively search for it. They do have to report it when they become aware of it.

Exactly. I’ve been explaining this repeatedly to “Jimmy Hook” who refuses to believe it.
 
They get an image tagged "this is child porn" and they still store it on their servers. Sounds like they are not making any effort at all. Courts will likely agree that they are not making a good faith effort when they ignore the blatant tagging. What they should be doing is forwarding the image with proper identifying info to the proper authorities in the jurisdiction of the uploader and then deleting it.

Apple doesn't know about the matches until the threshold is reached.

The system is designed in such a way that Apple can't count the number of matches in any meaningful way.

The proper "authority" in the US is NCMEC, a non-profit organization; reporting to it is required by federal law, and that is what Apple will do once they know about the content and have conducted a review.
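For anyone wondering how a system can be built so the operator can't count matches below a threshold, the usual tool is threshold secret sharing: each matching voucher carries one share of a per-account key, and the shares reveal nothing until enough of them exist. Here's a toy Shamir-style example over a tiny prime field. It's nothing like production crypto parameters and not Apple's actual code, just the mathematical idea.

```swift
import Foundation

// Toy Shamir secret sharing over GF(257). Fewer than `threshold` shares give
// away nothing about the secret; `threshold` or more reconstruct it exactly.
let p = 257

func modPow(_ base: Int, _ exp: Int, _ mod: Int) -> Int {
    var result = 1, b = base % mod, e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % mod }
        b = b * b % mod
        e >>= 1
    }
    return result
}

// Split `secret` into `count` shares; any `threshold` of them reconstruct it.
func split(secret: Int, count: Int, threshold: Int) -> [(x: Int, y: Int)] {
    let coeffs = [secret] + (1..<threshold).map { _ in Int.random(in: 0..<p) }
    return (1...count).map { x in
        let y = coeffs.enumerated().reduce(0) { acc, term in
            (acc + term.element * modPow(x, term.offset, p)) % p
        }
        return (x: x, y: y)
    }
}

// Lagrange interpolation at x = 0 recovers the secret.
func reconstruct(from shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, si) in shares.enumerated() {
        var num = 1, den = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = num * (p - sj.x) % p                   // (0 - x_j) mod p
            den = den * ((si.x - sj.x + p) % p) % p      // (x_i - x_j) mod p
        }
        let basis = num * modPow(den, p - 2, p) % p      // modular inverse
        secret = (secret + si.y * basis) % p
    }
    return secret
}

let shares = split(secret: 42, count: 10, threshold: 3)
print(reconstruct(from: Array(shares.prefix(3))))   // 42
// With only two shares, every possible secret remains equally likely.
```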
 
There are many who use the shocking concept of child sexual abuse to embellish the facts for their own purposes, which does a grave disservice to those out there who were actually abused.

It’s kind of like using the excuse of “But it’s my religion” to justify just about anything.
 
It’s the overall ideology he is criticizing. Use your head.

He is saying that the technology can be misused. I am asking how, and I don't really get a detailed answer that takes into account how the system works.

It seems very few people know about the weaknesses of this system which make it ill-suited for many applications of surveillance.
 
You either have to scan everything on the server and share all of your data with Apple.

The exact same methodology they are going to use on your device would/could/should be used on their servers.

Our iCloud data is already not E2EE - just scan on servers as every other host already does.

What most of us are objecting to is building the infrastructure to do this -- on our devices.
 
It's not.

When the CSAM Detection system scans the photo, the photo is in an unencrypted state. There is no encryption to break.

When the system creates the safety voucher, again the photo is in an unencrypted state. The safety voucher, which includes a derivative of the unencrypted photo, is encrypted with a secret key unknown to Apple and known only to the device, not to the user. But every safety voucher contains part of the key necessary to unlock this encryption. When Apple has collected enough safety vouchers, they have enough of the key to read all the safety vouchers and the pictures within them.

No encryption was broken, since the encryption was employed to allow only Apple to read it. If the user or someone else could read it, then the encryption would be broken.

When iOS sends the regular photo to iCloud Photo Library, it's encrypted with two keys, one held by the user and one held by Apple. This means that both Apple and the user can read the content. Encryption is only broken if someone else is able to read it.

Encryption != Only I can read it
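The relationships described above can be sketched with Apple's public CryptoKit, just to show who holds which key. This is not the actual voucher format, only an illustration of the two situations in this post.

```swift
import CryptoKit
import Foundation

do {
    // 1. Safety voucher: the payload (a visual derivative of the matched photo)
    //    is sealed with a device-generated key the user never sees. Each real
    //    voucher would also carry a *share* of that key, so Apple can only
    //    reconstruct it once enough vouchers have accumulated.
    let voucherKey = SymmetricKey(size: .bits256)
    let visualDerivative = Data("low-res derivative of matched photo".utf8)
    let voucher = try ChaChaPoly.seal(visualDerivative, using: voucherKey)
    print("Voucher payload: \(voucher.combined.count) bytes of ciphertext")

    // 2. Regular iCloud photo: readable by both parties, because Apple holds
    //    a key as well (which is why iCloud Photos is not end-to-end encrypted).
    let contentKey = SymmetricKey(size: .bits256)
    let userCopy = contentKey    // the user's copy of the key
    let appleCopy = contentKey   // Apple's copy of the key
    let photo = Data("full-resolution photo bytes".utf8)
    let stored = try ChaChaPoly.seal(photo, using: userCopy)

    // Either copy of the key opens the stored photo.
    _ = try ChaChaPoly.open(stored, using: appleCopy)
} catch {
    print("Crypto error:", error)
}
```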

All content on the iPhone is encrypted. And  touts that with ad lines like, “What’s on your iPhone stays on your iPhone.” But with the new code, which is inserted INSIDE the encrypted phone (backdoor access by Apple), the encrypted photos are DECRYPTED to be evaluated against the known child porn hashes and then sent to Apple for further evaluation after a match. So everything you thought was private and protected and encrypted on your phone is now accessible by Apple, who does not need a key to see it because they are BYPASSING the encryption protection by running software from within the phone that sends photos OUT of the phone to THEM.
 
But I’d only want that to be the case if it was possible to do it in a way that meant Apple or others couldn’t see my entire photo library. As far as I know that technology doesn’t exist.

You realize that they can't, right?
This CSAM-specific situation is a hash database comparison.

(mind you, if they find anything, the walls start coming down..)

The exact process that defenders of all this are telling us is "secure and private" when done on our own phones should be every bit as "secure and private" when done on Apple servers.

Almost all the objection here is simply to the "doing this stuff on our devices" part of things.
It crosses a line.
 
If Apple is smart, Craig’s interview will be the last you hear about this from them.

You will all get tired as no one responds to your conspiracy theories.

99% of Apple users will accept this and move on (because that’s the same number of people who will have no idea this is even a thing)…record sales of iPhone 13 this Fall.
 
Saying I was unaware of child porn on my personal storage systems doesn't get me off the hook for possession. Same should apply to Apple on their storage. Particularly when they are notified that the image is child porn when it is uploaded to storage they own.

Yes, you do get off the hook for possession.

Everywhere in the statute they use the word 'knowingly'.

An affirmative defense if you discover child pornography in your storage system is to promptly delete it.

(2) promptly and in good faith, and without retaining or allowing any person, other than a law enforcement agency, to access any visual depiction or copy thereof—
(A) took reasonable steps to destroy each such visual depiction; or
(B) reported the matter to a law enforcement agency and afforded that agency access to each such visual depiction.
 
After reading an exhaustive number of replies, for me it comes down to this:
  • Prior to launch: Police need probable cause then get a judge-issued warrant to search through your personal things
  • Post launch: Apple will search your devices automatically and notify police with no warrant required.
If you strike out the 'child' language and put in just about any other term, people would be outraged. But 'for the children' makes everything a-okay. Today's 'child photos' are tomorrow's 'extremist content'.
 
Who is pushing this idea?

This is a great question. Developing this obviously cost money. And it's not as if Apple is going to gain marketshare through this. Like, is there an android user out there who thinks... ooohh apple now prevents child pornography, my next phone will be an iPhone. Or even retain an outgoing iPhone customer because Apple is pushing this idea. What's the upside for Apple?

And you're right, if Apple has been forced, they should absolutely come out and say that.
 
Yeah it really does, because this needs time and input and discussion and debate..

For which they are allowing very little, as the new software releases are imminent.
I can agree with you on having time to discuss and debate for sure. But then that is what’s happening now a month before release.

Will any changes be made? Who knows. An old Steve Jobs interview comes to mind from the ‘Thoughts on Flash’ era, one he did at D8. I can’t remember the exact quote in full so you’ll have to forgive me, but it was basically to say: people will vote with their wallets. And “if we’re making the wrong decisions we listen to the market. But so far people seem to be liking iPads”.

In other words if any change does come about, I’d like to think it was from consumer pushback. But the biggest pushback (and perhaps the only one that Apple might listen to) is to not buy their products. Which I just can’t see happening personally 😅
 