OK but this doesn’t change anything.
People who are overly paranoid about this feature are still going to be overly paranoid about this feature.
But I think the funniest thing is that the place where I've seen the most paranoia about this feature is Facebook.
If you use Facebook, this feature shouldn't even slightly concern you, because your privacy is already gone.
You are mistaken. You can use FB just to get information. You can create a fake account, connect through a VPN, and use a dedicated browser app that doesn't allow tracking cookies. You don't have to share all your photos with FB. It tries to get as much information about you as possible, but you can reduce it to an acceptable minimum.
 
Apple's Head of Privacy, Erik Neuenschwander, has responded to some users' concerns around the company's plans for new child safety features that will scan messages and Photos libraries, in an interview with TechCrunch.


When asked why Apple is only choosing to implement child safety features that scan for Child Sexual Abuse Material (CSAM), Neuenschwander explained that Apple has "now got the technology that can balance strong child safety and user privacy," giving the company "a new ability to identify accounts which are starting collections of known CSAM."

Neuenschwander was asked if, in retrospect, announcing the Communication Safety features in Messages and the CSAM detection system in iCloud Photos together was the right decision, to which he responded:

When asked if Apple was trying to demonstrate to governments and agencies around the world that it is possible to scan for illicit content while preserving user privacy, Neuenschwander explained:

He was asked if Apple had created a framework that could be used for law enforcement to scan for other kinds of content in users' libraries and if it undermines Apple's commitment to end-to-end encryption.

Neuenschwander was then asked if Apple could be forced to comply with laws outside the United States that may force it to add things that are not CSAM to the database to check for them on-device, to which he explained that there are a "number of protections built-in" to the service.

Neuenschwander continued that for users who are "not into this illegal behavior," Apple gains "no additional knowledge about any user's cloud library," and "it leaves privacy completely undisturbed."

See TechCrunch's full interview with Neuenschwander for more information.

Article Link: Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns
So I have a simple solution to this privacy dilemma: all devices using this update get a 90% discount on the purchase price, and all devices not using this government backdoor stay at regular price.
 
Here is an example of possible matches.


But he is using open-source solutions as his examples. The algorithm he is using is from the mid-nineties, and a lot has happened since then. None of them use what Apple and Microsoft are using.

Perceptual hashes are a large category of algorithms, and Apple's NeuralHash consists of one part which is not a perceptual hash and a second part which uses quite a new approach to LSH (locality-sensitive hashing).

Facebook uses PhotoDNA from Microsoft, and reportedly files millions of reports every year. If PhotoDNA were as bad as this example, I don't think Facebook, Google, Microsoft and others could have used it.

NeuralHash is probably similar to PhotoDNA in quality.
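To make the comparison concrete, here is a minimal sketch of the general idea behind perceptual hashing: reduce an image to a compact fingerprint, then compare fingerprints by Hamming distance. This is a deliberately simple "average hash" for illustration only; it is not NeuralHash or PhotoDNA (both are far more sophisticated, and NeuralHash additionally uses a neural network and LSH), and the tiny 4x4 "images" are made-up data.

```python
# Illustrative sketch of a simple perceptual hash (average hash).
# NOT NeuralHash or PhotoDNA -- just the basic concept: a compact
# fingerprint that is stable under small changes to the image.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns an int bitmask:
    one bit per pixel, set if that pixel is at or above the mean brightness."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two versions of the same tiny 4x4 "image": the second has small
# brightness perturbations, as if it were re-encoded or resized.
img1 = [[10, 200, 10, 200],
        [200, 10, 200, 10],
        [10, 200, 10, 200],
        [200, 10, 200, 10]]
img2 = [[12, 198, 11, 201],
        [199, 12, 202, 9],
        [11, 203, 10, 199],
        [201, 9, 198, 12]]

h1, h2 = average_hash(img1), average_hash(img2)
print(hamming(h1, h2))  # → 0: small pixel noise doesn't change the hash
```

A cryptographic hash (e.g. SHA-256) would produce completely different digests for these two images; a perceptual hash is designed to do the opposite, which is also why collision quality varies so much between algorithms.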
 
I have read it. Data that contains information about the content on my device leaves the device.

Yes, if you have iCloud Photos turned on. But that's the purpose of that setting: sharing information from your device to the cloud.
 
The biggest problem is the fact you have no way to verify what the CSAM software is doing in the background and you just have to take Apple's word that turning off iCloud Photos disables this process.

That's true for almost everything in the OS, especially on iOS.

When you are using an operating system which you haven't compiled yourself, you have to trust the developer of the OS to a large degree.
 
They weren't "already" using our devices to do it.

The Photos app has been scanning and analysing local photos for many years already.

How do you know Apple hasn't been forced to upload certain images by certain governments?

These systems raise no new privacy questions at all for me.
 
It's in the EULA you already agreed to. They already do plenty of on-device scanning to which you have already consented. If anything, this is less invasive, as it's based on hashing rather than using ML to identify faces and objects.

The EULA said I agreed to have my on-device data scanned against third-party databases whose contents even the manufacturer of the software (Apple) doesn't know?

How did the EULA anticipate something they are just planning on doing now?
Or are you saying there will be a new EULA once these changes are implemented in iOS 15 and macOS Monterey?
 
The EULA said I agreed to have my on-device data scanned against third-party databases whose contents even the manufacturer of the software (Apple) doesn't know?

How did the EULA anticipate something they are just planning on doing now?
Or are you saying there will be a new EULA once these changes are implemented in iOS 15 and macOS Monterey?
“14. Third Party Acknowledgements. Portions of the Apple Software may utilize or include third party software and other copyrighted material. Acknowledgements, licensing terms and disclaimers for such material are contained in the electronic documentation for the Apple Software, and your use of such material is governed by their respective terms. “
 
“14. Third Party Acknowledgements. Portions of the Apple Software may utilize or include third party software and other copyrighted material. Acknowledgements, licensing terms and disclaimers for such material are contained in the electronic documentation for the Apple Software, and your use of such material is governed by their respective terms. “

That's very vague, and not really worded in a way that covers what they are actually proposing here. I'm not sure that would legally fly. I'd bet things get amended and added when/if this rolls out.

Just out of curiosity - are you in favor of this?
 
That's very vague, and not really worded in a way that covers what they are actually proposing here. I'm not sure that would legally fly. I'd bet things get amended and added when/if this rolls out.

Just out of curiosity - are you in favor of this?
Also iCloud terms:

“However, Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable.”
 
Also iCloud terms:

“However, Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable.”

On iCloud, yes

I think we'd all be fine if they stick to only doing iCloud server side scanning, as they have been.

Everyone is objecting to the scanning implementation taking place on users devices themselves.
That's been made very clear in all the threads of discussion on this.
 
On iCloud, yes

I think we'd all be fine if they stick to only doing iCloud server side scanning, as they have been.

Everyone is objecting to the scanning implementation taking place on users devices themselves.
That's been made very clear in all the threads of discussion on this.
It’s already scanning things, this is less invasive than what it currently does. If you don’t like on device scanning, buy a dumb phone.
 
It’s already scanning things, this is less invasive than what it currently does. If you don’t like on device scanning, buy a dumb phone.

It's not, currently, scanning on-device and comparing against external third-party sources.
The issue is that the capability now exists, on-device, to do things this way, and the ramifications of that moving forward.

I also dispute the point about it being less invasive; agree to disagree on that, I guess.

Don't be so dismissive at the end please.
I respect your opinion - please respect mine.
 
You clearly haven’t, given your responses. If you’d like to be respectful, then go read the documents I linked for you instead of carrying on a silly conversation.

You done?
Please don't reply to me anymore (I'll do the same)
Good evening.
 
Matthew Green had a great thread today.
I'll post it below so folks don't have to piece the Twitter thread together themselves.

Everyone keeps writing these doomed takes about how “the US government is going to force tech companies to comply with surveillance, so they might as well just give in preemptively.” Like it’s inevitable and we should just hope for what scraps of privacy we can.

Even I was pessimistic last week. What I’ve seen in the past week has renewed my faith in my fellow countrymen — or at least made me realize how tired and fed up of invasive tech surveillance they really are.

People are really mad. They know that they used to be able to have private family photo albums and letters, and they could use computers without thinking about who else had their information. And they’re looking for someone to blame for the fact that this has changed.

People are telling me that Apple are “shocked” that they’re getting so much pushback from this proposal. They thought they could dump it last Friday and everyone would have accepted it by the end of the weekend.

I think that reflects Apple accepting the prevailing wisdom that everyone is just fine having tech companies scan their files, as long as it’s helping police. But that’s not the country we actually live in anymore.

Anyway, I don’t revel in the fact that Apple stuck their heads up and got them run over by a lawn mower. I like a lot of the people on Apple’s security team (I turned down a job there a few years ago.) But people need to update their priors.

At the end of the day, tech companies do care a lot about what their users want. Apple has heartburn about this *not* because Congress passed a law and they have to do it. They’re panicked because they did it to themselves, and they can’t blame Congress.

A few folks in Congress, for their part, have been trying for years to pass new laws that force providers to include mandatory backdoors like this new Apple one. They failed repeatedly. In part they failed because these systems aren’t popular.

And so the shell game has been to play one against the other. Congress can’t quite pass laws requiring backdoors because there’s no popular support. But providers somehow have to do it voluntarily because otherwise Congress will pass laws.


From his Twitter: https://twitter.com/matthew_d_green
 