
I don't think a dictionary captures the nuance of privacy and data protection. Why would the EU have to develop a huge body of directives and laws for the GDPR? They could have just looked it up in their dictionaries.

Another problem with this is using a specific language. What if your language isn't English? How would you define privacy then?

"freedom from unauthorized intrusion"

You give Apple authorisation by accepting the EULA and ToS, and/or by using iCloud Photo Library.
 
I've been swaying back and forth on this one.

When Apple refused to unlock the San Bernardino shooter's phone, they claimed that once a back door was created it would set a dangerous precedent. Is this a back door? Why is it a back door? Because Apple could be compelled by unfriendly entities to scan for other images? I've seen some people comment that they consider it reasonable for other companies (Google, MS etc.) to scan for CSAM on their cloud infrastructure. But on that basis, couldn't Google, MS etc. be compelled in the same manner, just via the cloud?

It seems that, unlike Google, MS and others, Apple at least is not scanning all images.

Of course if your view is that no images should ever be scanned under any circumstances then I suppose that's a different story. But that ship's already sailed.
 
The fact that they didn't announce this as a feature at WWDC but saved it for a quieter announcement makes it all the more alarming.
Yup. The irony is that I think we would be talking far less about this if Apple had announced it at WWDC. It would have been more transparent and honest, and with all the new features, people would probably have had other things to discuss anyway. Not that I think it's good for people to talk about it less, but from Apple's standpoint I don't understand why they managed the announcement so poorly. I hope the scanning will be removed, but Apple is probably the most stubborn company in the world, so I'm skeptical.
 
They really don't get it. It's not the mechanics of how it's done, it's the fact that it's done at all. Once a system is in place to scan images for content, you no longer have any excuse not to implement additional scanning for governments when requested, e.g. Budapest declares it illegal to be LGBT and tells Apple to scan for any such imagery. Apple has no excuse not to comply. It's the law in that country; they can do it, so they would have to. No different from removing apps when governments force them to. They have to remove them because they can. This gives every government the ability to request image scanning.
They were scanning photos long before this feature was announced. How do you think they recognize people in your photos? Where was your outrage when that feature rolled out?
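
For what it's worth, the kind of on-device photo analysis that comment refers to has been available through Apple's Vision framework for years. A minimal sketch in Swift (illustrative only; the function name is mine and Photos' actual people-recognition pipeline is not public):

import Foundation
import Vision

/// Counts faces in an image file, entirely on-device; no data leaves the phone.
func countFaces(in imageURL: URL) throws -> Int {
    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])
    return request.results?.count ?? 0
}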
 
Wow. People really read what they want to read.
1. What back door? You are confusing things.
2. You can opt out: just don't use iCloud.
3. It scans a hash of the photos, not the photos.
4. Why are you so worried? About your consensual porn? The system does not care.
All of the security experts think it's a bad idea, and you know why: this feature will expand, and if we don't fight back now it will already be too late. It always starts with good intentions and "think of the children" explanations. We've seen other companies do this before.

Why would I not want to use iCloud? iPhones don't have an SD card slot.
The person below you replied beautifully to all of that.
 
Because it's on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software. So if any changes were made that were to expand the scope of this in some way —in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.
Say what? Oh dear it gets worse and worse with every word they say.
 
I don't think a dictionary captures the nuance of privacy and data protection. Why would the EU have to develop a huge body of directives and laws for the GDPR? They could have just looked it up in their dictionaries.
Exactly.

Don't. Look. At. Users'. Data. It. Is. Private.


Another problem with this is using a specific language. What if your language isn't English? How would you define privacy then?

"freedom from unauthorized intrusion"

You give Apple authorisation by accepting the EULA and ToS, and/or by using iCloud Photo Library.
There is no confusion. The Emperor is naked.

This issue is clear. It's my data. If you say my data will be private, then you CAN'T look at it.

The truth is that Apple exposed the "privacy lie" when it comes to digital devices and data, since they've been tooting the "privacy" horn loudly and constantly lately.

But we all knew that unless a system is standalone, it ISN'T private, regardless of what anyone says, in ANY language.
 


Apple's senior vice president of software engineering, Craig Federighi, has today defended the company's controversial planned child safety features in a significant interview with The Wall Street Journal, revealing a number of new details about the safeguards built into Apple's system for scanning users' photo libraries for Child Sexual Abuse Material (CSAM).


Federighi admitted that Apple had handled last week's announcement of the two new features poorly and acknowledged the widespread confusion around the tools, which relate to detecting explicit content in Messages for children and to CSAM content stored in iCloud Photos libraries.

The Communications Safety feature means that if a child sends or receives an explicit image via iMessage, they will be warned before viewing it, the image will be blurred, and there will be an option for their parents to be alerted. CSAM scanning, on the other hand, attempts to match users' photos with hashed images of known CSAM before they are uploaded to iCloud. Accounts that have had CSAM detected will then be subject to a manual review by Apple and may be reported to the National Center for Missing and Exploited Children (NCMEC).
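
To make that distinction concrete, here is a minimal Swift sketch of the hash-set idea. It is not Apple's implementation: the real system uses a perceptual "NeuralHash" rather than SHA-256, blinds the on-device database, and wraps match results in encrypted safety vouchers, but the basic shape is "compare a fingerprint of the photo against a list of known fingerprints". The function name and parameters here are hypothetical.

import Foundation
import CryptoKit

/// Returns true if the image's fingerprint appears in the set of known hashes.
/// SHA-256 stands in for Apple's perceptual NeuralHash; only the digest is compared,
/// and the photo's content is never interpreted in this step.
func matchesKnownImage(_ imageData: Data, against knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}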

The new features have been subject to a large amount of criticism from users, security researchers, the Electronic Frontier Foundation (EFF), Edward Snowden, Facebook's former security chief Alex Stamos, and even Apple employees.

Amid these criticisms, Federighi addressed one of the main areas of concern, emphasizing that Apple's system will be protected against being taken advantage of by governments or other third parties with "multiple levels of auditability."


Federighi also revealed a number of new details about the system's safeguards, such as the fact that around 30 matches for CSAM content will need to be found in a user's Photos library before Apple is alerted, whereupon it will confirm whether those images appear to be genuine instances of CSAM. He also pointed out the security advantage of placing the matching process on the iPhone directly, rather than having it occur on iCloud's servers.
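
A rough illustration of that threshold idea follows. In the real design the gate is enforced cryptographically with threshold secret sharing, so Apple cannot peek below the limit at all; this toy counter (the type and its names are mine) only shows the accounting.

/// Toy model of the ~30-match threshold: nothing is surfaced for review
/// until an account accumulates enough matches.
struct MatchThresholdGate {
    let reviewThreshold = 30
    private(set) var matchCount = 0

    /// Records one match and reports whether the account is now eligible for manual review.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= reviewThreshold
    }
}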

When asked if the database of images used to match CSAM content on users' devices could be compromised by having other materials inserted, such as political content in certain regions, Federighi explained that the database is constructed from known CSAM images from multiple child safety organizations, with at least two being "in distinct jurisdictions," to protect against abuse of the system.
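
In other words, a hash would only ship on devices if it is vouched for by the databases of at least two independent organizations, something like the intersection logic below. This is a sketch of the stated policy, not Apple's code, and the function name is hypothetical.

/// Keeps only the hashes that appear in at least two providers' databases,
/// so no single organization (or a government leaning on one) can add entries alone.
func buildShippedHashSet(from providerDatabases: [Set<String>]) -> Set<String> {
    var occurrences: [String: Int] = [:]
    for database in providerDatabases {
        for hash in database {
            occurrences[hash, default: 0] += 1
        }
    }
    return Set(occurrences.filter { $0.value >= 2 }.keys)
}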

These child protection organizations, as well as an independent auditor, will be able to verify that the database of images only consists of content from those entities, according to Federighi.

Federighi's interview is among the biggest PR pushbacks from Apple so far following the mixed public response to the announcement of the child safety features, but the company has also repeatedly attempted to address users' concerns, publishing an FAQ and directly addressing concerns in interviews with the media.

Article Link: Craig Federighi Acknowledges Confusion Around Apple Child Safety Features and Explains New Details About Safeguards
One thing that has yet to be detailed is how this will affect my phone's performance and battery life. I assume that if computational analysis is happening on my phone, both will take a hit.
 
As an opt-in parental control, the Messages feature seems less controversial. The iCloud scanning feature certainly sounds more invasive to the average user - but perhaps the big news here is that the other major companies already scan every single image uploaded to their cloud across the board, whereas Apple is attempting to be more selective.
 
I don't agree with this concept either.

A banker cannot look in my safety deposit box, even if it is stored in the bank.

This is exactly the same thing (in my mind at least).

That's what privacy means. You either have it or you don't.

Turns out, we don't, no matter what Apple says. Oh well.
Banks can actually be held liable for things in your account, though. For example, if you deposit over $10,000 and they don't report it to the government, they can be held liable for suspected money laundering. The safety deposit box is a bit different, obviously.

I do think they should use end-to-end encryption for iCloud, with the keys stored only on our own devices, but that's a wish they seem unwilling to entertain.
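
For anyone curious what that wish would look like technically, here is a minimal CryptoKit sketch of photos being sealed with a key that never leaves the device. It is illustrative only: key management via the Secure Enclave/Keychain is omitted, the function names are mine, and this is not how iCloud Photos works today.

import Foundation
import CryptoKit

// A symmetric key generated and kept on-device; the server only ever sees ciphertext.
let deviceOnlyKey = SymmetricKey(size: .bits256)

/// Encrypts a photo before upload.
func sealForUpload(_ photoData: Data) throws -> Data {
    try ChaChaPoly.seal(photoData, using: deviceOnlyKey).combined
}

/// Decrypts a photo after download; only possible on a device holding the key.
func openAfterDownload(_ sealedData: Data) throws -> Data {
    let box = try ChaChaPoly.SealedBox(combined: sealedData)
    return try ChaChaPoly.open(box, using: deviceOnlyKey)
}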
 
In summary: if you don't trust Apple, you won't believe them; if you trust Apple, you will.

All the responses here doubting Apple are built on not trusting Apple to begin with.

All the responses here trusting Apple are built on genuine customer loyalty.

That's basically it.
 
We’re not stupid, we know this is a back door for government surveillance of our personal photos.
People who engage in CSAM don’t store the images in iCloud.

Good job on spectacularly destroying the privacy focus you’ve worked so hard to build.
 
I'm not seeing the pushback; it's like people are defending child abusers (scum). Aren't they all using Android anyway? (That's a joke.)
 
The law states that if you download three images you have committed a felony, not 30.

Besides, your ISP has to report you if they detect you're doing this.
 
This is really bad news for privacy, and I am super sad that Apple is ruining its reputation with such an action.

To find maybe ONE single culprit, MILLIONS lose their privacy :(
It's completely stupid. I always remember the Metro Bank CEO: when asked why he did not install security glass in his banks, which was commonplace in UK banks at the time, he said something along the lines of: what is the point? One person will rob one bank, and it isn't very common, so why ruin the banking experience for everyone over such a thing?

Apple is doing the same thing here. It is damaging the brand badly when it wasn't even necessary. Apple already scanned photos in the cloud; that was sufficient. The "GestapoScan", as I will call it: why do they even need it?
 
The main goal is to make sure Apple doesn't get CSAM on its servers, so I don't know why Apple doesn't just do what Google, Facebook, etc. do and scan the photos in the cloud. That way, if you don't use iCloud, your stuff never gets scanned on device or off device.
The irony is, this is exactly the outcome that the intelligence agencies want.
 
They were scanning photos long before this feature was announced. How do you think they recognize people in your photos? Where was your outrage when that feature rolled out?
Yep. The technology to do that is here (and it's been here for some time now). We can complain about it, we can be suspicious, we can cry, but... they can't just make this 'feature' disappear. Pandora's box has been opened ;)
 