They can educate everyone as much as possible, but I think the social court has already made its emotional ruling.

The people agreeing with this are the ones making an emotional ruling, i.e. "child porn is bad, so this is good."

The end just doesn't justify the means. The box it opens is way, way too invasive and way too dangerous; nobody should want a mechanism like this in place, no matter what it does.
 
It might lead to a false match, though. And if you sent these photos to someone (for instance via email), they might well be in the CSAM database.

How do you know that cat images won't have a greater probability of causing a false match?

The images you sent by email (?) will only end up in the CSAM database if they are forwarded to NCMEC and NCMEC determines, by visual inspection, that they are child pornography and decides to add them to its database.

At this point, Apple has no way of making that image part of their system. Apple has no access to NCMEC's database.
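To make the hash-matching point above concrete, here's a toy sketch of how perceptual hashes get compared: by similarity rather than exact equality, which is why a resized copy still matches and why an unrelated photo could, in principle, collide. This is NOT Apple's NeuralHash; the hash values and the bit threshold below are made up, purely for illustration:

```swift
import Foundation

/// Count the differing bits between two 64-bit perceptual hashes.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

/// Treat anything within `threshold` differing bits as a "match".
func isMatch(_ a: UInt64, _ b: UInt64, threshold: Int = 4) -> Bool {
    hammingDistance(a, b) <= threshold
}

// Hypothetical hash values, purely for illustration.
let knownCSAMHash: UInt64 = 0xA3F1_07C2_9B44_D210
let resizedCopy:   UInt64 = 0xA3F1_07C2_9B44_D211   // near-duplicate: 1 bit differs
let catPhoto:      UInt64 = 0x19E7_5530_C8AD_0F6B   // unrelated image

print(isMatch(knownCSAMHash, resizedCopy))  // true  – survives re-encoding/resizing
print(isMatch(knownCSAMHash, catPhoto))     // false – though collisions are possible in principle
```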
 
No. There are two separate "features."

The first one, the one you are describing, is just local self-"censorship" for child (under-13) accounts. It's opt-in, so you can ignore it altogether if you want to. Most kids lie about their birthdate on online accounts anyway so they can get full features. It uses on-device ML to "guess" at photos, blurring them out and putting up a warning if it thinks something is explicit. That's about it. And the important part: it's OPT-IN. That's the key. You, as a parent, make the conscious decision whether to enable this or not.

The second feature, the CSAM part, is forced on you with iOS 15, at least the on-device scanning part. There are two phases to the process. First, your device running iOS 15 will scan and check whether hashes of your photos match the database built into iOS 15. This is done whether you want it or not. There's no way to opt out other than to stop using iPhones (and later, Macs). If there are positive matches, the iPhone generates vouchers. If you don't use iCloud Photos, the process stops here. If you do use iCloud Photos, then after a certain threshold (Craig said 30), iCloud will flag the account, and the photos in those positive vouchers will get decrypted, reviewed, and reported to the authorities.
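To make that flow concrete, here's a rough sketch of the threshold logic as I understand it from Apple's description. The type names and the way the match is represented are hypothetical, not Apple's actual API (in the real system the match result is hidden cryptographically until the threshold is crossed):

```swift
import Foundation

struct SafetyVoucher {
    let photoID: UUID
    let matchedKnownHash: Bool   // in the real system this is not visible to the server per-photo
}

struct AccountReview {
    static let threshold = 30    // the figure Craig mentioned

    /// Server-side check: the account is only flagged once enough vouchers are positive.
    static func shouldFlag(_ vouchers: [SafetyVoucher]) -> Bool {
        vouchers.filter { $0.matchedKnownHash }.count >= threshold
    }
}

// Example: 1000 uploaded photos, 12 of which matched the on-device database.
let vouchers = (0..<1000).map { i in
    SafetyVoucher(photoID: UUID(), matchedKnownHash: i < 12)
}
print(AccountReview.shouldFlag(vouchers))   // false – below threshold, nothing is reviewed
```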

The first feature I have no issues with. It's opt-in, meaning we, the users, have control over whether we want to use it or not. The second feature I do have issues with, since the on-device scanning part is compulsory when you upgrade to iOS 15. The database is baked into iOS 15, and it's opaque. There's no way for you to know what's being scanned (Apple doesn't know either; they rely on the hashes from other parties). It's like being searched for "illegal" stuff without knowing what counts as illegal. Then, on the iCloud side, Apple hasn't described any appeals process. You're basically trusting Apple to be judge and jury based on something you cannot even check yourself. This is the chilling part, especially once we consider what can be deemed illegal in some countries.
Thanks for the detailed response and explanation.

Are you 100% sure about the two aspects?
Following the TWiT podcasts all this past week, more clarity emerged as the week went on, and I'm fairly sure they ended up saying there were three aspects to this new Apple plan.
 
Thank-you for taking the time to write a thoughtful post.

The vast majority of the discussion on this issue seems unwilling to examine Apple's motivations for such an apparent heel turn on privacy.

There are quite a few valid posts here and people do open up interesting perspectives, but there are also a number of reactionary posts, which is not surprising: privacy is a very personal matter. People also have different levels of what they find comfortable. Take personal space as an analogy: some might be fine with you breathing down their neck, others will need you to stay a metre and a half away.

I am legitimately concerned for the future of our privacy, but I am also aware that the digital world presents its own set of challenges and opportunities in this regard. Finding the right balance is not easy. I am sure that we will strike the right balance at some point, but that could take years or even decades. We might find ourselves in a world of eroded privacy and need to fight back to regain it; that is a scenario best avoided. That is why this needs more time before being implemented: more open discussions, more educational material on the matter, and more research done into it, including sociological and anthropological views.

Only then should a decision be taken. Once something is out, it's out. Pandora's box is an old tale, but it is still relevant today.
 
And people will refuse to read anything about it and continue to lash out about privacy.

Maybe actually understand what is really going on, how realistically it would affect you in a negative way, and what Apple's end game with this tech really is.

And no, "total invasion of your privacy" is not a real answer.

Fundamentally people understand what is going on. They don't like the feel of it.

This isn't just dumb customers who don't understand technology. Apple's own employees have been registering their own concerns on internal communication systems.

Apple did a really awful job of communicating: they just laid out technical documents to explain what they were doing and then allowed the media to write the script. This let outsiders build the narrative around privacy invasion, and all kinds of privacy groups that Apple usually cites as being supportive of them and their policies have been piling on.

Apple did relatively quickly recognise they were shredding their reputation as the privacy company, and are now trying to undo the damage. The problem is it all looks a bit desperate now.

Rene Ritchie did an outstanding job of explaining the whole mess in a lengthy YouTube video. He's always pretty supportive of everything Apple does (to say the least), but even he registered concerns and some unease about it.

Ultimately people will move on from this in the weeks ahead, and it will be largely forgotten. It's just going to be much harder for Apple to build effective marketing campaigns around privacy for the moment.
 
Thanks for the detailed response and explanation.

Are you 100% sure about the two aspects?
Following the TWiT podcasts all this past week, more clarity emerged as the week went on, and I'm fairly sure they ended up saying there were three aspects to this new Apple plan.
I don't know what the TWiT podcast is, but I'd rather read it myself than have others interpret it for me (with their own agendas), since the white paper is readily available. You can read Apple's own white paper that explains it. IMO the infographic from earlier MR posts clearly showed the process.
 
It seems to me Apple is saying: we can do this yet others can't (which is a recipe for disaster), and anyway it meets our privacy standards, so that's just fine and dandy. And to those who work in the privacy field and really don't like what we are doing because of its implications: tough luck, we are far too big and wealthy for you to do anything about it anyway.
 
There is no confusion. The technical details don't matter. Apple has no business scanning my phone for anything, period.

If they go through with this, I may have to suffer using an Android phone next...
Google is no different

 
There are quite a few valid posts here and people do open up interesting perspectives, but there are also a number of reactionary posts, which is not surprising: privacy is a very personal matter. People also have different levels of what they find comfortable. Take personal space as an analogy: some might be fine with you breathing down their neck, others will need you to stay a metre and a half away.

I am legitimately concerned for the future of our privacy, but I am also aware that the digital world presents its own set of challenges and opportunities in this regard. Finding the right balance is not easy. I am sure that we will strike the right balance at some point, but that could take years or even decades. We might find ourselves in a world of eroded privacy and need to fight back to regain it; that is a scenario best avoided. That is why this needs more time before being implemented: more open discussions, more educational material on the matter, and more research done into it, including sociological and anthropological views.

Only then should a decision be taken. Once something is out, it's out. Pandora's box is an old tale, but it is still relevant today.
Agree. There are groups working in the field that actually invited Apple for a discussion, but Apple declined and jumped the gun. It's very strange. And the worst part is the statements coming from Apple's own privacy head, which basically imply that people should just not do "illegal" stuff.

Well, guess what: that definition varies widely between countries. And it's sad if Tim Cook is okay with that statement. In some countries, being LGBT is illegal and punishable by jail time or even capital punishment. Apple's statement is extremely ignorant. I actually believed that Apple was the last Silicon Valley company that still wanted to do the right thing. But I guess all the talk about human rights was just marketing. The next time Apple says anything about human rights in a keynote, I'll be rolling my eyes.
 
Apple gives law enforcement agencies in the US the entirety of your iCloud if served with a warrant or similar.
I assume they do the same thing in China.

I have no problem with such a policy.
The difference is that Apple is NOT a law enforcement agency, and this policy prejudices the work of those agencies, as it telegraphs exactly how a paedophile can evade scrutiny. In any event, Apple should not be incorporating such surveillance into customers' hardware; there is no excuse for it and no reason for it. If they want to make it part of iCloud, that's their choice, and it's our choice whether to use iCloud. If, however, they go ahead with this foolhardy idea of adding it to the software on all Apple devices, it is surveillance via someone else's property, a very slippery slope, and doing it in the name of child safety, when it does the opposite, is rather poor form.

As much as we might not want Apple to do this via iCloud, that would not compromise individual systems, whereas the method Apple proposes DOES compromise systems. For example, many UK government agencies use Apple devices in the field, many local authorities now use Apple equipment, and even agencies involved in fighting child abuse hold and pass on such pictures in order to fight the very thing Apple says it is doing this for. Those agencies' jobs will get harder and harder, and if Apple goes ahead with the proposed system across the billion or so customers' devices it has sold, many of them could no longer use that equipment, as it would breach the specifications required for such work. A similar situation would arise where businesses use Apple devices for sensitive work. I could not use the equipment either, and I suspect many others would CHOOSE not to, simply because, whichever way you look at it, installing software on the USER'S machine creates the potential for abuse of the system, whereas there is no reason why the picture hash checks could not be conducted via iCloud, leaving customers' equipment sacrosanct.

Personally, I find even the surveillance suggested, and it is surveillance, sad, because it will not achieve the objective Apple uses as the excuse for loading this software onto YOUR hardware. It will make life harder for the agencies tackling the problem, because miscreants WILL easily take evasive action, leaving only innocent Apple customers with software embedded in their private computers, for no good reason.

Apple can, like others, run CSAM checks on the server itself, leaving computers sacrosanct, but it seems to want to engage in surveillance on users' own equipment, using users' processing power and their electricity, which does set alarm bells ringing.
 
Google is no different

With respect, Google are different.

They do NOT have such a facility built into the operating systems of any devices that users have bought.

They run CSAM checks online, not within users' own hardware.
 
No big deal; I'm going to move away from iCloud and save myself the $9.99 a month. There's nothing even remotely bad on my phone, but I don't care enough to read their explanation of how it works and why, blah blah blah. It's their choice to do whatever they want, and if they want to scan photos before they're uploaded to the cloud, then fine. I just need an excuse to rid myself of one more pointless subscription anyway.
 
This might be the worst crisis Apple has faced in years.
I wonder if there is anything Apple can say or do, except withdraw it, that will fix this.
The "internet mob" has already decided this is bad and most people will just see the headline "Apple is scanning all your photos and will report you to the police" and have no interest in learning about the tech.
The ill will this has created is massive.

For the last decade Apple has been all about security and privacy. They even denied access to a terrorist's iPhone because they were afraid the hack or back door could be used by others for enhanced or even mass surveillance. Apple told us it was like opening a Pandora's box, and what comes out of it can't be put back.

A few years later, Apple is introducing mass surveillance of all iOS users. They will scan all of your pictures against a hashed image database which they can't control. In fact, Apple doesn't even know what that database contains. The database is controlled by NCMEC, a private non-profit organisation headed by a guy who was previously the director of security operations for information systems and global solutions at Lockheed Martin. It's skunk works all the way... For some reason many big tech companies use the NCMEC list instead of the Interpol list, which apparently contains far more data. The funny thing is that none of the big tech companies know exactly what the NCMEC list contains. Also, why is the data only about child porn; why isn't there a list of missing children, etc.?

So let's assume Apple and the rest of big tech manage to catch some of those with child porn. If it works, then why stop there? Why don't we scan for data relating to murders and other serious crimes? Scanning documents for financial crimes would most likely yield excellent results, so we should do that as well. If we get results, then we should focus on crime prevention: if the data suggests that someone is planning a crime or might be tempted to commit one, the authorities should be notified so appropriate steps can be taken to prevent it. Let's just monitor everything, because we can. Let's make this 1984.
 
...

I would urge anyone with information that could assist them to do so, as I doubt many people will be against fighting child pornography.

...
I absolutely agree that it must be fought. But I am not the police, and I absolutely do not want to be linked to even the slightest knowledge of activities or pictures of this kind.

So assisting any authority personally brings up the question: why do I know about that? Could it be...?
And then I can say goodbye to my life.

Let the police and authorities do their work. They are authorized to do that. I just have to stay away.
 
I don't know what the TWiT podcast is, but I'd rather read it myself than have others interpret it for me (with their own agendas), since the white paper is readily available. You can read Apple's own white paper that explains it. IMO the infographic from earlier MR posts clearly showed the process.

Honestly, the TWiT network of podcasts is made up of very knowledgeable people who are very Apple-focused, want to fully understand and explain all aspects, and have no personal agendas whatsoever.

Honestly I'd very highly recommend this if you'd like to watch:

Mac Break Weekly


Enjoy :)
 
What I find interesting is the timing of Apple's decision to implement this software. It comes shortly before the release of iOS 15 and a month before the official announcement of the iPhone 13.

That's really bad timing for ticking off a lot of users.

It's curious what might have prompted their decision...
 
What additional data does it upload that isn't already available in iCloud? Can it upload your fingerprint data? Can it upload your Face ID data along with the positive hash match? Can it upload your WhatsApp chats? What additional data outside of iCloud data can it upload? Please enlighten me 😂

Thank you for coming out and proving you have no idea how, and where, CSAM scanning takes place.

For matching images, it sends a voucher to the server containing the hash and a visual derivative of the image.
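For what it's worth, as I read Apple's technical summary the voucher isn't the photo in the clear: it wraps the NeuralHash plus a low-resolution "visual derivative," encrypted so the server can only open it once the match threshold is crossed. A hypothetical shape, not Apple's actual types:

```swift
import Foundation

/// Outer layer: a blinded header used for the private set intersection step,
/// so the server only learns about matches in aggregate, not per photo.
struct BlindedMatchHeader {
    let data: Data
}

/// Inner payload: the NeuralHash plus a low-res "visual derivative" of the photo,
/// encrypted with a key the server can only reconstruct after ~30 positive matches.
struct EncryptedPayload {
    let ciphertext: Data
}

/// Rough shape of what travels to the server alongside each uploaded photo.
struct SafetyVoucher {
    let header: BlindedMatchHeader
    let payload: EncryptedPayload
}
```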
 
I'm not a lawyer, an expert, or even an American, but even I know the 4th Amendment doesn't constrain the actions of private companies. I'm fairly sure that nothing else in the US Constitution does either.

Yes, you are correct. However, based on a summary ruling from the 10th Circuit that I was sent, it would seem that NCMEC is deemed to be a federal agency... I've pasted below what I just wrote in the "Craig F..." thread.

I base this on having just read the appeals decision summary of August 5th, 2016 that determined NCMEC is a federal agency.

In that case, the 10th Circuit decided that information scanned, seized, and sent by AOL to NCMEC, and then reviewed by NCMEC, constituted an unlawful search by a government agency.
 
What I find interesting is the timing of Apple's decision to implement this software. It comes shortly before the release of iOS 15 and a month before the official announcement of the iPhone 13.

That's really bad timing for ticking off a lot of users.

It's curious what might have prompted their decision...
Something must have happened during Tim Cook's summer camp at Sun Valley.
 
Mass surveillance of a billion iPhone users for what – now that every criminal has been warned?

Since it is on the device, it looks like a first step; the second step could be a neural network detecting new images (taken with the camera).

It's just unacceptable – I won't update software or hardware.
This is exactly what everyone who cares about privacy needs to do. Turn off iCloud for Photos and Messages, don't update, and don't buy the next Apple device.
 
From the document: Apple doesn't know what the hashes represent, and they include any hash that is published by at least 2 child safety organizations, after which the matching images are reviewed by a human.

If someone could somehow convince at least 2 organizations to publish an arbitrary hash, then also convince the reviewers to flag the images it represents, wouldn't that be a way to circumvent the safety measures described in the document, while at the same time providing Apple with plausible deniability?
Apple has always washed its hands of this. They use the pretense of "oh, we're just following local regulations" when asked about their decisions in countries like China, Russia, etc.

Countries don't even need to poison those hashes. Apple has already stated that the system can be tailored on a per-country basis. Let's say a country has a certain law, or just wants to censor certain things. They could simply go to Apple: "Hey Apple, here are our child-safety hashes, built by our own organization." Would Apple deny them? That country could then go to the press and claim Apple doesn't love children.

When one starts thinking more about this, it brings up a lot of issues and questions. But unfortunately, it looks like any kind of discussion is already off the table. "Pffft, screeching minorities..."
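On the quoted point about hashes needing to come from at least two child-safety organizations: that amounts to shipping only the intersection of the lists. A toy sketch of that idea; the organization names and hash values here are made up:

```swift
// Hypothetical hash lists from two organizations; values are placeholders.
let ncmecHashes: Set<UInt64>    = [0x01, 0x02, 0x03, 0x04]
let otherOrgHashes: Set<UInt64> = [0x03, 0x04, 0x05]

// Only hashes vouched for by both sources would make it into the shipped database.
let onDeviceDatabase = ncmecHashes.intersection(otherOrgHashes)
print(onDeviceDatabase.count)   // 2
```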
 