If only it were that easy; Android is trash. 🤦‍♂️😂
Why do people keep saying Android sucks? AFAIK AOSP Android, which is open-source and you can compile yourself, provides far better OS transparency. And in addition to that, Android has copied all of iOS's new privacy features. What's wrong with Android?

Don't tell me Apple has App Tracking Transparency. That feature is merely a renamed switch; on previous versions of iOS and on Android it's called "Limit Ad Tracking". (Read Apple's documentation; they say so themselves.)

If you don't like Google, just use a Google-less ROM with MicroG.

And in case Android implements a feature you don't like (e.g. CSAM), or has a bug they never seem to fix, you can simply modify the source code yourself. If you don't know how to do that, there are plenty of custom ROMs that will do this for you. AOSP Android actually gives you total control over your device.
 
They do content matching. Apple's NeuralHash algorithm extracts features from the picture and computes a hash in a way that ensures that perceptually and semantically similar images get similar fingerprints. Apple so far has not told how flexible this system is, but its aims go well beyond traditional pixel-based hash matching.
Are you sure?
As far as I understand, the issue has been that images can be scrambled etc., so it's possible that their hash will include a method that takes descrambling into account. But essentially the hash is looking for the exact image, not something that merely looks like it. In other words, you couldn't take a photo that looks like an image in the CSAM database and have it match. I don't think that's how it works at all.
 
Why do people keep saying Android sucks? AFAIK AOSP Android, which is open-source and you can compile yourself, provides far better OS transparency. And in addition to that, Android has copied all of iOS's new privacy features. If you don't like Google, just use a Google-less ROM with MicroG. What's wrong with Android?

The iPhone just works; with Android it's a fragmented, bloated mess. You have to run around in circles through pointless settings, disabling a hundred things just to set your phone up, and then hope it works properly. And that's just the beginning. 🤦‍♂️
 
But why? Why bother with the CSAM step at all? If they wanted to use the technology for detecting copyright infringement (or anything else), why not just go straight for that? What would they stand to gain by using CSAM scanning as a first step towards that?
Because they know this is a wildly unpopular idea and the best way to sell it is as though it's intended to combat CSAM.

That's called 'manufacturing consent'.
 
How do you not understand this? Nobody is afraid of their images being scanned. What people are worried about is the fact that once this door is open, it can never be closed. Apple cannot simply refuse a demand from any national government for the furtherance of this technology to be used in other applications. It WILL be abused. And there will be no turning back once it is released.

Honestly, if people like you don't understand the true issues of this situation, you really shouldn't be talking about it.

The door already opened in 2015 when the technological capability came about with the launch of iCloud Photo Library; just because Apple haven't actively used that technology up to now doesn't have any bearing on the potential for a government to have compelled them to use it at any time. If a government says to Apple "we know you have access to people's photos and the technology to scan them for pictures of XYZ", Apple can't just turn round and say "we could, but we don't yet offer that feature, sorry".
 
It’s the technology itself that is terrifying people, not Apple’s use of it, which I understand. You could essentially build a database of anything digital (pictures, movie files, PDFs), attribute a unique hash to each item, and scan for them across people’s devices.

The implications of this tech are indeed huge. But is there any evidence suggesting it isn’t already being used covertly on existing devices? Obviously no company has ever gone public about having knowledge of files stored locally on a device in this way, but could operating systems not already be analysing people’s files from a purely metadata-centric perspective and cross-referencing that data with existing databases without the user’s knowledge?

It seems like a very light and easily implementable mechanism to have built into an operating system and I would be more surprised if certain companies (not Apple) haven’t already implemented some level of this tech onto their devices with their users none the wiser.
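The mechanism described above really is lightweight in its simplest form. As an illustration only (the filenames and file contents below are made up, and real systems such as PhotoDNA or NeuralHash use perceptual hashes rather than plain cryptographic ones), here is what "hash every file and look it up in a reference database" amounts to:

```python
# Toy sketch of exact-match hash scanning. All names and data are
# hypothetical. A cryptographic hash like SHA-256 flags only byte-for-byte
# copies: changing a single byte produces a completely different digest.
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Reference database: a set of known-file digests (built here from toy data).
known_hashes = {sha256_of(b"known file contents")}

def scan(files: dict[str, bytes]) -> list[str]:
    """Return the names of files whose digest appears in the database."""
    return [name for name, data in files.items()
            if sha256_of(data) in known_hashes]

files = {
    "a.jpg": b"known file contents",   # exact copy -> flagged
    "b.jpg": b"known file contents!",  # one byte added -> not flagged
}
print(scan(files))  # ['a.jpg']
```

The fragility of exact matching is precisely why perceptual hashing exists: re-encoding or resizing an image changes its bytes, so a set lookup like this would miss it.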
 
The whole point of this isn’t to find and prosecute child abusers; it is to get a system in place for monitoring everything we own digitally. By saying they are going after paedophiles, they have chosen a subject that is very hard to argue against (witness the posts earlier saying you must be a paedo if you are against it), so the public will think it is all a good idea, until their door gets kicked in for a meme about transgender people or whatever else their government takes offence at.
You never know quite how slippery the slope is until you are on it and it is too late, and if you give power to anyone they will ALWAYS abuse it.

Agreed. Still within the period of living memory people were literally sentenced to death or the Eastern Front for nothing more than making jokes about Hitler or calling Goering fat.
 
Longtime reader, like since the Ti 667 days... Anyway, my initial reaction was that I will never use iCloud for photos, despite it being really convenient now that bandwidth is cheap. But then I realized that with this kind of enforced compliance with searching (not unlike what we accept with email virus scanners, by the way), and without an encryption offering to boot, scanning of this nature is a simple invasion of privacy. In this sense, it's an invasion of one's right simply to be confident of that privacy, and to feel good about it, within one's inner self. Abandoning the principle that even the thoughts you have, or the artworks you might make, are not subject to criticism or control, no matter what they are, has political consequences citizens of free societies should consider.

But Apple is a private company and therefore, I accept they have the right to make conditions of their service.

The philosophical societal issue that prevails, in my opinion, with Apple's new found crusade is this:

That should I not load my photos to iCloud, for the reasons outlined above, what is it that I am hiding? Therefore, I must submit to this service, be scanned, and prove my innocence. I am innocent, this is certain, but why should I be put into a position that makes me feel compelled to submit to a search to prove as much?

Deep down, and I might not be surprised, is this a business-building channel whereby I can outwardly prove my innocence to society because I bought Apple? "Using Linux? Are you some kind of pervert?"

The underlying message I get is: should I use Linux and back up to an external hard drive, people will view me as likely guilty of a crime...
 
Because they know this is a wildly unpopular idea and the best way to sell it is as though it's intended to combat CSAM.

That's called 'manufacturing consent'.

In what way is "we're going to extend our current searching of your photos for CSAM materials on our iCloud servers to scanning your entire local computer for copyright material" an easier sell than just announcing it directly?

If this was Apple's end game do you honestly think this would be the best possible way they'd come up with approaching it?
 
Without judging either side, I feel like the voices against the scanning get stronger by the day. It will be most curious to see what will happen during the rollout.
I imagine a class action lawsuit will ensue.

Better would be that suit now with an injunction preventing Apple from releasing this software.

Once that Pandora’s Box On A Slippery Slope becomes real it’s more likely than not the concept will be weakened at best rather than removed. In either case it will lurk waiting to expand its intrusions.
 
That should I not load my photos to iCloud, for the reasons outlined above, what is it that I am hiding? Therefore, I must submit to this service, be scanned, and prove my innocence. I am innocent, this is certain, but why should I be put into a position that makes me feel compelled to submit to a search to prove as much?
Apple is not asking you to prove your innocence. Apple is required to ensure that nothing illegal is stored on its servers, and it has already been scanning all photos uploaded to iCloud Photos for CSAM since 2019.
 
Because if they announced E2E encryption on iCloud Photos before doing this, there would be political blowback which, if sustained enough, might end with us all getting some US Senator’s ‘solution’ for child porn on iCloud, rather than a solution from technologists, cryptography experts, and privacy advocates.
I doubt Apple would do E2E encryption on iCloud. If they planned to do it, they would've made some noise at WWDC, because that's their marketing strategy. E2E encryption on iCloud would be a huge marketing opportunity.

The current Apple has had no problem announcing things for future releases for the marketing hype, even when they ended up not shipping (AirPower).
 
Whatever the content searched, this is Apple reversing course on their privacy stance.

The IRS is loving this advancement!
 
Are you sure?
As far as I understand, the issue has been that images can be scrambled etc., so it's possible that their hash will include a method that takes descrambling into account. But essentially the hash is looking for the exact image, not something that merely looks like it. In other words, you couldn't take a photo that looks like an image in the CSAM database and have it match. I don't think that's how it works at all.
As per Apple's technical whitepaper:
"Perceptual hashing bases this number on features of the image instead of the precise values of pixels in the image. (...) ensures that perceptually and semantically similar images have close descriptors (...) Perceptually and semantically different images have descriptors farther apart..."

A lot here depends on the thresholds and specific features. Apple may very well have tuned it so strictly that images need to be close to identical to match. But semantic similarity means a similarity of content, e.g. are both images showing similar people doing similar things. Analysing this is hard, and Apple would not have to bother with this if they only cared about visual similarity, minor alterations and the like.
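The threshold idea mentioned above can be made concrete with a toy perceptual hash. This sketch is NOT Apple's NeuralHash (which uses a neural network over image features); it is a simple "difference hash" (dHash), invented here purely to show why visually similar images can still match under a distance threshold while very different images do not:

```python
# Toy perceptual hash (dHash). Each bit records whether a pixel is brighter
# than its right-hand neighbour, so small uniform brightness changes leave
# the hash untouched, unlike a cryptographic hash.

def dhash(pixels):
    """pixels: 2D list of grayscale values (here 8 rows x 9 columns).
    Returns a 64-bit hash as a list of 0/1 bits."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic 8x9 "image", a slightly brightened copy, and an inverted copy.
img = [[(r * 9 + c) * 3 % 256 for c in range(9)] for r in range(8)]
almost_same = [[min(p + 4, 255) for p in row] for row in img]
very_different = [[255 - p for p in row] for row in img]

print(hamming(dhash(img), dhash(almost_same)))     # 0  -> under threshold, "match"
print(hamming(dhash(img), dhash(very_different)))  # 64 -> far apart, no match
```

Whether two hashes count as a match then comes down to where the distance threshold is set, which is exactly the tuning question raised above: a strict threshold only catches near-identical images, a loose one starts catching merely similar content.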
 
No. The hash algorithm in question is designed to be resistant to basic image manipulation. In particular, steganography, which typically doesn't change the appearance of an image, would not have any effect on the image hash.

I imagine the tricks used by people to disguise an image so it doesn't match a reference hash will draw to a close. It probably doesn’t exist yet, but I could imagine a powerful future kind of AI-driven flexible subhashing that analyses key regions of an image, zooming in and out looking for a hash match, and when two or three hits are found in an image it is flagged for review.
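The "subhashing" idea speculated about above can at least be sketched. This is purely hypothetical code, not any shipping system: it hashes every overlapping window of a pixel grid, so a known patch embedded inside a larger image still produces a matching region hash:

```python
# Speculative sketch: hash overlapping regions (crops) of an image so that
# a known picture hidden inside a larger one can still be found.

def region_hashes(pixels, size):
    """Yield (row, col, hash) for every size x size window of a 2D grid."""
    rows, cols = len(pixels), len(pixels[0])
    for r in range(rows - size + 1):
        for c in range(cols - size + 1):
            window = tuple(tuple(row[c:c + size]) for row in pixels[r:r + size])
            yield r, c, hash(window)

# A known 2x2 patch hidden inside a larger 4x4 "image".
patch = [[1, 2], [3, 4]]
image = [[9, 9, 9, 9],
         [9, 1, 2, 9],
         [9, 3, 4, 9],
         [9, 9, 9, 9]]

target = next(h for _, _, h in region_hashes(patch, 2))
hits = [(r, c) for r, c, h in region_hashes(image, 2) if h == target]
print(hits)  # [(1, 1)] -- the patch is found at row 1, column 1
```

A real version would use a perceptual rather than exact hash per window, and the quadratic number of windows is why this is computationally heavier than hashing each file once.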
 
Apple is not asking you to prove your innocence. Apple is required to ensure that nothing illegal is stored on its servers, and it has already been scanning all photos uploaded to iCloud Photos for CSAM since 2019.
Never used iCloud due to Wozniak advising it's a bad idea - he was right all along.

But you missed the harsh point, my friend: it's psychological warfare of the economic kind, using a political hot potato.

I still like and use apple hardware, no question.
 
In what way is "we're going to extend our current searching of your photos for CSAM materials on our iCloud servers to scanning your entire local computer for copyright material" an easier sell than just announcing it directly?
Slow boiling the frog.

They're starting with something that's universally reviled, then they'll add other stuff gradually.

Transphobia in the UK, insufficient reverence for black people in the US, homosexuality in Saudi Arabia, etc.
And at every step of the way, people like you will say:

"Well as long as you're not a pedo / a racist / a homosexual / infringe on copyright / have impure thought / criticize the government you have nothing to fear!"
 