You had an Apple I?
The guys "involvement with Apple predates his by TWENTY TWO years". He may still have owned an Apple I, but...

I reviewed one of the first Apple Macs (box with cathode ray tube and floppy) for a local tech magazine around 1987. And it was returned in due course and in fully working order, as is - and especially was - the custom and requirement for any serious tech journalist.

Now, let's see if you can calculate how long ago this was...
 
FFS, all Apple had to do was release something like this: https://blog.cloudflare.com/the-csam-scanning-tool/
That is basically (if not exactly) what they are introducing to run on the local device. Even if it weren't run locally, cloud storage companies use hashes to detect corrupted uploads (comparing the initial, local hash to the uploaded file's cloud hash).
Once again, Apple's biggest issue is a lack of an appropriate level of transparency.
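
For anyone curious what that kind of integrity check looks like in practice, here's a minimal sketch (my own illustration, not any particular provider's API): hash the file before upload and compare it with the digest the storage service reports for the stored copy.

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a local file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def upload_is_intact(local_path: str, remote_reported_digest: str) -> bool:
    """Compare the pre-upload local hash with the digest the provider reports
    for the stored copy; a mismatch means the upload was corrupted."""
    return sha256_of_file(local_path) == remote_reported_digest.lower()
```

That's a different thing from the perceptual matching Apple describes, but it shows how routine hashing already is in cloud storage.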
 
I’m so happy to see so many of us pushing back on this! Most seemed already aware, but for some, this was a wake-up call.

Don’t stop there! This isn’t just about catching child predators. You’ll soon be rightfully suspicious of many of the big lies, contortions, and omissions of the last 70 years! And hopefully more enlightenment will be gained about the big lies and manipulations of the past 4-5 years.

Don’t accept the BS!🤠
 
Your privacy is absolutely affected.

Apple is playing the "but the children" card to open this door because it knows it can frame anyone who raises legitimate concerns as pedophiles or privacy nuts. This was absolutely a strategic choice; Apple needed a way to sell a terrible, E2E encryption undermining technology and this is it.

The issue, as many have pointed out, is that there's no protection against Apple expanding its hash database when forced to do so by governments. Apple even implicitly acknowledges this, noting that it will resist such requests without stating that it will refuse them... because it can't refuse them should a government demand it do so.

The reality is likely this: Apple has been facing pressure from law enforcement and governments to implement a backdoor. To accomplish this without infuriating people, it created this tech and is attempting to sell it as a positive.

It is not.

Examples? Sure.

  • Someone sharing a document critical of the president of Oppressistan? Add the hash.
  • Photos taken at a protest, shared between attendees who were masked up to remain anonymous? Add the hash.
  • Want to target a political dissident but you can't find her? Add a hash of her Facebook profile photo and identify phones that have that photo saved. Congrats, the search has been narrowed (and you get to scoop up people reading her posts too!)

Maybe you trust Apple not to cave to this. I'm not so optimistic.
 
It's interesting. Apple has been scanning your photos for a long time to identify faces, to create "Memories" and whatever else, and nobody has complained about that. Now that they want to start preventing sharing child porn, everyone is on the barricades. Are you all pedophiles?

I've no problem with them scanning (using software) my library to see that it's not full of nude pics of children.

Not to mention everyone else is already doing that to everything you put into the cloud. As they should. That should only be scary to those who don't understand how it's done.
 
I have had a very long and happy association with Apple, and I do hope Apple listen to the concerns and step back from the abyss!
While I support this, for me it doesn’t change much. Because if they are forced to abstain from doing the evil rather than seeing that it’s wrong, it’s likely they’ll just introduce it at a later time. Or just clandestinely.

They cannot be trusted any more, whatever their decision.
 


Another example is that you can make an entire innocent group look bad by association after a bad actor is caught doing something illegal or bad. Let's say some agitators burn down a racist urban police station. Those agitators have some silly anti-cop, anti-Nazi memes on their phones.

Now you scoop up college professors, college students, housewives, teachers, artists, and working-class people who also have these anti-Nazi memes on their phones.
 
All it would take would be a government like China telling Apple to either do what it wants or be banned from the Chinese market. Bet you $1M they will comply with whatever the Chinese government wants.
 
Nice paranoid fantasy.
In reality they have on multiple occasions refused requests by the FBI and other agencies to do things like that and taken a lot of heat for it. But sure, Apple has been secretly doing the government's bidding this whole time, and decided to publicly announce this particular feature because?
If Apple wanted to secretly spy on people THEY WOULDN'T HAVE ANNOUNCED IT.
They would not be able to hide it forever even if they had wanted to. But they did something almost as suspicious by announcing it right before they release it, instead of taking at least one year to ask for consumer and expert feedback, inviting reviews from security researchers, etc. They did none of that openly. Now we are making all this noise but everything is already all set and decided and about to be released.
 
  • Like
Reactions: DanTSX
This is spyware. It’s unethical software that’s incompatible with a free society. It will be abused by the worst governments. Consider that after 9/11, the Patriot Act could have mandated client-side cryptographic hash matching of terrorist names, emails, and phone numbers. Why are you defending the terrorists?
 
They would not be able to hide it forever even if they had wanted to. But they did something almost as suspicious by announcing it right before they release it, instead of taking at least one year to ask for consumer and expert feedback, inviting reviews from security researchers, etc. They did none of that openly. Now we are making all this noise but everything is already all set and decided and about to be released.

I think that there may be a possibility that Apple is being pressured into this by the US intel apparatus and/or China. And this is a clever move by Apple to obtain an alibi to unwind their work to date on this privacy intrusion, by deliberately leaking it and causing outrage…
 
Yikes. Reading the Technical Summary yields masterpieces of double-talk such as:


"Nearly identical" doesn't mean "identical". If an image that is only nearly identical can generate the same number, then the number isn't "a unique number specific to that image". If you've encountered hashes as a way of verifying the authenticity of downloads or while reading about blockchain, that's not what is happening here. OK, they're talking about images that differ in size and quality so maybe you could call that "nearly identical" but that "nearly" makes a huge difference in the likelihood of a false match.


OK, so let's just trust that Apple have read about Sally Clark and understand the difference between independent events (tossing a fair coin) and possibly correlated events (e.g. if one of your photos triggers a false match, how likely is it that there will be other "nearly identical" photos in your collection?) and haven't just multiplied the probability of a hash collision by the number of matches (... which would work but for that pesky "nearly").
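
A quick toy simulation (made-up numbers, nothing to do with Apple's actual thresholds) shows why the coin-toss assumption matters: if false matches were independent, needing several of them before flagging an account would make a false flag vanishingly rare, but if a library contains bursts of near-duplicates that succeed or fail together, one unlucky scene can blow past the threshold on its own.

```python
import random

random.seed(0)
p = 0.005        # hypothetical per-image false-match probability (inflated for the toy)
threshold = 5    # hypothetical number of matches needed before an account is flagged
trials = 10_000

def flagged_independent(n_photos=200):
    # Coin-toss model: every photo false-matches independently of the others.
    return sum(random.random() < p for _ in range(n_photos)) >= threshold

def flagged_bursts(n_scenes=10, burst=20):
    # Correlated model: photos come in bursts of near-duplicates, so if one
    # shot of a scene false-matches, its near-duplicates match along with it.
    return sum(burst for _ in range(n_scenes) if random.random() < p) >= threshold

print("independent:", sum(flagged_independent() for _ in range(trials)) / trials)
print("correlated: ", sum(flagged_bursts() for _ in range(trials)) / trials)
```

Both models see the same 200 photos and the same per-image probability; only the correlation changes, and the flag rate in the burst model comes out roughly an order of magnitude higher.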


Which is not the same as "reviews each report to confirm that the match really has found CSAM and, if so, disables the account and sends a report to NCMEC". If that's what they mean, why not say it clearly?




...so ignore the technicalities (which aren't technical enough to recreate and critique the process) and focus on how terms like "number unique to the image" or "identical" have gradually morphed via "nearly identical" and "perceptually and semantically similar" into "visually similar"... and that we're suddenly talking about analysing the features of the image (which is precisely what some people here are saying isn't happening "because hash").

Then we follow up with the truly impressive and reassuring demonstration that a colour picture of a palm tree generates the same hash as exactly the same image converted to monochrome but a completely different cityscape (with nary a palm tree in sight) generates a different hash. Wow. Anybody reading this critically would be asking "what about a different picture containing palm trees, or maybe a similarly composed picture of a cypress tree? How about some examples of cropped/resized images which couldn't be spotted by simply turning the image to B&W before hashing?" Maybe the system can cope with that - if so, why not show it rather than a trivial Sesame Street "one of these three things is not the same" example?

I'm not questioning whether the technology makes a good effort at a very difficult task (matching images without being fooled by inconsequential changes), but the summary reeks of "positive spin" and avoiding the difficult questions: for any technology like this the #1 question has to be "what are the dangers of a false match", and is the risk justified by the rate of successful matches?

...and will people please, please stop saying "it's not scanning your images, it's only checking hashes" - that's a distinction without a difference even before you replace "hashes" with Apple(R) NeuralHashes(TM).
Again, read Cloudflare's description of "fuzzy hashes" at https://blog.cloudflare.com/the-csam-scanning-tool/
And yes, there is a huge difference between hashes and file comparisons. Hashing is a one-way function from which the original input cannot be reconstituted. This *is* just hashing, but in a way that mitigates easy circumvention and has been done and implemented by others without all the negative hyperbole.
Even if I fully concede that this is completely different from traditional hashes, it is roughly equivalent to other forms of physical fingerprinting. Just because you don't understand the technology behind the "fuzzy hashing" doesn't mean there is not a scientific method behind it.
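
For anyone who wants to see what a "fuzzy" or perceptual hash actually does, here's a toy average-hash in Python with Pillow (my own sketch; Apple's NeuralHash is a far more sophisticated, learned version of the same idea). It shrinks an image to an 8x8 grayscale grid and records which cells are brighter than average, so a grayscale conversion barely moves the hash while unrelated content lands far away; the interesting middle ground is exactly the cropped/shifted case the post above asks about.

```python
from PIL import Image, ImageDraw

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Toy perceptual hash: shrink to an 8x8 grayscale grid, then set one bit
    per cell depending on whether it is brighter than the mean."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Build a synthetic "photo": bright sky, dark canopy, brown trunk.
photo = Image.new("RGB", (256, 256), (90, 160, 220))
draw = ImageDraw.Draw(photo)
draw.rectangle([120, 120, 140, 230], fill=(110, 70, 30))
draw.ellipse([70, 40, 190, 150], fill=(30, 110, 40))

mono    = photo.convert("L").convert("RGB")           # same image, monochrome
cropped = photo.crop((20, 20, 256, 256))              # same scene, cropped/shifted
other   = Image.new("RGB", (256, 256), (40, 40, 60))  # unrelated flat image

h = average_hash(photo)
print("monochrome:", hamming(h, average_hash(mono)))     # ~0 bits differ
print("cropped:   ", hamming(h, average_hash(cropped)))  # a few bits differ
print("unrelated: ", hamming(h, average_hash(other)))    # many bits differ
```

Even this toy makes the trade-off visible: the closer two images look, the closer their hashes, which is what makes "nearly identical" matching possible and also what makes false matches a statistics question rather than an impossibility.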
 
Ok, so even though you've established yourself as a troll, I'll bite. What part am I not understanding? (And before you make a comment about me being computer illiterate or whatnot, I'm a systems administrator who has been on the Internet since 1986)
Do you really want me to explain how it is impossible for your personal photos (even if they were child pornography) to be part of the database of hashed images Apple is putting on your phone??

Okay... the hashed images, as clearly stated in every document Apple has shared (but which you have failed to either read or understand), only come from the CSAM database.

How does an image get into the CSAM database? Well, you could look that up for yourself and spend about 3 minutes reading...but okay...

It has been around for more than 2 decades and collects images from KNOWN child pornography sources, including those self-reported and investigated by the organization (imagine some poor teenager who has inappropriate images of themselves spread online... happens every day). Simply sharing your naked baby pics or personal sex vids/pics online is not enough for them to be purposefully or accidentally added to the database, much less included in what Apple is checking against.

As I mention above, EVEN IF you are creating child pornography and sharing it online with your friends, those images, just by being online, are not added to the database. They must either be reported by someone who has received them, so the individual(s) can be investigated and prosecuted, or be found by law enforcement through normal investigation.

But again, this is all clearly laid out in the 5-minute FAQ read, or, if you want more details on how the hashed images are compared, Apple has that online as well in a series of white papers on the technology.
 
Expected this reply.

I’ll give you a hand digging that hole for your head first.

You couldn’t be further from the truth. Imagine that.

Big tech and US agencies have been forced to admit more nefarious data collection within the past 6 months. Have you become so accustomed to intrusion that you're indifferent to it? If so, don't insult people who still give a **** about their identity.
Link to credible story on either big tech or US Agencies admitting to this?
 
It's interesting. Apple has been scanning your photos for a long time to identify faces, to create "Memories" and whatever else, and nobody has complained about that. Now that they want to start preventing sharing child porn, everyone is on the barricades. Are you all pedophiles?

I've no problem with them scanning (using software) my library to see that it's not full of nude pics of children.

Not to mention everyone else is already doing that to everything you put into the cloud. As they should. That should only be scary to those who don't understand how it's done.
There's a difference between scanning and saving locally for your personal use and scanning and uploading for the use of a third party... especially as there's no way to check that they're only scanning for exploitative images.

As far as iCloud goes? Sure, their right to scan it is part of the TOS and you agree to that when you open an account. Don't want someone pawing through your files? Then use a proper, E2E-encrypted solution.

Problem is that Apple sold the iPhone as a privacy-forward device and is now saying "LOL, except for this gaping hole we're poking in that security framework."
 
Don't get fooled like so many on these BBs. It has nothing to do with protecting children. It does not and will not. If anything it will put more children at risk, simply because those who engage in such disgusting 'pursuits' will merely go underground (the dark web via Tor, VPNs, or heavily encrypted material), and that will make the job of the agencies tasked with charging these people much more difficult.

Apple do NOT have access to all hashes of abused children; they are merely using ONE NCMEC database that contains data on missing and exploited children, and that is only the tip of the iceberg.

NCMEC do great work, no argument, and their website has enough information to allow ANYONE to send in their concerns or pictures; I urge anyone with concerns to go to that site and send them data directly.

However, their database is literally a drop in the ocean, and police and authorities' data in both the UK and the USA are not given to companies, not even Apple, so the idea that this is to help safeguard children is simply flawed. NCMEC try, and good luck to them, but many government agencies will now be impeded if Apple go ahead with this, because the culprits will go underground, making children LESS SAFE. At that point it's best to communicate directly with either the authorities or NCMEC, as their website suggests.

But even with all of that, this is NOT about safeguarding children, it is SURVEILLANCE in the name of safeguarding children.

I was surprised yesterday to find an article suggesting that the software to do this will be embedded in iOS and in macOS for all Mac devices, and that it's technically incorrect to say "just switch off iCloud Photos", because the system software checks such pictures before they are even transmitted to iCloud.

That means everyone's hardware that they have paid for will be used, processing power reduced, and even the cost of electricity borne by the USER.

Once this happens, whatever a company says about the system 'being designed not to do this or that' becomes irrelevant, and Apple, having so publicly aired its stance on privacy and against surveillance, can no longer be trusted, because surveillance IS what is going to happen.

Can you imagine companies with multiple Macs, whose livelihood depends on privacy, now trusting Apple kit within their organisation? I won't be able to. No publishing house would be able to, because a precedent would have been set: the hardware you bought from Apple in good faith will now be compromised by surveillance, in this case surveillance in the name of stopping child abuse, which IT DOES NOT PROTECT AGAINST but instead makes it harder for the authorities to pursue those responsible. Whatever way Apple try to dress this up, IT IS SURVEILLANCE and pernicious, and it's downhill from then on.

Could you imagine those who think it's a good idea, because it's stated to be about safeguarding children, being in favour if it was sold as SURVEILLANCE OF YOUR SYSTEMS? Of course not, which is why this idea is so pernicious.

This whole "will go underground" argument has been used so many times. I'm not sure it works. Its already underground! So underground that no one can solve this problem right now. The people who do it are using encryption tools, methodology and services so powerful its been impossible to stop for a long time now.

I think maybe Apple's stance is that it may exist in other spaces, but we'd rather at least scare anyone thinking of doing it on their device, since we will flag it. In practice it's not much different to a virus scanner running on your machine. Does anyone object to that?

Also, if you send any device/computer in for repair, even the repair companies are legally obliged to tell the police if they find any dodgy images. They do it all the time.
So I'm wondering what morality/privacy rules people think Apple are overstepping, when so many of those boundaries are imaginary or already broken.

I just think there is a naivety that people have about modern computer systems. You don't "own" them, because you use a license to use any of the software they provide. And when you are connected in any way via a network (i.e. the internet), it's not private either. Those are two fundamentals that people need to understand.

It's like worrying about the rain in an outdoor swimming pool...
 
Considering Apple has been advertising that privacy and security are better on iPhones in recent years, and is now pulling this stuff, I wouldn't support them. The intentions are clear too: they can extend the feature to scan for more and more photos, whatever a government asks them for.
 
I’ve been following this news and thinking about it extensively since it broke. I’ve been an Apple user for decades, though I’ve used PCs and Android phones in a work capacity on and off during that time. This year is a big replacement year for me: I was planning on changing out for an iPhone 13 Pro, iPad Mini 6, and MacBook Pro when they are released this fall. The slippery slope of on-device scanning is a step too far for me, though, and creeps me out. I’ve been seriously tempted by Sony’s new Xperia 1 III and by the VAIO Z. I’ll be taking a hard look at both if Apple doesn’t reverse course…
 
"Human review"

A while back I found a way/vulnerability/bad UI design that allowed another user on a Mac to access files on an external drive that was set up to mount only with a password... without knowing the other user's password. I was mind-blown at how easy it was when I accidentally stumbled upon it. I'm no computer pro, but I know my way around a Mac and worked at the Apple Store around 2010.

I went to the security/bug bounty page and did all the contact stuff, made videos, was in talks with someone from Apple Security. The whole deal lasted several weeks, lots of emails and video uploads...

And then, when I had finished helping them with everything I knew to show, I stopped getting replies and didn't get a single dime or dollar, though the category fit their 'up to $250,000' level on the bounty page.

Hopefully these "human reviews" will go better than my interaction with Apple Security.
 
I think maybe Apple's stance is that it may exist in other spaces, but we'd rather at least scare anyone thinking of doing it on their device, since we will flag it. In practice it's not much different to a virus scanner running on your machine. Does anyone object to that?
Apple doesn't care if anyone has illegal content stored on their devices. Apple cares if anyone intends to upload that content to its servers, because if it does not do anything about it, it gets into trouble with the law.
 
I wonder what other spying Apple is capable of engaging in that they aren’t acknowledging because it isn’t repugnant child porn? I wonder what the NSA’s, the FBI’s, the CIA’s or any of the 17 government intelligence agencies’ involvement is? How would anyone even know? It’s not as if FISA courts don’t allow spying on Americans; all they need is a “good reason” and the reason doesn’t even have to be true.

This “slippery slope” issue at the heart of this applies to all data mining, not just kiddie porn. It can be used for an endless number of good and bad reasons. Centralized data repositories, which are exactly what cloud storage servers are, facilitate the mining process but aren’t the only category of devices that can be mined. Any and all networked storage devices can be mined for data or clandestinely spied on. It can be used against someone, or just used to silently silence their voice so no one hears them when they speak.
 