That’s a pathetic excuse, and a short-sighted, mainstream concern. Let’s hope no one ever comes after you for your political beliefs, or for some other perfectly legal activity, just because you uploaded photos to your "private" iCloud and a government or political agenda wanted access.
I don't think it's an excuse, because I acknowledge that this is already happening.

EDIT: i.e. opposition activists and gays are already being spied on, followed, intimidated or even killed.
 
To all the people who think this is Apple spying on you: spying is done to see what you're doing and to extract information.
Since, for 99.99% of people, no information will be seen by Apple or ever leave their devices... how is this "spying"?

It's MUCH more like anti-malware software. There's a set of fingerprints (what you'd know better as virus definitions) sent TO your devices.
How often does a malware scanner misidentify a normal file as a threat? I'd bet almost no one here has ever seen that happen; it's possible a few have, but it's unlikely.
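To make the virus-definitions analogy concrete, here is a minimal sketch of signature matching. The hash set and file names are hypothetical (the one entry is just the well-known SHA-256 of an empty file, used as a stand-in), and real scanners use far richer heuristics than an exact-hash lookup:

```python
import hashlib
from pathlib import Path

# Hypothetical "definitions" set: fingerprints of known-bad files,
# shipped TO the device ahead of time, like antivirus signatures.
# (The empty-file hash below is only a stand-in entry.)
KNOWN_BAD_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_file(path: Path) -> bool:
    """Return True if the file's SHA-256 matches a known signature."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest in KNOWN_BAD_SHA256
```

The property being argued above is that matching happens locally against a fixed list, so nothing about non-matching files needs to leave the machine.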

Apple's devices are already "scanning" your photos. How do you think those name tags appear? How does it know there's a tree in the frame, or that the photo is mostly orange?

This new system simply leverages things your iOS devices already do today, and focuses on one particular aspect of possible abuse.

So, please... tell me... where's the 'spying' you're all screaming about?

Yeah but how often have you heard people say “oh my computer’s not working right at the moment because the malware scanner is running”

Apple has installed a scanner on your devices that will consume hundreds of gigs of data doing billions of checks
 
So long, Apple? What potential alternatives and actions are others here taking?

Long term, looking at solutions like PinePhone and Librem. Neither is adequate today. Opportunity to invest in the future. Build trustless systems by investing your time and money in up and coming solutions. Also working on setting up NextCloud for self-hosted cloud storage. It's crappy. But iOS 15 is DOA.
 
Think it's too late; the mob has already made up its mind. For what it's worth, I think the idea behind the implementation is pretty sound and is being done for the most respectable of reasons, and ultimately, if you don't want it, you don't use iCloud to store your photos. I understand the outrage, though, because of the scope for abuse: today it's just looking for CSAM images, but who knows what it could be used for in the future? But with all this power, surely they have some duty of responsibility to look out for those who are incapable of defending themselves.
The MR mob can be tough. Imo, the average consumer will say this is a good thing.

The downside is Apple possibly viewing legal, intimate photos on that one-in-a-trillion mistake.
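For context on where a "one in a trillion" style figure could come from: Apple's stated design only escalates an account after a threshold number of matches, so the account-level odds can be sketched with a simple binomial model. Both numbers used below (a per-image false-positive rate of one in a million, a 30-match threshold) are illustrative assumptions, not Apple's published figures:

```python
from math import comb

def flag_probability(n_photos: int, per_image_fpr: float, threshold: int) -> float:
    """P(at least `threshold` false matches among n_photos), assuming
    independent per-image false positives (a simple binomial model).
    Computed via the complement so large n stays numerically tractable;
    clamped at 0 to absorb floating-point rounding."""
    p = per_image_fpr
    below = sum(
        comb(n_photos, k) * p**k * (1 - p) ** (n_photos - k)
        for k in range(threshold)
    )
    return max(0.0, 1.0 - below)

# Illustrative only: 20,000 photos, a one-in-a-million per-image error
# rate, and a 30-match threshold give a vanishingly small chance of an
# account ever being falsely escalated for review.
odds = flag_probability(20_000, 1e-6, 30)
```

The point of the threshold design is that even a mediocre per-image error rate compounds into near-zero odds at the account level; the open question in the thread is whether the assumed per-image rate is honest.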
 
The difference is that these companies don’t use privacy, as an excuse and a human right, to sell their products.
I think I was pretty clear when I said I trust them more than others... so yes, in my opinion, they are correct to promote privacy as a benefit over other companies.

If you don't trust them NOW, versus before, when they were using your data to improve their products and the world in general, that's your issue. They aren't doing anything differently now than they were in the past.

iProducts come from the factory defaulted to track your movements "for the greater benefit of the world" when it comes to providing traffic info (whether or not you even have a mapping app on your phone), plus another setting to improve Maps when used.

I've never heard an uproar about that. Why? Because they do it in a way that protects your privacy. So why is the government any more likely to abuse this new process than that one to track your movements?

Apple will scan your photos to determine what type of flower or dog is in the pic. Why wasn't that brought up as an invasion of privacy, or as allowing governments to co-opt the process to scan photos for other things?
 
Yeah but how often have you heard people say “oh my computer’s not working right at the moment because the malware scanner is running”

Apple has installed a scanner on your devices that will consume hundreds of gigs of data doing billions of checks
Citation for the amount of data consumed?
 
Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
Our process is designed to prevent that from happening.
Apple’s answers are vague at best, and some are seriously misleading. Take the quoted question as an example. The honest and correct answer would be “Yes”. Instead, they talk about who provides the CSAM data, which doesn’t answer the question.

In essence, Apple’s system is detecting X by scanning Y and reporting to Z. Currently, X = CSAM, Y = iCloud Photos, Z = Apple. Tomorrow X, Y and Z could be anything.

Thus, this system effectively is a backdoor on your personal device capable of searching anything on it without probable cause, a warrant or any suspicion whatsoever. This is a type of mass surveillance that stretches even beyond the imagination of the worst dictators in history.
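The "detecting X by scanning Y, reporting to Z" point is easy to see in code. Below is a toy perceptual-hash matcher (an "average hash" over a tiny grayscale grid, not Apple's NeuralHash, whose details differ). Note that nothing in the matcher knows or cares what the blocklist represents, which is exactly the generality being criticized:

```python
def ahash(pixels: list[list[int]]) -> int:
    """Toy average-hash: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(pixels, blocklist, max_distance=1):
    """True if the image's hash is near any hash on the blocklist.
    The matcher is content-agnostic: swap in a different blocklist and
    the very same code detects protest photos, memes, or anything else."""
    h = ahash(pixels)
    return any(hamming(h, target) <= max_distance for target in blocklist)
```

Changing X requires changing only the blocklist data, not the code, which is why "our process is designed to prevent that" is a policy promise rather than a technical constraint.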
 
But with all this power surely they have some duty of responsibility to look out for those who are incapable of defending themselves.

I think, regardless (and this won't shift people's opinions about the rollout, hence the word regardless), Apple has a moral imperative to publish the impact this technology has had on erasing scum from society. I fear it won't really have any, for all its good intentions, which will make it all the more puzzling. We just have to trust what's going on behind the curtain rather than getting any pretense of transparency.

I would expect, if it's effective, that we'd see lots of high-profile celebrities, the sort who associated with a certain man whose name rhymes with Reffrey Leftpeen, begin to drop like rocks. Until then, they'll screech every day that this technology is fascistic and beyond the pale for Hollywood's 'untarnished' morality. Some of them even drive profits for Apple, through certain services called Apple TV+ and Apple Music, and are comfortably lodged in the entertainment industry, which represents a fundamental conflict of interest.
 
Why do Apple think their customers might be child molesters, and why do Apple think they're some sort of police unit that needs to scan people's phones? It's insane, an extraordinary abuse of power, and shows how out of touch with reality this company is.
I suppose you're the type that thinks reduced sugar groceries are an awful idea...
 
Apple’s answers are vague at best and some are seriously misleading. Take the quoted question as example. The honest and correct answer would be “Yes”. Instead, they talk about who provides the CSAM data, which doesn’t answer the question.

In essence, Apple’s system is detecting X by scanning Y and reporting to Z. Currently, X = CSAM, Y = iCloud Photos, Z = Apple. Tomorrow X, Y and Z could be anything.

Thus, this system effectively is a backdoor on your personal device capable of searching anything on it without probable cause, a warrant or any suspicion whatsoever. This is a type of mass surveillance that stretches even beyond the imagination of the worst dictators in history.

I understand what you're saying, but isn't it a tad naive to think that any user of a consumer device or service, such as a mass-market mobile phone or cloud service of just about any kind, isn't already haemorrhaging personal data to the parent company? They're all at it, and have been for years. We all know it, and we know it's probably worse than you can even imagine, given how vague and lacking in detail all of these companies are about what they gather from you. Yes, I appreciate this is next-level because it's your personal photo library, but the point still stands: Apple is essentially taking a pasting for being, vaguely, transparent about its intentions.
 
People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.
Do you realize you just put Apple in the same group as three of the most untrustworthy companies in the world when it comes to data? All three are as creepy as can be.
 
The problem is not that they're doing this for CSAM images; that's well intentioned. The problem is the backdoor they've opened.

Imagine this scenario. The FBI is trying to find someone very specific. They contact Apple, but Apple holds its ground and gives a firm "No". Fine, right? Do you think the FBI will stop there? Of course not: they'll go to the NCMEC and have them upload a set of images. Chances are they won't even bother with Apple; they'll go directly to the NCMEC and start finding people of interest.

Now replace the FBI with another country's government. Say, China, and the pictures are sets of pictures of the Hong Kong protestors.

Now replace China with a bunch of hackers who hack the NCMEC.

Lots of things can go wrong here.

Apple did put their foot down and refused the FBI's request to unlock the iPhone of the San Bernardino mass murderer...

But that was a few years ago.

Do you think their stance has changed now?
 
Apple did put their foot down and refused the FBI's request to unlock the iPhone of the San Bernardino mass murderer...

But that was a few years ago.

Do you think their stance has changed now?

Apple isn't some immutable divine being; it's an organization made up of individuals. So yes, the possibility that their stance has changed is, at a minimum, an entirely plausible scenario.


And some might argue it was wrong not to comply over San Bernardino, and that it's also wrong to implement this new thing now: that was an isolated occurrence, to get to the bottom of something from someone who clearly caused harm to many others, rather than carte blanche access to everyone's everything.

Pegasus proved any moral grandstanding was futile anyway. And Apple has been mostly radio silent on that one, which still puzzles me, but it has proven an effective strategy for letting it 'go away': they didn't have to answer for it, and managed to skirt past massive blowback.
--

If they wish to seek real change for youths who are harmed, without riling up a large percentage of their user base (who aren't nefarious, just concerned about the 'slippery slope' precedent this sets), a place to start, and to add credibility, would be their supply chain. It's got issues.

Until then, the jury is still out for me. It's a complicated one, but this does seem like a bad year to implement it, after the Epic Games lawsuit and Pegasus-gate. Way to really go across the field for the touchdown after a fumble and an overturn.
iPhone unlucky #13 is here indeed, along with iOS 15, one of the more lacking-looking releases.
 
And yet people trust third party anti-virus/anti-malware and firewalls to do the right thing (you give them permission to access your files and disks). Similarly for your ISP which may be doing a variety of things (DPI, traffic shaping, monitoring) without you ever knowing about it.
Yes. Consent is everything. I have installed A/V on many machines over the years. More importantly I have uninstalled and replaced A/V that I no longer trusted.
 
The solution is to further increase the marketing budget and the lip service.

 
Google has been doing this with Gmail since 2014.


No one bats an eye at that.

But when Apple does it, NOW everyone gets upset

Why is this?
We all knew Google was untrustworthy, but if you advertise privacy and then don’t provide privacy, that’s when people speak out. Google is creepy and now Apple just became as creepy.
 
Extremely bad, Apple. You know, and we know, that the motive for implementing this tool is not to 'save the children'. Some may be naive enough to believe you, and there are those who know better but are too afraid to speak out in case they get accused of not caring about these children. Nice plan, by your reckoning, but your credibility, sincerity and trust just dropped to zero.
 
Simple question: if I take a photo of my son in the tub and you can see his thingy, will that image get 1) reviewed by an Apple employee, and/or 2) reported to the police?
 
Why do Apple think their customers might be child molesters, and why do Apple think they're some sort of police unit that needs to scan people's phones? It's insane, an extraordinary abuse of power, and shows how out of touch with reality this company is.

Well, Apple certainly doesn't like you taking screenshots of screen recordings of iTunes/Apple TV movies and TV shows...
 
Seems like Apple Campus is on fire!
It's very suspicious to release an FAQ at approximately 2-3 am (Cupertino time).
They must have been writing this fairy tale over the weekend.

What a crap marketing team!

Quarterly-profit-chasing Tim Cook and Apple's crisis-management team are worried that the share price will tank when trading opens today.


This pathetic damage-control FAQ from the clueless crisis department is all because Apple massively underestimated the backlash from people who didn't expect Apple to curb-stomp their privacy with this BS "protect the children" Trojan horse. Sadly, we accept it as the norm for Facebook, Alphabet and Amazon to plunder our privacy and data-mine every byte of our existence, but not Apple.

My main concern, after the complete destruction of privacy, is that this system isn't foolproof. One example: go to Microsoft's Bing and search for "Kinki Kids", a multi-platinum-selling musical duo from Japan's Kinki (近畿地方) region in south-central Japan. Bing's search AI assumes the user misspelt "kinki" as "kinky", treats it as a search for "kinky kids", and shows a child-abuse warning under the search bar. If the 'geniuses' at Microsoft, who pioneered the AI behind CSAM detection, cannot deploy it on their own platform without obvious, dumb false positives, I guarantee that scanning trillions of iCloud photos will flag innocent pictures as abuse.

I cannot wait for the false-positive news story (that "1 in a trillion" is a made-up, unaudited number meant to make the system look perfect) of some bubba getting a no-knock armed raid from the FBI at 3 am because a Skynet AI false-flagged his collection of pictures of his petite Filipina wife.

Also, what is stopping power-hungry authoritarians like the CCP from abusing this to flag and arrest people with pictures of Winnie-the-Pooh Xi memes, pro-democracy literature, Tank Man, pro-Taiwan or Hong Kong independence material, Uyghurs in camps, etc., all under the pretence of protecting national security? "Protecting the children" is a perfect catch-all Trojan horse (if you're against it, you must be a sick monster with something to hide, right?) for seeding 'extra' functionality at the flick of a switch.

This is a complete PR disaster for Apple, which is destroying its customers' trust and its brand value as a "privacy-focused" company.
 
