What a load of nonsense.

You do realise that this technology is being deployed in the way it is because the actual illegal material is the images themselves?

These images must be removed from circulation.

And considering that the false positive rate is one in a trillion, and there will be one trillion photos uploaded to iCloud this year, the fearmongering nonsense is ludicrous.
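Taking the numbers quoted here at face value, a quick back-of-the-envelope check shows what they imply. (Note: Apple's published claim was actually one in a trillion per account per year, not per photo; the figures below are the poster's, used only for illustration.)

```python
# Back-of-the-envelope check using the figures quoted in the post above.
false_positive_rate = 1e-12   # "one in a trillion", treated here as per-photo
photos_per_year = 1e12        # "one trillion photos uploaded to iCloud this year"

# Expected number of false matches across a year of uploads:
expected_false_positives = false_positive_rate * photos_per_year
print(expected_false_positives)  # 1.0 expected false match per year
```

Even under this pessimistic per-photo reading, the expected number of false matches across the entire service in a year is about one.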
 
People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.
Wonder why you would get downvoted? I have been reading on the Apple-centric sites about people saying they are going to dump iCloud and go to Google, Microsoft, or some other cloud service. Good luck with that, because as you clearly point out, plenty of other companies do this too; but since it is Apple, let's hop on the bash-Apple train.
 
Child advocacy groups always use the line 'think of the children' when they want to push/force their agenda onto whoever they can. Adults are being expected to give up their privacy rights for the rights of children.

If Apple ignored CSAM, they would be accused of ignoring the abuse of children, but as we can see, if they act against CSAM, they are accused of interfering with the privacy of adults.

A very good example of how the 'think of the children' agenda gets pushed through is illegal immigration. The US and the UK have been in the media for trying to tackle illegal immigration, and what did we see? A photographer took photos of a dead migrant child washed up on a US shore. The same happened in the UK: a dead migrant child washed up on a UK beach, a photographer took numerous photographs, and boom, both incidents made headlines around the world and forced the UK and the US to soften their policies on illegal immigration, with the result being more and more adults, not children, exploiting the system.

People need to learn that when something affects both adults and children, the welfare and safety of children is always put first and adults have to put up with it. Technology is no different, as we are seeing ALL tech companies get hit with the same line from child advocacy groups.
 
I find it really scary how many people on this forum defend Apple for this technology. I fear the day when governments will dictate to Apple what images to search for to make it easier for them to track down certain people (like people in China who want a democracy or homosexual people who enter places in Poland's LGBT-free zones).

Yes, other companies have been doing this in one way or another for years, but until recently Apple stood as a symbol of privacy for its customers, which I was willing to pay a premium for. Over the weekend I migrated all my data from iCloud to my private server and will never use iCloud again.

The trust I once had in Apple has been irrevocably destroyed.
I find it scary how many people on this forum are losing their minds over a reasonable policy.

Apple is the proprietor of iCloud. They set the rules. If you don't like it, buy a Windows phone on eBay. They're cheap.
 
Wonder why you would get downvoted? I have been reading on the Apple-centric sites about people saying they are going to dump iCloud and go to Google, Microsoft, or some other cloud service. Good luck with that, because as you clearly point out, plenty of other companies do this too; but since it is Apple, let's hop on the bash-Apple train.

What's interesting is that the majority of those who oppose this back it up with some "what if" scenario rather than the actual proposed functionality. It's very tinfoil hat.

If Apple implemented half the stuff people believe on this forum, it would end Apple.
 
Wonder why you would get downvoted? I have been reading on the Apple-centric sites about people saying they are going to dump iCloud and go to Google, Microsoft, or some other cloud service. Good luck with that, because as you clearly point out, plenty of other companies do this too; but since it is Apple, let's hop on the bash-Apple train.

There’s the downvote squad at full throttle in threads about this.
They will downvote technical explanations, math, probability, cryptography, examples of other companies doing the same, etc.
This drama is a wet dream for Apple bashers and sockpuppet accounts from other companies.
 
I find it really scary how many people on this forum defend Apple for this technology. I fear the day when governments will dictate to Apple what images to search for to make it easier for them to track down certain people (like people in China who want a democracy or homosexual people who enter places in Poland's LGBT-free zones).

Yes, other companies have been doing this in one way or another for years, but Apple until recently stood as a sign of privacy for their customers, which I was willing to pay a premium for. Over the weekend I migrated all my data from iCloud to my private server and will never use iCloud again.

The trust I once had in Apple has been irrevocably destroyed.

All companies check for CSAM of stored contents on their servers.
Apple does exactly the same at the moment.
Apple just moved the process from the server to the device. This is more privacy focused and paves the way for fully-encrypted photo libraries while still complying with child safety laws.
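The on-device matching flow described above can be sketched loosely in Python. This is a toy illustration only, not Apple's actual NeuralHash/private-set-intersection design: the `photo_hash` function, the hash list, and the threshold value here are all placeholders.

```python
import hashlib

def photo_hash(photo_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real system uses a hash that
    tolerates resizing/re-encoding; SHA-256 is used here only so the
    sketch is runnable."""
    return hashlib.sha256(photo_bytes).hexdigest()

def review_needed(photos, bad_hashes, threshold):
    """Count photos whose hash appears on the known-bad list. Nothing is
    surfaced for human review until the match count crosses the
    threshold -- the key property described in the post above."""
    hits = sum(1 for p in photos if photo_hash(p) in bad_hashes)
    return hits >= threshold

# Toy usage: two "bad" images on the list, a placeholder threshold of 2.
bad = {photo_hash(b"bad-image-1"), photo_hash(b"bad-image-2")}
library = [b"cat", b"dog", b"bad-image-1"]
print(review_needed(library, bad, threshold=2))  # False: only one hit
```

In the real design the device never learns which (if any) photos matched; that part requires the cryptographic protocol this sketch deliberately omits.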

People just go with the hype and don't even bother to read the documentation of how this feature works 🤦‍♂️
 
How long before the USA and/or China demand that Apple start scanning for content other than CSAM on all iPhones (not just the ones with iCloud Photo Library enabled), or it would not be allowed to sell devices in those markets? These two countries represent two-thirds of Apple's revenue and therefore have a lot of leverage over the company.

How do you know they're not already? The problem with all this is IF and how. The IF belongs to the category of trying to predict the future. You might as well buy a lottery ticket. Yes, it COULD happen, but then again, maybe it won't. Then there is the how. Would it be implemented stealthily or would it be announced? If it was on behalf of a foreign government, does Apple have to declare that to the US government, for example?

Loads of ifs - may never happen. But it could. But then it might not. But it could. etc.
 
These images must be removed from circulation.

And considering that the false positive rate is one in a trillion, and there will be one trillion photos uploaded to iCloud this year, the fearmongering nonsense is ludicrous.
I cannot upvote this enough.

If this paranoia had any logic, people wouldn't leave their home for fear of having police watching them... For the speed of their car to be monitored by cameras... For towns, cities and stores to have CCTV... It's laughable.
 
Why put out the FAQs now instead of last week, or during WWDC? Apple can claim that its process is privacy-proof, but when there's a human involved, whether inside or outside Apple, the system can be compromised, as humans can be coerced.

Thus the key is not to have such a potential backdoor to begin with. That was Apple's own argument during the FBI request: the FBI asked for a specific backdoor, just as Apple is now putting in specific hash scanning.

The cat is out of the bag.
 
Step 1. Apple does something new, but in a different way than the rest of the industry.

Step 2. The internet cries that it's evil and that people will vote with their wallets and leave the ecosystem, completely ignoring that Apple does it in a better, more privacy-oriented way.

Step 3. Apple explains a bit better what it did.

Step 4. Everybody shuts up and the industry follows Apple’s lead.

Step 5. Industry and/or governments then try to sue Apple for monopolistic behaviour.
 
BWAHAHA what a joke!

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands.


Just like they refused to cooperate with China on mass surveillance?! LOL

LOL Apple, a FAQ won't rebuild the trust...
The privacy train has just departed.
They are just a bunch of liars!
 
How do you know they're not already? The problem with all this is IF and how. The IF belongs to the category of trying to predict the future. You might as well buy a lottery ticket. Yes, it COULD happen, but then again, maybe it won't. Then there is the how. Would it be implemented stealthily or would it be announced? If it was on behalf of a foreign government, does Apple have to declare that to the US government, for example?

Loads of ifs - may never happen. But it could. But then it might not. But it could. etc.

You actually have a WAY better chance of winning the lottery than of this accidentally flagging enough pics that an Apple employee would even have to review them (yes, it has to be more than one pic).
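The lottery comparison can be sketched numerically. Everything here is an illustrative assumption: a per-photo false match probability of one in a trillion (the figure quoted earlier in the thread), a hypothetical 10,000-photo library, and independence between photos.

```python
from math import comb

p = 1e-12   # assumed per-photo false match probability (illustrative)
n = 10_000  # hypothetical library size (illustrative)

# For tiny p, the probability of at least 2 independent false matches
# is well approximated by the two-match term of the binomial expansion:
p_two_or_more = comb(n, 2) * p**2
print(f"{p_two_or_more:.1e}")  # ~5.0e-17
```

For comparison, a typical big-jackpot lottery is on the order of one in a few hundred million (~3e-9), so under these assumptions two accidental matches in one library is roughly eight orders of magnitude less likely than a jackpot win.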
 
There’s the downvote squad at full throttle in threads about this.
They will downvote technical explanations, math, probability, cryptography, examples of other companies doing the same, etc.
This drama is a wet dream for Apple bashers and sockpuppet accounts from other companies.
It's not about 1 in a trillion. It's about principles. The basic fact is that Apple will be scanning your images (only if you use iCloud, they say), but then why do it on my phone? Please use your technology on the iCloud servers; you can decrypt all backups and photos anyway. Stay away from my device!
 
BWAHAHA what a joke!

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands.


Just like they refused to cooperate with China on mass surveillance?! LOL

LOL Apple, a FAQ won't rebuild the trust...
The privacy train has just departed.
They are just a bunch of liars!

So, when the US government asked them to help unlock the iPhone of a KNOWN terrorist/murderer and they refused? What was that?
 
To recap, Apple is the LAST big company to do this, not the first. (they reported very few CSAM incidents up until last year)

They’re doing it in a way that is more privacy-minded: mostly on-device, with nothing revealed for people not owning CP, and in a way that makes it cryptographically impossible for Apple and its employees to look at the supposedly offending photos if you don’t go over a certain THRESHOLD.

And what are people complaining about?

1) slippery slopes for the slippery slope’s sake (which could be said about any server-side search other companies do routinely)

2) “but they said they were champions of privacy” (no contradiction here; they’re complying with CSAM laws in the most privacy-minded manner they could come up with, which cost millions in R&D)

3) various menacing buzzwords (backdoor, spyware, etc.) and semantics to the point of EMPTYING words of any meaning. Hey dumb*sses, maybe stop with the semantic virtue signalling and let’s hold these companies ACCOUNTABLE for what they ACTUALLY do wrong.


Nothingburger and drama.

Not to mention the grossness of Facebook/WhatsApp and Epic chiming in. Wow, stay classy.
 
So as I said in the other thread regarding "slippery slope", this isn't some nefarious ploy and Apple has no intention of doing anything other than what they have stated.

They are fully aware that if they step out of line with something like this the backlash would be huge.
Not only that: because it is Apple, it brings attention to the rising concerns around such technologies. Other entities are obviously employing some of the same technologies without sending out a press release or offering the slightest transparency.
It’s great that Apple stuck its neck out and started this conversation. It deserves all the scrutiny it is getting.
 
So as I said in the other thread regarding "slippery slope", this isn't some nefarious ploy and Apple has no intention of doing anything other than what they have stated.

They are fully aware that if they step out of line with something like this the backlash would be huge.
And what is the consequence to Apple if this is abused? Was it not long ago that we heard Apple contractors were hearing people's confidential details via Siri:

https://www.theguardian.com/technol...-hear-confidential-details-on-siri-recordings


So, if they make it known that people can sue if abuse is found to have occurred, that is a good step. For an organisation whose products so many people use, just saying you won't do something is not enough; stating the consequences when abuse occurs would let me know Apple takes it seriously. Anyway, time will tell, but Apple's privacy is not quite what they tout, although they are still better than others in my view. We customers have to hold them to account, and getting clarity and answering people's concerns is a good step. Their goal is definitely VERY GOOD, but they themselves often say one should not sacrifice privacy even for this. So my only issue is that the system can be abused. If they can demonstrate in time that it won't be, then I am very fine with it.
 