My understanding is that SCANNING already occurs in your iDevice library. Or else, how would the AI know how to group all the photos with, say, a bridge, cars, waterfalls, boats, etc.?
The software scans your cars and hills, but doesn't call home to Apple.

Now, if the software notices a unique identifier (or enough unique identifiers) of known child porn, it sends a message to Apple "hey, better have a real person take a look at these."
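The matching step the post describes can be sketched in a few lines. This is a minimal, hypothetical illustration, not Apple's implementation: the hash values, the helper name `matches_for_review`, and the threshold value are all made up, and the real system compares perceptual hashes with cryptographic thresholding rather than exact strings in a set.

```python
# Hypothetical sketch of threshold matching against a database of known-image
# fingerprints. Hash values and the threshold are illustrative only.

KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-ins for known-image fingerprints
THRESHOLD = 2  # only "enough unique identifiers" triggers a human review

def matches_for_review(photo_hashes):
    """Count how many photo fingerprints match the known set; flag at threshold."""
    hits = sum(1 for h in photo_hashes if h in KNOWN_HASHES)
    return hits >= THRESHOLD
```

A single stray match stays below the threshold; only repeated matches would prompt the "have a real person take a look" step.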
 
Curious that people don’t want Apple to “overreach” by policing CSAM because “they’re not the police,” but they’d like them to export democracy (sometimes not even 20 years of war are enough for that) to China, antagonize dictators, etc.

Whoever said that Apple should “export democracy”, and what 20 year war did Apple fight? You’re talking nonsense.
 
Clearly these organizations haven’t read all the documentation on how this program works.

However, Apple needs to take some blame for the heat given how poorly they announced/described this initially.

Imagine how many positions would become available if this botched launch had happened under Jobs’ watch!

I would definitely “promote to remove” whoever
- decided it was a good idea to announce the 3 features together
- decided it was a good idea not to remove that NCMEC “screeching voices” quote from the internal memo

On the other hand, it was a good call to do this in August on a weekend.
 
The worried groups are not going to name names, but in my opinion they are looking at Russia and China when they talk about governments subverting CSAM detection and using it for other means under those countries' laws. Russia has laws banning and preventing information about homosexuality. China has laws to prevent whatever they want to prevent from being seen, especially if it's anything negative about the CCP.
 
Thing is, as sick as CP is, it's just another thing that is illegal. There are lots of things in this world that are illegal. Animal abuse is illegal, dealing drugs is illegal, trespassing is illegal... Why aren't we scanning for that type of stuff too, if we're just looking at doing the morally right thing here? I can understand the suspicion.
 
Whoever said that Apple should “export democracy”, and what 20 year war did Apple fight? You’re talking nonsense.
Some people talk like Apple should antagonize Xi head-on.

I say I’m confident Apple will champion privacy in functioning democracies. Elsewhere, it is what it is, I certainly hope they resist as much as possible but I don’t expect them to overthrow governments.

20 years of war not being enough to make a change was a reference to the failure that was Afghanistan for the US and their allies.
 
The software scans your cars and hills, but doesn't call home to Apple.

Now, if the software notices a unique identifier (or enough unique identifiers) of known child porn, it sends a message to Apple "hey, better have a real person take a look at these."
Exactly!! So the scanning already happens, not new at all! What is new here is that AI will report what it perceives to be child porn, which seems to be what people are having issue with.
 
What are the chances they completely drop this after all the negative reactions or at least withhold it for future release?
 
Another thread destined to be jam-packed with misinformation, I see! All the armchair privacy advocates on MacRumors claiming that any unencrypted mass scanning in the cloud is better than encrypted client-side methods. Hilariously misinterpreted and the absolute epitome of uninformed bandwagon jumping.
Go read up a bit on the various privacy-conscious methods of doing this stuff, and compare that with what Apple is doing (by reading Apple's white paper) versus what virtually every other company does.
I don't want to be searched without a warrant. Simple as that. Especially by the guy who says privacy is a human right.
 
Except the Taliban have a terrible track record and Apple has (mostly in functioning democracies) an industry-leading track record of championing privacy?
[attached image]
I didn’t find any difference.
 
I think the Siracusa segment on the latest ATP is a good listen. If you read all the docs, I think it's clear Apple does not want the ability to ever see a person's photos. Right now Apple has the encryption key to view any photos uploaded to iCloud, and it sounds like they do not want this ability.

Because Apple is so secretive though, it's hard to know if this is one feature in set of features that will overall make iCloud more secure.

One thing we do know is that this was a complete fail by Apple PR with how this was announced.
It's not a PR department fail. They can announce it however they want. It is wrong to do.
 
Scan whatever you can on the server. I know much of the traffic will be encrypted. But Apple needs to remove any potential for a back door on the device.

I hope this is the nail in the coffin of this dumb idea by Apple that no one was asking for.
Which is basically what it is doing.

The scan will only happen for photos being uploaded to the server anyway.

If your photos are not due to be uploaded to iCloud, they will not be scanned.

The client-side scan is an additional step that provides additional metadata with the photo that was already going to iCloud.

Don't want your picture scanned? Don't save your photo to your photo library; store it in a separate app instead, or externally.

This is just an improvement on already existing server-side scans, for photos that would have been scanned server-side no matter what.

Pretty much all cloud services do scanning, so why does one extra scan prior to upload make any difference?
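The flow described above can be sketched as follows. This is a toy illustration under stated assumptions: the function names `hash_photo` and `prepare_uploads` are my own stand-ins, and SHA-256 substitutes for the perceptual hash the real system would compute and attach as metadata alongside the upload.

```python
# Illustrative sketch (not Apple's implementation): fingerprint only photos
# that are already queued for cloud upload, attaching the hash as metadata.
import hashlib

def hash_photo(data: bytes) -> str:
    # SHA-256 stands in for "some fingerprint computed on device";
    # a real system would use a perceptual hash, not an exact one.
    return hashlib.sha256(data).hexdigest()

def prepare_uploads(photos, icloud_enabled):
    """Attach a fingerprint only to photos actually bound for the cloud."""
    if not icloud_enabled:
        return []  # nothing is uploaded, so nothing is scanned
    return [{"photo": p, "meta": hash_photo(p)} for p in photos]
```

With iCloud Photos off, the list of prepared uploads is empty, which is the claim the post is making: no upload, no scan.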
 
Which is basically what it is doing.

The scan will only happen for photos being uploaded to the server anyway.

If your photos are not due to be uploaded to iCloud, they will not be scanned.

The client-side scan is an additional step that provides additional metadata with the photo that was already going to iCloud.

Don't want your picture scanned? Don't save your photo to your photo library; store it in a separate app instead, or externally.

This is just an improvement on already existing server-side scans, for photos that would have been scanned server-side no matter what.

Pretty much all cloud services do scanning, so why does one extra scan prior to upload make any difference?
Then why not do it only on the server side? Why the scanning on our devices?

Even your logic to explain the reason fails.
 
Which is basically what it is doing.

The scan will only happen for photos being uploaded to the server anyway.

If your photos are not due to be uploaded to iCloud, they will not be scanned.

The client-side scan is an additional step that provides additional metadata with the photo that was already going to iCloud.

Don't want your picture scanned? Don't save your photo to your photo library; store it in a separate app instead, or externally.

This is just an improvement on already existing server-side scans, for photos that would have been scanned server-side no matter what.

Pretty much all cloud services do scanning, so why does one extra scan prior to upload make any difference?

Because it’s on your device. Get off my device.
 
I don't want to be searched without a warrant. Simple as that.

Where’s the warrant for explosive-sniffing devices at airports?
Where’s the warrant when doctors report a gunshot wound or a beaten wife?
Where’s the warrant when my bank reports me because somebody wired me 200k€ from abroad or because I deposited 10k€ in cash?
Where’s the warrant when Google/Facebook/MS/others mass-scan my private pictures (not actively shared with others) for CSAM on their clouds?

Sometimes companies have to report illegal activities happening on their premises, be it virtual premises or physical ones.
 