
LeeW

macrumors 601
Feb 5, 2017
4,213
9,159
Over here
So cloud scanning is still OK according to the EFF?

If you put something on the cloud, you are transferring it to someone else's servers, whether Apple's, Google's, or your web host's. It's someone else's hardware and responsibility, so yes, a company scanning what you put on its servers is expected and should not be a surprise. Put something on the cloud that you do not want scanned? Encrypt it first.
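
A minimal sketch of what that looks like in Python, assuming the third-party cryptography package (pip install cryptography); the file names are just illustrative:

```python
# Encrypt locally, upload only the ciphertext; the provider can then
# scan all it likes and learn nothing.
from cryptography.fernet import Fernet

key = Fernet.generate_key()               # keep this key off the cloud
cipher = Fernet(key)

with open("vacation.jpg", "rb") as f:     # hypothetical local file
    ciphertext = cipher.encrypt(f.read())

with open("vacation.jpg.enc", "wb") as f:
    f.write(ciphertext)                   # upload this, not the original
```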

Quite different from coming into my private space uninvited and looking through my content.
 

mdatwood

macrumors 6502a
Mar 14, 2010
907
877
East Coast, USA
From the trial emails it's clear Apple believes they have a lot of CSAM in iCloud Photos and they don't want it there. So, starting from the premise that they are going to change something, they have two options: decrypt all the photos in iCloud and scan them server-side, or use the on-device method they presented.

The method as presented is ironically more secure and privacy-oriented than scanning in the cloud. All the arguments against it are 'what ifs'. The negative press may push Apple off the new method, but make no mistake, they are going to do something.
 

russell_314

macrumors 603
Feb 10, 2019
6,037
8,939
USA
If you put something on the cloud, you are transferring it to someone else's servers, whether Apple's, Google's, or your web host's. It's someone else's hardware and responsibility, so yes, a company scanning what you put on its servers is expected and should not be a surprise. Put something on the cloud that you do not want scanned? Encrypt it first.

Quite different from coming into my private space uninvited and looking through my content.
Exactly. There's a big difference between scanning data on my personal devices and scanning data on cloud servers owned by whatever corporation runs them. Apple has always scanned iCloud data for CSAM. It just never got much media attention because no one really cared: OK, you're catching bad guys, so go ahead.
 

mdatwood

macrumors 6502a
Mar 14, 2010
907
877
East Coast, USA
Apple has always scanned iCloud data for CSAM.
That's not completely true. They scanned iCloud email, but not iCloud Photos. That's why FB had CSAM reports in the 20M range vs. Apple's hundreds. Apple execs even said in recently released documents that they know they are a big platform for CSAM [1].

 

russell_314

macrumors 603
Feb 10, 2019
6,037
8,939
USA
From the trial emails it's clear Apple believes they have a lot of CSAM in iCloud Photos and they don't want it there. So, starting from the premise that they are going to change something, they have two options: decrypt all the photos in iCloud and scan them server-side, or use the on-device method they presented.

The method as presented is ironically more secure and privacy-oriented than scanning in the cloud. All the arguments against it are 'what ifs'. The negative press may push Apple off the new method, but make no mistake, they are going to do something.
Remotely accessing data on someone's personal device might be more secure in the sense of keeping hackers out, but it gives Apple access to those devices, and it gives governments access as well, with the proper paperwork. People keep pretending that risk isn't there, but it is.

Also, this only scans against known CSAM images. At least in the current implementation, there is no image processing to detect what an image is of; if someone were to use an iPhone to create CSAM, or simply upload it to store on iCloud, it would not set off any alarms (see the sketch at the end of this post).

I would be fine with iCloud scanning, but installing the capability to remotely scan my device is not acceptable. While Apple claims it will be used only for CSAM, there's nothing stopping them from using this capability to scan against a different database. Even the current CSAM database gets updated, so it could be updated with pictures of certain dissidents in China. You might say they would never do that, but that's where you're wrong: it's Apple's policy to comply with legal requests from the governments of the countries they do business in. They gave the Chinese government full control over the iCloud servers in China…
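
To make that concrete, here is a toy sketch of what matching against known images amounts to. This is not Apple's actual NeuralHash pipeline (a perceptual hash with a blinded database); it's a plain set lookup with made-up placeholder hashes, just to show that the matcher itself is content-agnostic:

```python
import hashlib

# Hypothetical database of known-bad image hashes (placeholders).
# Swap this set for any other list and the identical code flags
# whatever that list targets; the code never "understands" images.
known_hashes = {"9f2b...", "4ac1..."}

def flag_if_known(path: str) -> bool:
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest in known_hashes  # a lookup, not a classifier

# A newly created image hashes to a value that is in no database yet,
# so it can never match: exactly the limitation described above.
```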
 

russell_314

macrumors 603
Feb 10, 2019
6,037
8,939
USA
That's not completely true. They scanned iCloud email, but not iCloud Photos. That's why FB had CSAM reports in the 20M range vs. Apple's hundreds. Apple execs even said in recently released documents that they know they are a big platform for CSAM [1].

Even so, this is not a reason to put a backdoor in the iPhone. Basically, they're giving the government access to your iPhone under the premise of protecting children, but we both know it will be used for much more than that.

Unfortunately it's inevitable, so it is what it is. We can talk about this all day and still have zero control over the outcome. Right now there are two mega-corporations that control every phone in the world, and neither will stand up to any government, because they can't. I think the reason people are upset is that their iPhone was the last thing their government didn't have access to, and of course that was going to go away sooner or later. In the USA, both Republicans and Democrats were pushing hard for Apple to install a backdoor, and we're seeing it now.
 

hotstreaks

macrumors member
Sep 18, 2016
66
88
I expected the macros to rock this thread. Thank you all for the early-morning entertainment.
 

msackey

macrumors 68030
Oct 8, 2020
2,501
2,924
Has anyone investigated the origins of Apple's decision to attempt this whole CSAM thing? It seemed to pop up out of nowhere, but I'm guessing there's a history here that's largely not public. Does anyone know of an article that covers it?
 

nikaru

macrumors 65816
Apr 23, 2009
1,119
1,393
What part of this is NOT a good idea?! You do realise, don't you, that your own precious cat, dog, or family photo will not actually be looked at?! It simply compares the hash of each photo against already-identified hashes of indecent images of children! It doesn't actually look for what it thinks are indecent images, just the hash!
The problem is that this same technology could be used for other purposes far less honorable than catching pedophiles. The whole point is that back doors are not good, period. If you want to catch pedophiles, that's good, but don't do it by putting in place a massive data collection from people's smartphones or cloud storage, even if it's only the hash. If, for example, a hacker breaches your iCloud password, they could upload this type of indecent image, and you would have a hard time explaining to the cops that those photos are not yours. This same technology could also be used for other purposes if you substitute child abuse images with, say, images of political pamphlets. Virtually half the world's population lives under some level of totalitarian regime where only a few care about privacy or democracy, so the iPhone should be a fortress as far as personal information is concerned, IMO.
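
For anyone unsure what "comparing the hash" means in practice, here is a toy average-hash in Python, assuming Pillow is installed. Apple's NeuralHash is far more sophisticated, but the principle is the same: the image is reduced to a fingerprint and compared against a list, and no classifier ever judges what the picture depicts, which is also why the contents of the list are the whole ballgame:

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str) -> int:
    """64-bit fingerprint: one bit per pixel of an 8x8 grayscale thumbnail."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)  # above-average brightness -> 1
    return bits

def hamming(a: int, b: int) -> int:
    # Small distance means visually near-identical images (a "match").
    return bin(a ^ b).count("1")
```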
 

russell_314

macrumors 603
Feb 10, 2019
6,037
8,939
USA
The problem is that this same technology could be used for other purposes far less honorable than catching pedophiles. The whole point is that back doors are not good, period. If you want to catch pedophiles, that's good, but don't do it by putting in place a massive data collection from people's smartphones or cloud storage, even if it's only the hash. If, for example, a hacker breaches your iCloud password, they could upload this type of indecent image, and you would have a hard time explaining to the cops that those photos are not yours. This same technology could also be used for other purposes if you substitute child abuse images with, say, images of political pamphlets. Virtually half the world's population lives under some level of totalitarian regime where only a few care about privacy or democracy, so the iPhone should be a fortress as far as personal information is concerned, IMO.
Unfortunately, while most people talk about privacy, those same people are willing to give it up if the government promises them some form of safety. And without any real competition in the smartphone market, I believe this is a lost cause.
 

russell_314

macrumors 603
Feb 10, 2019
6,037
8,939
USA
Meanwhile Google et al are glad Apple has drawn the attention away from what they are doing with your images stored on their services.
I think Google has so many images now that they have already trained their AI. That's why they're now charging to store your pictures. It was free before because you were giving them something of value, but now they don't need it anymore.
 

Zab the Fab

macrumors regular
Nov 26, 2003
145
121
Promoting one's own moral vanity is much easier than forming an educated opinion. But I know a few of you out there actually care more about child safety. Here's the Rene Ritchie video that changed my mind:

I fully support Apple's technology, and it cannot come soon enough. And it will.
 

IIGS User

macrumors 65816
Feb 24, 2019
1,094
3,068


Nothing to see here. No problem with iOS 15. No one is concerned with the scanning of the pictures on your phone.

The SkyTrolls are not real.

LOL
 

bmac89

macrumors 65816
Aug 3, 2014
1,388
467
I think Apple will skip this one. The risk for them is too great. It makes no sense. Why use the photo library of nearly a billion devices to catch a few criminals? Let the people who get paid by our taxes do their job without using our devices.
Perhaps they are being forced to implement this directly by a government organisation, and Apple has little choice in the matter. Maybe this is even a compromise Apple negotiated behind closed doors as an alternative to something even more invasive. Who knows! If they don't back down despite continued backlash, and possibly lost sales, then it would seem reasonable to think this is coming from outside Apple.
 

0924487

Cancelled
Aug 17, 2016
2,699
2,808
Tim: “Okayyy, where on earth should I send the DMCA to this time around?”

Apple Lawyers: Crickets…
 