Therefore, by employing E2E encryption, people could upload massive amounts of CP to their heart's content. And that would be a-ok because Apple wouldn't know about it?
So.... are you against e2e encryption because for you, it's only used for CP?
 
Apple Inc. is the technology department of the CCP. No thinking Chinese person has been buying Apple for years, and certainly has no reason to change their mind now.
Well, I suppose there's some CCP influence, anyway... I know some Chinese people, and they know why (it's a better compromise on certain details)... sorry
 
So.... are you against e2e encryption because for you, it's only used for CP?

Huh? How in the world did you come to that conclusion? I'm fine with e2e if Apple wants to deploy it - that would be a plus in general for security.

But it sounds like what you're saying is that with e2e, people could upload tons of CP, and because Apple can't see what's uploaded, it would be just fine having CP on their servers.
 
There is probably political pressure on Apple to do something about these kinds of images. They reported very few cases last year compared to Facebook and Google.

They might at some point be forced to scan everything in iCloud, which they don't like. Faced with this possibility, they developed this system, which scans on device for users of iCloud Photo Library.

If your photo is going to be scanned anyway, it doesn't matter where the scanning happens. If it happens in the cloud, you have no control and no way to see, even indirectly, what's happening.

This system will also allow Apple in the future to implement end-to-end encryption for iCloud while still being able to say that there is very little child pornography in iCloud, alleviating some of the pressure from the US government, which wants the full access to iCloud that it has today through the legal system.
They shouldn't be reporting any images. They are not bounty hunters and they are not police. As much as I want to discourage those sorts of images it's more important that the police follow strict protocol and work independently to secure a warrant.

Maybe Apple and its mimics should offer limited legal representation for all customers to secure attorney-client privilege.
That's fine, but Apple disagrees.

Apple implicitly says: if you want to use our property, you have to agree to a scan of all photos coming to our property, whether or not you believe it benefits you.
All that suggests is that the software industry is begging for regulation.
 
I still want to know: how much of my storage space is Apple going to take up with this scanner? How much RAM will it consume? How much of a CPU hog is it? And finally, how will all this scanning affect my battery life?

If Apple want to scan stuff on their servers that’s fine, but don’t slow my phone down and use up its resources with scanners I don’t need or want.
 
If they purely did the checks in the cloud, then no matter what, you could turn off iCloud and there'd be no risk. As it stands, with them checking on device, the only thing stopping them from just scanning everything is them saying they won't.

What's stopping Apple from using iCloud backup to copy everything on your phone?
Only your trust in that they won't do it.
 
Weird, because I haven't seen a single coherent explanation of why it's a problem if your own device scans your photos for child porn, only does so if you are trying to upload onto Apple's servers, and only produces information to Apple if you have at least thirty child porn photos that you are trying to upload.
But that also makes this even more questionable. If iCloud is already scanning for these, why bring it on-device in the first place?
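For what it's worth, the "thirty photos" part isn't just a policy promise: Apple's published design wraps each match in a cryptographic safety voucher using threshold secret sharing, so the server can't decrypt anything until roughly 30 matching vouchers exist. The toy Shamir sketch below is not Apple's actual construction (the threshold of 3 and all names are illustrative assumptions), but it shows the core property: with fewer shares than the threshold, the secret stays hidden.

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is over GF(P)

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the polynomial's constant term."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

key = 123456789                       # stands in for the voucher decryption key
shares = make_shares(key, k=3, n=10)  # one share per "matching photo"
print(reconstruct(shares[:3]) == key)   # True: threshold reached
print(reconstruct(shares[:2]) == key)   # almost surely False: below threshold
```

Below the threshold, two shares determine only a line, not the degree-2 polynomial, so interpolating them yields an essentially random field element.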
 
I still want to know: how much of my storage space is Apple going to take up with this scanner? How much RAM will it consume? How much of a CPU hog is it? And finally, how will all this scanning affect my battery life?

If Apple want to scan stuff on their servers that’s fine, but don’t slow my phone down and use up its resources with scanners I don’t need or want.

Hashes are really small. I would guess this will take up about 5–15 MB of storage, maybe a little more in memory.
The system will use the neural engine in the phone, so it's probably not going to use your CPU much.

Also, they could do it while the phone is on power and you're not using it.
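To put that in concrete terms, the matching step itself is about as cheap as operations get. The sketch below is a deliberate simplification: Apple's system uses a perceptual NeuralHash (robust to resizing and re-encoding) and a blinded, private set intersection rather than plain SHA-256 digests in a local set, but the cost profile is similar — one fixed-size digest per photo plus a constant-time lookup.

```python
import hashlib

# Hypothetical database of known-image digests. Real CSAM databases hold
# perceptual hashes distributed in blinded form, not raw SHA-256 values.
known_hashes = {
    hashlib.sha256(b"known-bad-image-bytes").digest(),
}

def matches_known(image_bytes: bytes) -> bool:
    # One 32-byte digest per photo and an O(1) set lookup: negligible CPU
    # and memory compared to simply decoding the image for display.
    return hashlib.sha256(image_bytes).digest() in known_hashes

print(matches_known(b"known-bad-image-bytes"))  # True
print(matches_known(b"holiday-photo-bytes"))    # False
```

A database of a million such 32-byte digests would be about 32 MB raw; compact encodings can shrink that considerably, so a low-tens-of-megabytes footprint is plausible.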
 
I took photos of my 3-month-old daughter taking a bath for her first-year album. Does that mean this algorithm will consider them abuse/a crime/I don't know what, and remove those photos from my iCloud and iPhone? What have we done with this world?
I think if you uploaded the pics to a public-facing site like IG or FB, someone could conceivably flag them / have them added to a CSAM database. The public at large doesn't get a vote on what is or isn't p#; the definition is set by the people who have access to the database.

At that point, if the pics are in the database and you have iCloud enabled, yeah, the pics will get flagged and the authorities will be alerted.
 
I still want to know: how much of my storage space is Apple going to take up with this scanner? How much RAM will it consume? How much of a CPU hog is it? And finally, how will all this scanning affect my battery life?

If Apple want to scan stuff on their servers that’s fine, but don’t slow my phone down and use up its resources with scanners I don’t need or want.
Look at Activity Monitor in macOS (face detection in Photos is essentially the same hashing process). It runs intermittently and consumes processing power.
 
Again, it is happening on your Apple device, scanning your personal photos, and again you can't trust their bs because you don't have access to the source code and algorithm.

You are just trusting a company that used to call privacy a human right. Can you understand that all their rhetoric is just bs?

If you can't trust Apple, how can you know they won't misuse any other feature of iOS?
They could turn on iCloud backup secretly and copy everything from your device without you knowing.
 
If you can't trust Apple, how can you know they won't misuse any other feature of iOS?
They could turn on iCloud backup secretly and copy everything from your device without you knowing.
They simply need to cross-check hashes, which means very little traffic. If you are interested, check the photo library structure in macOS.
 
Why on earth are you going on about iCloud settings

Maybe if you go back and read my post in the context of what it was a reply to, the answer would be patently obvious.

, because the software is on your hardware rather than on iCloud.
Apple could have chosen to do all the checks via iCloud and its servers, and the fact that they haven't makes it far more of a concern: whether you use iCloud or not, that code is still on your system and still capable of being modified at any time. And if there was no intention of doing that, why are they seemingly so intent on installing the software on customers' hardware rather than doing it via iCloud?

If you think Apple is out to get you, then nothing I can say will sway you from that irrational fear. The reasons why Apple is moving the scanning process to devices have already been explained ten thousand times.
 
Hashes are really small. I would guess this will take up about 5–15 MB of storage, maybe a little more in memory.
The system will use the neural engine in the phone, so it's probably not going to use your CPU much.

Also, they could do it while the phone is on power and you're not using it.
100% correct! The same is already running in macOS ("face detection" and more…).
Grain varieties are recognized too, and much more. Since I don't own any child porn, I don't know whether this has also been done secretly.
 
No, I'm saying all of your photos have been scanned since day 1 of iPhone. Granted, they were scanned for different reasons (applying effects, indexing, etc.), but they've been scanned since day 1.
Then why move it on-device? Leave it on iCloud. This just screams government pressure. And I am not the conspiracy type, but I would NOT be surprised if they are scanning for something else and just publicly stating it's for CSAM. We have no way of verifying, after all. National security might mean Apple cannot say ONE THING about the true intentions.
 
Then why move it on-device? Leave it on iCloud. This just screams government pressure. And I am not the conspiracy type, but I would NOT be surprised if they are scanning for something else and just publicly stating it's for CSAM. We have no way of verifying, after all. National security might mean Apple cannot say ONE THING about the true intentions.
Outside the USA, iPhones are updated at the iOS level by local carriers; everyone knows that, since Apple surfaces this update request. You'll notice it the first time you associate your iPhone with a provider. This gives governments worldwide easier access to the iPhone than to Apple's cloud, which is probably too complicated and complex to analyze.

And for the US (you know)…
 
Something is very fishy about this. The fact that Apple is, suddenly and for the first time ever, actively supporting both iOS 14 and iOS 15, and that this is NOT coming to iOS 14, just seems quite odd.
 
Some of us come from free societies where we are not used to having our civil rights stripped.

I am just amazed at how some people can defend what Apple is doing here, the inspection of their private data, and blindly trust a corporation known to concede to any Chinese government demand. You really gotta have an empty brain.

I believe Apple would be forced to scan in the cloud if they didn't implement something like this.

Then it only becomes a choice of where the scanning occurs. I don't really care where. The end result is about the same.

You have a civil right to authorise Apple to search your belongings, which is what will happen if you use some version of iOS 15, iCloud Photo Library and a US Apple ID.
 
How many of you save pictures from the internet in the Photos app? Because this is what Apple is assuming. I'm genuinely curious whether this is such common behavior.

People could be saving photos on the iPhone or in iCloud Drive. My Photos app only has pictures from my camera and screenshots.

Since Apple only matches against known CSAM pictures, they will not catch new CSAM material from the camera.
A half-intelligent pedo could create a folder on iCloud Drive called "kiddie porn", save his/her material there, and circumvent this implementation.

I wonder if Apple really thought this through. It's so easy to get around: either don't save to the Photos app (which is not the default for non-camera pictures, so no biggie) or disable iCloud Photos. How are they really protecting the children?

And pedos using iCloud Drive rather than iCloud Photos would still have CSAM in iCloud. Well done, Apple.
 
I believe Apple would be forced to scan in the cloud if they didn't implement something like this.

Then it only becomes a choice of where the scanning occurs. I don't really care where. The end result is about the same.

You have a civil right to authorise Apple to search your belongings, which is what will happen if you use some version of iOS 15, iCloud Photo Library and a US Apple ID.
Why go through all that trouble? Why not force a neighbor to break into someone's house and report what they find?
 
The software is capable of scanning more than what is synced with iCloud Photos, and it is capable of sending data whether iCloud is enabled or not. Just because that function isn't active at launch doesn't mean it won't be used.

You need to convince me and others that Apple would be willing to completely abandon the Chinese market when they are ordered to scan for other items.

The same is true for iCloud backup. Why weren't you worried when that feature was introduced?

To be honest, I don't really care what happens in China or any other country. I only care about me and the people I know. I don't care if they are treated badly by their government. Most people in China seem to be happy with their government.

Let's say Apple withdraws from China. Previous iPhone users would then have to move to phones running Chinese Android versions created by Chinese companies under strong government influence. That would probably provide even worse privacy than an iPhone with a misused CSAM detection system.

Also, why would the Chinese government be so stupid as to use such an inefficient method? There are many better technologies in the iPhone today that they could misuse.

And yet, you don't seem to be worried about those features.
 
How many of you save pictures from the internet in the Photos app? Because this is what Apple is assuming. I'm genuinely curious whether this is such common behavior.

People could be saving photos on the iPhone or in iCloud Drive. My Photos app only has pictures from my camera and screenshots.

Since Apple only matches against known CSAM pictures, they will not catch new material from the phone.
A half-intelligent pedo could create a folder on iCloud Drive called "kiddie porn", save his/her material there, and circumvent this implementation.

I wonder if Apple really thought this through. It's so easy to get around: either don't save to the Photos app (which is not the default for non-camera pictures, so no biggie) or disable iCloud Photos. How are they really protecting the children?
Is there another way to save pictures from the internet without them being accessible by the Photos app? I just tried with a photo and didn't even see an option to save anywhere else. How do you do it?
 
How many of you save pictures from the internet in the Photos app? Because this is what Apple is assuming. I'm genuinely curious whether this is such common behavior.

People could be saving photos on the iPhone or in iCloud Drive. My Photos app only has pictures from my camera and screenshots.

Since Apple only matches against known CSAM pictures, they will not catch new CSAM material from the camera.
A half-intelligent pedo could create a folder on iCloud Drive called "kiddie porn", save his/her material there, and circumvent this implementation.

I wonder if Apple really thought this through. It's so easy to get around: either don't save to the Photos app (which is not the default for non-camera pictures, so no biggie) or disable iCloud Photos. How are they really protecting the children?

And they would still have CSAM in iCloud by using iCloud Drive rather than iCloud Photos. Well done, Apple.
Yes, it's too obviously not about CSAM.
 
Is there another way to save pictures from the internet without them being accessible by the Photos app? I just tried with a photo and didn't even see an option to save anywhere else. How do you do it?
I don't usually save pictures from the internet. But you can use Share and save them via the Files app or any app in the share menu.
You can actually save directly from the internet to iCloud Drive and still have CSAM in iCloud without them looking for it.

Or copy and save it elsewhere.
Some browsers also allow you to save directly within the browser.
 