Please provide a source for this other than the speculation of others.

This is a grand idea, and would be great. But Apple has given zero evidence this is their end goal. This has mostly been postulated by those trying to find a logical reason for Apple's methodology.
Well, watch the What's New in CloudKit WWDC video and you'll see they're adding the ability for devs to end-to-end encrypt user data with very little effort. Of course, it's hard to say what their plans are, but it definitely indicates direction and a commitment to privacy.

 
My point was and is clear — if this is a big issue to the majority of users, why is there such a low number of petition signatures?
Believe what you want to believe, but the numbers don’t support this being a concern for the vast majority of users.
I will give you an example, because I have made my decisions and am willing to share my point of view about the numbers.

In January 2020 I received a warning from a friend about the horrible situation in Wuhan; he gave me a collection of clips that had been censored from all the search engines. My reaction was skeptical at first, but I started researching the issue. Not long after this, I checked all the available information about SARS-1 and MERS from the respective sources and peer-reviewed publications.
To my big surprise, the World Health Organization reacted with "Don't stop international flights, there is no danger."
My immediate reaction was to check how Hong Kong had dealt with SARS-1.
I started warning my friends and colleagues, only to be ignored and called "paranoid." Did I care? No.
Why? Because people want to believe in the best outcome; they are not prepared psychologically to see the world as it is.
I closed my office in March, changed workflows and moved to mandatory remote production.

The results are clear. Instead of taking government subsidies, my business is booming. My colleagues started thinking rationally and followed the prescribed and proven procedures of social distancing and a high level of hygiene.

The situation with CSAM on-device scanning is similar. There are informed people and uninformed people. There are technically educated people without system design skills, and there are people who know what this is all about.
And finally there are regular users who are so addicted to Apple as a virtue-signalling outlet that they are ready to accept anything that comes from Cupertino as holy gospel.

And mind you, a lot of people understand that petitions don't work; the real solution is to leave the Apple ecosystem, not to update, and to take privacy into your own hands.
We were stupid and naive to believe that the biggest corporation on Earth would care about user privacy, and now we are dealing with the results. Globally. Just as in the pandemic situation.

So good luck with the numbers. They represent a smaller fraction of a fraction of privacy conscious Apple users.

To quote Mr. Cook as a final remark:
Cook said that privacy is "one of the top issues of the century" and that it's important to put "deep thinking" into that to figure out how to "leave something for the next-generation that is a lot better than the current situation." Cook said privacy "should be weighted" like climate change, another huge issue the world is facing.

On the topic of why people should care about their privacy, even when there's nothing to hide, Cook said that he tries to get people to think about living in a world of constant surveillance, something that Apple did this morning with the release of a document called "A Day in the Life of Your Data" that details how third-party companies track user data across websites and apps.
 
Nice job trying to explain. It still comes down to “Before iCloud”. Just not when.
So are they prescanning/hashing the photo so it waits in a queue until the next upload? Or are they executing the whole process only when the upload is triggered? I could see the former for a large photo album and the latter for small albums.

In the original interview it was said that no on-device scanning occurs until the upload begins.
Then we get a doc from Apple that says "before iCloud".
Then we get more detailed language that can be interpreted a couple of ways.
None of them answer the question: when?

I've read some of the technical document as well as consulted with some technical experts in this (yes, I realize how generic that sounds lol). From what I understand, you could think of it like this.

Right now, all of your photos are sitting at the train station - i.e. the Photos app on your phone. As soon as you turn on iCloud photos, or choose to manually upload some photos to iCloud, those photos hop onto a track that runs through the hash scanning system before they leave the train station. Otherwise, they're just sitting idle at the station with no hash scanning being done.
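If it helps, here's a rough sketch in code of that train-station idea. It's purely illustrative: the type and function names are made up, and a plain SHA-256 stands in for Apple's actual NeuralHash. The point is simply that nothing gets hashed until the photo is actually on its way out.

```swift
import CryptoKit
import Foundation

struct PhotoUploader {
    var iCloudPhotosEnabled: Bool

    // Hypothetical stand-in for Apple's perceptual hash; here just a SHA-256 of the bytes.
    func perceptualHash(of imageData: Data) -> String {
        SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
    }

    func upload(_ imageData: Data) {
        guard iCloudPhotosEnabled else {
            return // photo stays at the station: no hashing, no upload
        }
        let hash = perceptualHash(of: imageData)   // hashing happens on the way out...
        send(imageData, safetyVoucherFor: hash)    // ...as part of the same upload step
    }

    func send(_ data: Data, safetyVoucherFor hash: String) {
        // placeholder for the actual network upload
    }
}
```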
 
Well, watch the What's New in CloudKit WWDC video and you'll see they're adding the ability for devs to end-to-end encrypt user data with very little effort. Of course, it's hard to say what their plans are, but it definitely indicates direction and a commitment to privacy.


That does help, and I do agree that given their history, it's a logical path. But why not just come out and say it? That could help their case with this whole thing.
 
Nice job trying to explain. It still comes down to “Before iCloud”. Just not when.
So are they prescanning/hashing the photo so it waits in a queue until the next upload? Or are they executing the whole process only when the upload is triggered? I could see the former for a large photo album and the latter for small albums.

In the original interview it was said that no on-device scanning occurs until the upload begins.
Then we get a doc from Apple that says "before iCloud".
Then we get more detailed language that can be interpreted a couple of ways.
None of them answer the question: when?

The upload process starts as soon as you add an image/photo. I suppose it could be delayed by a slow connection. I can't think of any reason they'd do the hashes ahead of time; hashing would be quick, so there's no reason to separate it from the rest of the process. When they say "Before an image is stored in iCloud Photos," I understand it to mean that the whole thing is done in one go.
 
Well, watch the What's New in CloudKit WWDC video and you'll see they're adding the ability for devs to end-to-end encrypt user data with very little effort. Of course, it's hard to say what their plans are, but it definitely indicates direction and a commitment to privacy.


So if I am reading this correctly, if it's implemented for an app, it will allow that app to encrypt specific files and fields sent to iCloud.
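For what it's worth, here's a minimal sketch of the field-level encryption shown in that WWDC session, using the CKRecord encryptedValues API (requires iOS 15/macOS 12 or later; the record type and field names here are made up, just to show the shape of it):

```swift
import CloudKit

// Hypothetical record type and field names, purely to illustrate the API.
let record = CKRecord(recordType: "Journal")
record["title"] = "Trip notes"                        // normal field: server-readable, can be queried
record.encryptedValues["body"] = "private contents"   // encrypted field: end-to-end encrypted with keys Apple doesn't hold

let database = CKContainer.default().privateCloudDatabase
database.save(record) { savedRecord, error in
    if let error = error {
        print("Save failed: \(error)")
    } else {
        print("Saved record with an end-to-end encrypted field")
    }
}
```

One tradeoff to keep in mind: encrypted fields can't be used in query predicates or sort descriptors, which is part of why devs have to opt in per field.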
 
I'm sure we can all agree on this? Our FBI, NSA, CIA, or law enforcement would never plant an image on a political opponent's iCloud? Let's say a normal person has thousands of photos in the cloud; how often do they inventory them?
 
The upload process starts as soon as you add an image/photo. I suppose it could be delayed by a slow connection. I can't think of any reason they'd do the hashes ahead of time; hashing would be quick, so there's no reason to separate it from the rest of the process. When they say "Before an image is stored in iCloud Photos," I understand it to mean that the whole thing is done in one go.

I suspect the same.
Thx.
 
The thing is, in a way that tech is already there, just without the 'reporting' feature or the comparison against a database of 'matching' things. iOS has an indexing mechanism that allows you to search your photos for things like 'chair', 'wedding', 'summer', etc.

Is there information regarding what use Apple makes of this indexed data? Is it stored only on the user's device or also in the cloud? Is it already being shared for 'statistical analysis'?

Maybe photos are encrypted, but what about the metadata? Under some potential laws, could governments require Apple to deliver information regarding users' photo metadata?
The first version of the index was not exported from the device, and every device had to do the indexing work individually.

The current version synchronizes an encrypted data store across your account.
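For anyone curious what that kind of local indexing looks like from the developer side, here's a tiny sketch using the Vision framework's on-device classifier. This is my own illustration, not how the Photos app itself does it, and it assumes a recent SDK where the request's results are typed:

```swift
import Vision
import Foundation

// Classify a single photo entirely on-device and return labels like "chair" or "wedding".
// Nothing here is uploaded or compared against any external database.
func localLabels(for imageURL: URL) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])
    let observations = request.results ?? []     // [VNClassificationObservation] on recent SDKs
    return observations
        .filter { $0.confidence > 0.5 }          // keep reasonably confident labels only
        .map { $0.identifier }
}
```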
 
Want to protect our children? Either donate funds to the FBI team that's dealing with this issue, or talk secretly to Congress to pass a law requiring ALL who store our photos to scan for CSAM. Apart from these, GTFO of my devices!
The law you are referring to here is the CyberTipline Modernization Act of 2018.

Google, Dropbox, Facebook, Microsoft, even LinkedIn do server-side scanning of images for CSAM, and have for years.

Apple does this today for mail attachments, but cannot do so for photos because accounts are individually encrypted. There is a key escrow system for account recovery/law enforcement requests, but that is still an individual action.

Laws are squirrelly here because of the rights against search and seizure; the government can't directly force tech companies to scan. But I'm sure they'll fine them for derivative effects (e.g. finding out from metadata that a CSAM ring is operating via a shared album, but Apple can't track the users involved).
 
When you receive this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know you’re doing something right.

I think many people are truly unaware of the staggering prevalence of child abuse in society; if people knew how common and widely distributed the material is, they might throw some support behind this.

Meanwhile, your government is actively tracking your location everywhere you go; QR code check-ins show what places you visit and how long you stay. CCTV exists on every corner and every traffic light, monitoring your movement patterns through facial recognition and number plates. Every time you tap and buy something, you reveal more of yourself. Surprisingly, none of this makes people revolt in protest, when it should, and yet the idea of Apple implementing a child-protection feature has everyone crying "encryption!"
Yep. And I grew up without almost any of it….yep. Let’s just add to the above. That way, when some gubmint proposes ‘cameras not just on every front door….but in every living room’ someone will only need to add ‘…..and hardware/software/programs on your phone to catch child abusers ……’ to the tail end of your argument above to further the argument for such a position. And there’ll be no reason to wonder how we got THERE…..Yep. The more wrongs we pile up….the more right we must be.
 
Tim Cook is acting like the bad guy from Titanic who used the little girl to get into the lifeboat 😬 That's how I feel Apple is acting: using children as an excuse when in reality they want to spy on us for other reasons.
Apple is being forced to do this by powerful backers and institutions, so leave Tim alone. He's trying to keep Apple clean, so don't piss on him!
 
With all the kids being trafficked through our southern border, I would imagine all you pro-CSAM-scanning folks are also pro border wall?
I don't think anyone here is "pro CSAM," but I sure as heck am against the border wall. It will do no good and cost lots of money.
 
I think this is appropriate to mention here:

"If you think there’s a functional difference between “sent to the cloud” and “published,” then you haven’t been paying attention." - https://tidbits.com/2019/12/02/vuescan-not-the-scansnap-replacement-youre-looking-for/

In the '90s, when I found out that email can be examined by ISPs as it travels from server to server, I decided to self-censor what I wrote online. I determined I had no expectation of privacy once the data left my computer. If someone can look at my data, they will! I also have no control over how someone else uses the information that I choose to publish. What I want has no bearing on the facts.

If you really want to be private in your affairs, don't use the internet or buy a smartphone. Use paper and pencil and a safe. In other words, don't publish the information by making it accessible to the internet. You always lose some security in order to gain the convenience of what a smartphone can do. The internet was never made with privacy as an end goal; it began as ARPANET, built for sharing information and for the survivability of that information after a nuclear war.

End-to-end encryption designed so that only I have access to the key (and no company does) is the only partial answer. Even then it will take a long time to work out all of the details, because security researchers always seem to be able to break any protection scheme they see.
 
The law you are referring to here is the CyberTipline Modernization Act of 2018.
You have answered yourself here
the government can’t directly force tech companies to scan.
Apple is only required to report to law enforcement IF they know that a crime involving the sexual exploitation of children has occurred. There is no law FORCING Apple to do CSAM scanning. They're doing it of their own free will.
So yeah, GTFO off my devices!

I couldn't care less what Google, Facebook, and Microsoft are doing. I INTENTIONALLY avoid their products since I'm all in on Apple. Apple always tells us they're different, so... be different!
 
This is the only photo left in my iCloud! And I have cancelled my upgrades!
 

Attachment: clinton-painting.jpg
I keep reading that it’s not Apple‘s role to play police here and hunt criminals. At the same time you are asking them to police other countries‘ governments.

Let‘s be real. If the Chinese government wants to scan people’s photos, they will issue a law that all devices store backups on Chinese servers. And then Apple can either comply or leave the market. It’s delusional to think Apple can actually control a foreign government.
 
Well, watch the What's New in CloudKit WWDC video and you'll see they're adding the ability for devs to end-to-end encrypt user data with very little effort. Of course, it's hard to say what their plans are, but it definitely indicates direction and a commitment to privacy.

Developers could always use their own encryption for data they store in iCloud, and some already did. So for certain apps, end-to-end encryption is nothing new.
None of this means that having spyware on the iPhone is a good thing. If Apple still believes CSAM scanning will prevent harm, they should implement it in iCloud. (Even when scanning is done in iCloud, it's possible to encrypt user data afterwards, using asymmetric encryption, in a way that Apple cannot decrypt.)
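To make that parenthetical concrete, here's a bare-bones sketch of the asymmetric idea using CryptoKit. This is my own illustration, not Apple's actual design: anything sealed to the user's public key can only be opened with the private key that stays on the user's devices.

```swift
import CryptoKit
import Foundation

// On the user's device: generate a key pair. Only the public key is ever shared.
let userPrivateKey = Curve25519.KeyAgreement.PrivateKey()
let userPublicKey = userPrivateKey.publicKey

// Server side: encrypt data to the user's public key via an ephemeral key (ECIES-style).
// The server never holds a key that can decrypt the result.
func encryptForUser(_ plaintext: Data,
                    userPublicKey: Curve25519.KeyAgreement.PublicKey) throws -> (ciphertext: Data, ephemeralPublicKey: Data) {
    let ephemeral = Curve25519.KeyAgreement.PrivateKey()
    let secret = try ephemeral.sharedSecretFromKeyAgreement(with: userPublicKey)
    let key = secret.hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                             sharedInfo: Data(), outputByteCount: 32)
    let sealed = try AES.GCM.seal(plaintext, using: key)
    return (sealed.combined!, ephemeral.publicKey.rawRepresentation)  // combined is non-nil with the default nonce
}

// Back on the device: only the holder of the private key can recover the plaintext.
func decryptOnDevice(_ ciphertext: Data, ephemeralPublicKey: Data,
                     userPrivateKey: Curve25519.KeyAgreement.PrivateKey) throws -> Data {
    let ephemeralPub = try Curve25519.KeyAgreement.PublicKey(rawRepresentation: ephemeralPublicKey)
    let secret = try userPrivateKey.sharedSecretFromKeyAgreement(with: ephemeralPub)
    let key = secret.hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                             sharedInfo: Data(), outputByteCount: 32)
    return try AES.GCM.open(try AES.GCM.SealedBox(combined: ciphertext), using: key)
}
```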
 
Why would the Chinese government do that when they can simply search through all iCloud user content, as can the US government (with a warrant)? And yes, they don't have to tell you about it either. What's astounding is your lack of comprehension of basic facts.
Because it can easily be repurposed to search data that isn't being uploaded to iCloud or that may, in fact, not be image data at all.
 
You honestly believe private companies should start monitoring their users? You want an Apple Police, a Microsoft Police, a Google Police, etc., gathering info for some other private organisation so they can use it for their own purposes? Shouldn't we leave hunting down criminals to governments and law enforcement agencies, and not to shady groups that aren't governed by the laws that bind law enforcement?
While I honestly agree with you and think they should scrap this whole project because it opens up a can of worms, this whole thing makes me think that Apple knows something we don’t with regards to this and is trying to get ahead of it by implementing their own, better solution. It’s just too out of character for them otherwise. I think governments are going to start forcing their way onto our devices globally and they will use the children as the excuse, just like they always do. And don’t get me wrong, I’ve got kids of my own. But if that is the case, and that’s a HUGE if, then I’d rather have Apple take away their excuse right out of the gate to go any deeper. But by doing so they’re also opening a Pandora’s box of sorts. This whole situation is terrible and I don’t see any easy solutions. I feel like Apple can only hold off the governments for so long. It will be awful to see how this plays out because I don’t see it going well for anyone.
 