And Apple's servers will remain full of CSAM content. You really believe politicians, the FBI, and the NCMEC will be fine with that, because they won't be able to add more stuff?
What kind of *stuff* would conform to Apple's complex and very specific system of checking hashes of iPhone photos against a database of hashes (of known CP photos, from two matching sources)? A database that's integral to the signed OS.

Pictures of America's Most Wanted criminals, pictures of illegal guns, pictures of bales of drugs, etc.? Obviously that won't work.

Or is it, as some people here believe, that Apple will all of a sudden install a simple secret backdoor to get unfettered access to users' personal information in bulk, to share with the government? Which has absolutely no relation to comparing iPhone photo hashes against a list of known CP photo hashes.

Apple has had the ability to install secret backdoors since the first iPhone. Where has the outrage been since then?
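
To make how narrow that design is concrete, here's a minimal sketch of the kind of set-membership check involved. Every name and value here is a hypothetical stand-in: Apple's real system uses a perceptual hash (NeuralHash) and blinded, encrypted matching, none of which is reproduced here.

```python
# Minimal sketch, NOT Apple's implementation: flag a photo only if its
# hash appears in BOTH independently sourced databases ("two matching
# sources"); nothing else about the photo is inspected.
import hashlib

def photo_hash(image_bytes: bytes) -> str:
    # Stand-in only: a real perceptual hash tolerates resizing and
    # re-encoding, whereas SHA-256 changes completely if one bit flips.
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes, db_a: set[str], db_b: set[str]) -> bool:
    h = photo_hash(image_bytes)
    return h in db_a and h in db_b

known = photo_hash(b"bytes of a known image")
db_a, db_b = {known}, {known}
print(is_flagged(b"bytes of a known image", db_a, db_b))      # True
print(is_flagged(b"an ordinary vacation photo", db_a, db_b))  # False
```

The point of the sketch: a photo of a wanted criminal or a bale of drugs hashes to a value that simply isn't in the database, so nothing matches.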
 
  • Disagree
  • Like
Reactions: JBGoode and rme
That is what allows government agencies, etc., to bypass the 4th Amendment: if these organizations were considered government agencies, the 4th Amendment would not allow their actions. It is because they are considered private entities, and it is a PRIVATE SEARCH.

The Bill of Rights (the first 10 amendments) has never restricted private citizens or companies. Originally it only restricted the power of the federal government, and it wasn't until the 1920s that it started to be applied to the individual states.

A private search has never been a violation of the 4th Amendment, since a private party is not the federal government, an individual state, or its representative. It might be trespassing and maybe even a form of breaking and entering.

Apple can't break the 4th Amendment, and as long as they ask for permission before scanning, they won't be trespassing either.

Nothing in the cases you refer to changed the legal status of Apple searching customers' photos.

The question in these cases was what kind of warrantless search the government can perform on the basis of information from a private search. None of these cases would have happened if the law enforcement agency (or its agents) had gotten a warrant prior to performing the search.
 
Most people just don't care. See the "Will you leave the Apple ecosystem because of CSAM?" poll.
Only a small minority cares about privacy or freedom of speech.
The vast majority of people in the US or Europe would be perfectly happy living in a China-style dictatorship.
I doubt that. But I agree most are complacent and unaware of how important freedom (which includes unconditional privacy) actually is. They take it for granted, not realizing how many victims the road to freedom cost.

People would not be happy at all living in a dictatorship, imo. Unfortunately, most would realize this only when it's too late. As proven by comments here.
 
Last edited by a moderator:
What kind of *stuff* would conform to Apple's complex and very specific system of checking hashes of iPhone photos against a database of hashes (of known CP photos, from two matching sources)? A database that's integral to the signed OS.

Pictures of America's Most Wanted criminals, pictures of illegal guns, pictures of bales of drugs, etc.? Obviously that won't work.

Or is it, as some people here believe, that Apple will all of a sudden install a simple secret backdoor to get unfettered access to users' personal information in bulk, to share with the government? Which has absolutely no relation to comparing iPhone photo hashes against a list of known CP photo hashes.

Apple has had the ability to install secret backdoors since the first iPhone. Where has the outrage been since then?
Predictably not addressing the question.
 
Predictably not addressing the question.

Nice evasion.

It was me who posed the question, re the FBI not being fine without the ability to add more stuff. I'm seeking clarity on what this "more stuff" might be.

Again: what is this "more stuff"?
 
Last edited:
What kind of *stuff* would conform to Apple's complex and very specific system of checking hashes of iPhone photos against a database of hashes (of known CP photos, from two matching sources)? A database that's integral to the signed OS.
I'm going to assume you misunderstood the question.

Apple's planned CSAM-scanning spyware will be on-device. What about previously-uploaded CSAM? I believe that's what @rme meant by the "more stuff" that pedophiles wouldn't be able to add.

Here's another question for you. I've since discovered Apple is breaking the mold in another respect: even after iOS 15 is released, they're going to allow users who do not wish to "upgrade" to stay on iOS 14, and will even continue to supply critical security updates. So if CSAM prevention were really so all-fired important to Apple, why are they allowing a portion of their customer base to explicitly dodge it?
 
I have ProCamera on my iPhone, and I just signed up for their yearly service "ProCamera Up," which is only $5.99 a year. With this upgrade, I get what they call their "Private Lightbox," where you can store your photos so they will not be uploaded to iCloud Photo Library.

Here, I will let you guys see and read what they are telling their customers. You also get a bunch more stuff with the "ProCamera Up" subscription:

[Attached screenshots of the ProCamera Up subscription details]

:apple:
 
Last edited:
I have ProCamera on my iPhone, and I just signed up for their yearly service "ProCamera Up," which is only $5.99 a year. With this upgrade, I get what they call their "Private Lightbox," where you can store your photos so they will not be uploaded to iCloud Photo Library.

Here, I will let you guys see and read what they are telling their customers. You also get a bunch more stuff with the "ProCamera Up" subscription:

[Attached screenshots of the ProCamera Up subscription details]

:apple:

Great value, and it works well!
 
The on-device encrypted CSAM database contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions, i.e. not under the control of the same government.

The National Center for Missing and Exploited Children (NCMEC) is the only non-governmental organization legally allowed to possess CSAM material. Since Apple therefore does not have this material, Apple cannot generate the database of perceptual hashes itself and relies on it being generated by the child safety organizations.

The threat model explicitly assumes the possibility of non-CSAM images being included in the perceptual CSAM hash database provided to Apple: either inadvertently, such as through an error at a designated child safety organization, or maliciously, such as through coercion. It goes on and on; page 8 of "Security Threat Model Review of Apple's Child Safety Features" is where it talks about protections against mis-inclusion. I know most aren't reading any of the actual documentation and are just parroting what they're being told to think, but there's the paper, for what it's worth.
Thank you for the link. Given their description, I’m satisfied about the nature of the source material. I’ll have to read further to see about the threat model of a generated collision disguised as legal pornography, but I assume checks at the National Center would catch that.
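
For what it's worth, the two-jurisdictions inclusion rule quoted above boils down to a set intersection. A minimal sketch, assuming hypothetical jurisdiction names and hash values; this is not Apple's or NCMEC's actual tooling:

```python
# Hedged sketch of the threat model's inclusion rule: a hash ships in
# the on-device database only if it was submitted independently by
# child safety organizations in two or more separate jurisdictions.
from collections import Counter

def build_shipped_db(submissions: dict[str, set[str]],
                     min_jurisdictions: int = 2) -> set[str]:
    # submissions maps jurisdiction -> perceptual hashes submitted there
    counts = Counter(h for hashes in submissions.values() for h in hashes)
    return {h for h, n in counts.items() if n >= min_jurisdictions}

subs = {
    "US": {"hashA", "hashB"},  # hypothetical NCMEC submissions
    "UK": {"hashB", "hashC"},  # hypothetical second-jurisdiction org
}
print(build_shipped_db(subs))  # {'hashB'}: only the overlap ships
```

Under that rule, a single government coercing its local organization can only submit hashes that die in the intersection; it would need a colluding organization in a second jurisdiction as well.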
 
I have ProCamera on my iPhone, and I just signed up for their yearly service "ProCamera Up" which is only $5.99 a year, and with this upgrade, I get what they call their "Private Lightbox" where you can store your photos there and they will not be uploaded to iCloud Photo Library.

Here, I will let you guys see and read what they are telling their customers. Also, you get a bunch more stuff with the "ProCamera Up" Subscription:

View attachment 1821975

View attachment 1821976

View attachment 1821982

View attachment 1821977

:apple:
I have used ProCamera since it came out. It is a beautiful camera app. It has had its own lightbox for years. I love them even more for adding a “private” lightbox.
 
  • Like
Reactions: SRLMJ23
The Bill of Rights (the first 10 amendments) has never restricted private citizens or companies. Originally it only restricted the power of the federal government, and it wasn't until the 1920s that it started to be applied to the individual states.

A private search has never been a violation of the 4th Amendment, since a private party is not the federal government, an individual state, or its representative. It might be trespassing and maybe even a form of breaking and entering.

Apple can't break the 4th Amendment, and as long as they ask for permission before scanning, they won't be trespassing either.

Nothing in the cases you refer to changed the legal status of Apple searching customers' photos.

The question in these cases was what kind of warrantless search the government can perform on the basis of information from a private search. None of these cases would have happened if the law enforcement agency (or its agents) had gotten a warrant prior to performing the search.
Who said anything about Apple breaking the 4th? I did the opposite, showing that Apple has PRIVATE-search status under the 4th, as do the hash database suppliers, which is why that agency is not considered a law enforcement agency, as that case demonstrably shows. Please do not misquote me.
 
I don’t agree with this. Apple says they will not scan photos if iCloud is turned off, but if they can scan photos with iCloud turned on, then they can scan those same photos with iCloud turned off.

I think you’re confusing what Apple says they will do with what Apple is capable of doing.

I don’t understand what you don’t agree with. That Apple, or any software builder, has the capability to build in whatever they conceive … that is a matter of fact. Without that capability, how could they build anything in the first place?
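
To put that in concrete terms: the “only when iCloud Photos is on” restriction is, at the software level, just a conditional branch in code Apple controls. A purely illustrative sketch follows; none of these names are Apple’s actual APIs:

```python
# Illustrative only: hypothetical names, not Apple's real code paths.
# The "scan only when iCloud Photos is enabled" guarantee is a policy
# branch, not a technical impossibility.
def handle_photo(photo_hash: str, icloud_enabled: bool,
                 blocklist: set[str]) -> str:
    if not icloud_enabled:
        return "not scanned, not uploaded"     # today's stated policy
    if photo_hash in blocklist:
        return "uploaded with safety voucher"  # matched a known hash
    return "uploaded normally"

print(handle_photo("hashX", icloud_enabled=False, blocklist={"hashX"}))
# -> "not scanned, not uploaded"; remove the first branch in a future
#    update and every photo gets checked, iCloud or not.
```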

The core issue is that before, they were against backdoors. Now they have changed their minds: a backdoor for CSAM is OK for the sake of the children, as long as they are the ones doing the policing. Pair this with their past record … say, their privacy and security concerns, their refusal to aid the democratic institutions responsible for taking down criminals in the case of a killer of children … They already assume the role of regulator of digital goods on their devices while enforcing a “tax” on the over 2 billion people using them. Now they assume the role of digital police …

What’s the agenda?
 
Last edited:
  • Like
Reactions: scvrx
I don’t understand what you don’t agree. That Apple or any software builder, as builders have the capability to build in whatever are their ideas … that is a matter of fact. Without it how could they build anything in the first place?
Agree. I think people forget

[Attached image of Apple's privacy billboard slogan]


There is no asterisk saying *well, only if it is legal material, or *subject to us policing it.

That is THEIR OWN slogan. It is either true or false.

It doesn't have to be CSAM either. We can get away from the "won't someone please think of the children" argument. That's the easy target, an emotional plea to get people to go along with it.

What if you have a pic of pot or some other drug in a state where it is not legal? A felony is a felony is a felony. What's to say that at some point in the near future drug photo hashes are added (or anti-government material, etc., etc., etc.)? The door is open for Apple to add whatever hashes it wants to its check, or for a government to compel them to.

And let's not even begin to pretend Apple has not already caved to China on many issues.

What can anyone here do about it exactly if they do that? Nada. Would Apple even say anything? Maybe not; maybe they'd bury it in the TOS agreement that no one ever reads.
 
And Apple's servers will remain full of CSAM content. You really believe politicians, the FBI, and the NCMEC will be fine with that, because they won't be able to add more stuff?
But politicians, the FBI, and NCMEC could have been asking for AND receiving access to iOS and your data for over a decade now. I don’t see what’s changed now.
 
Last edited:
But politicians, the FBI, and NCMEC could have been asking for AND receiving access to iOS and your data for over a decade now. I don’t see what’s changed now.

I'm not sure what you're saying there.

Law enforcement could always get access to your iCloud WITH A WARRANT. Go to a judge with probable cause that you committed a crime and SOME evidence. Evidence, proof, the presumption of innocence: key legal concepts. Then and only then do they get your cloud data and can see whether you did or did not commit a crime, and go from there. And it's not Apple but law enforcement and prosecutors who determine whether a crime was committed.

Not knocking on your door daily to search your house just in case you've been bad. That is what this is: the equivalent of routinely searching every household daily, without any cause, to catch the few criminals, rather than doing actual police work and getting warrants for the real criminals without violating the majority's rights.

I'm really trying to figure out if society as a whole is getting dumber, or if it's a 2020s thing to argue against one's own self-interests in the name of corporations and politicians. It's rather baffling.
 
Last edited:
But politicians, the FBI, and NCMEC could have been asking for AND receiving access to iOS and your data for over a decade now. I don’t see what’s changed now.

Setting aside for a moment what jk1221 said about warrants and probable cause, are you suggesting that we've never had any privacy from Apple to begin with? What are you basing this on, any sources?

If that is the case, then why would Apple make a big deal of announcing CSAM now, if they have been scanning our data for a decade?

Lastly, are you actually trying to calm the situation by suggesting the privacy violations are in fact much worse than what we're getting agitated about?
 
  • Love
Reactions: Jemani
Setting aside for a moment what jk1221 said about warrants and probable cause, are you suggesting that we've never had any privacy from Apple to begin with? What are you basing this on, any sources?

If that is the case, then why would Apple make a big deal of announcing CSAM now, if they have been scanning our data for a decade?

Lastly, are you actually trying to calm the situation by suggesting the privacy violations are in fact much worse than what we're getting agitated about?
No. What I’m trying to say is that this CSAM thing doesn’t change anything.

Either you trust Apple will continue to safeguard your data as they claim they have been doing for decades.

Or you assume Apple will allow governments to use it for evil. But if you think that’s Apple’s intention, then they could have installed backdoors a decade ago, and your privacy would have been out the window a long time ago. Neither Apple nor the government had to wait for this CSAM thing to invade our privacy, IF that was their intent.

I trust Apple has my privacy at heart. In the past. And also now.
 
I'm going to assume you misunderstood the question.

Apple's planned CSAM-scanning spyware will be on-device. What about previously-uploaded CSAM? I believe that's what @rme meant by the "more stuff" that pedophiles wouldn't be able to add.

Here's another question for you. I've since discovered Apple is breaking the mold in another respect: even after iOS 15 is released, they're going to allow users who do not wish to "upgrade" to stay on iOS 14, and will even continue to supply critical security updates. So if CSAM prevention were really so all-fired important to Apple, why are they allowing a portion of their customer base to explicitly dodge it?

That's about giving their customers choice; an option.

I interpreted "more stuff" as others have been claiming; that once the CSAM is OK, the floodgates are open for both Apple and the government to go much further on spying and collecting anything that's on your phone. People have said that here.

Also... it's not that Apple is so "all-fired" set on preventing CSAM; it's about the pressure being applied on Apple by the government to deal with it.

The other option is giving the government access to iCloud and iPhones, either openly or via a secret backdoor. The clever system Apple came up with deals with the CSAM pressure but stops short of wider government snooping.
 
Last edited:
  • Like
Reactions: hagar
No. What I’m trying to say is that this CSAM thing doesn’t change anything.

Either you trust Apple will continue to safeguard your data as they claim they have been doing for decades.

Or you assume Apple will allow governments to use it for evil. But if you think that’s Apple’s intention, then they could have installed backdoors a decade ago, and your privacy would have been out the window a long time ago. Neither Apple nor the government had to wait for this CSAM thing to invade our privacy, IF that was their intent.

I trust Apple has my privacy at heart. In the past. And also now.

Of course CSAM changes things, because it is in contrast to Apple's earlier claims about privacy and security. Clearly we disagree here, which is fine. If you ask me, when someone pitches privacy for years and then all of a sudden says "Oh yeah, about the photo library, funny thing...", it doesn't help their credibility. I am in the group that feels CSAM is fundamentally opposed to the privacy policy Apple marketed all these years. Which is why this changes things drastically, as I cannot trust them anymore because for me they went back on their promise to leave the iPhone alone.

About this suggestion that Apple could have installed something in secret years ago, I disagree. If you don't have any source for this, why suggest that it maybe happened? There are people who search iOS code for all sorts of irrelevant things, like clues to new devices, their colors and such. If there was scanning software in iOS, it would have been detected at some point because iOS is too big and attracts too much scrutiny for Apple to be able to hide something like that. If hidden and detected, that would have been worse for Apple in terms of user trust than announcing a much more drastic version of CSAM scanning. Remember thermal throttling and think what would happen in this case.

So no, we fundamentally disagree that this CSAM announcement is Apple's good will and that they could have done without it, because their business model in part relied on users believing Apple guards their privacy. You still believe Apple, which is your right, but I don't.
 
  • Like
Reactions: Schismz
Also... it's not that Apple is so "all-fired" set on preventing CSAM; it's about the pressure being applied on Apple by the government to deal with it.

The other option is giving the government access to iCloud and iPhones, either openly or via a secret backdoor. The clever system Apple came up with deals with the CSAM pressure but stops short of wider government snooping.

I understand this idea that Apple basically succumbed to government pressure and rolled out CSAM because they had to. It is a legitimate possibility. I don't know if it's actually true (I don't think so), but it is a possibility.

The thing is, I don't care. I don't mean to sound arrogant or anything, I just don't think it's my problem. I pay Apple a premium to be in the ecosystem and to use the most secure and most private devices I can find on the market. Apple advertises this. How Apple delivers that package is not my problem. Whether they are having a field day with it or have to fight off legislation makes no difference to me. I paid to have a certain experience and safety, and I don't have it. If Apple succumbed to pressure, it doesn't change the fact that this tells me I should no longer pay for this.

Because whether Apple did this deliberately, of their own accord, or was pressured into it makes no difference to me. In either case I don't have the privacy I wanted, and I expected Apple to have the power to fight off that pressure. They either won't or can't; either way the outcome is the same: my money goes elsewhere.
 
Because whether Apple did this deliberately, of their own accord, or was pressured into it makes no difference to me. In either case I don't have the privacy I wanted, and I expected Apple to have the power to fight off that pressure. They either won't or can't; either way the outcome is the same: my money goes elsewhere.

Where are you going to go and when?
 
  • Sad
Reactions: Jemani
@zkap I agree. I don't think the whole thing is about outside pressure.

I rather believe some high-ranking execs thought this would be a good idea, for some strange reason.

What is a bit disconcerting is the fact that Apple is staying pretty mum: no response to the open letters, no explanation of their motivations, nothing.
They seem rather tone-deaf and appear to continue on this bad path no matter what. I really, really wonder why. It's tremendously stupid, nothing I'd expect from the highly smart people their engineers are.
 
  • Like
Reactions: dk001
Of course CSAM changes things, because it is in contrast to Apple's earlier claims about privacy and security. Clearly we disagree here, which is fine. If you ask me, when someone pitches privacy for years and then all of a sudden says "Oh yeah, about the photo library, funny thing...", it doesn't help their credibility. I am in the group that feels CSAM is fundamentally opposed to the privacy policy Apple marketed all these years. Which is why this changes things drastically, as I cannot trust them anymore because for me they went back on their promise to leave the iPhone alone.

About this suggestion that Apple could have installed something in secret years ago, I disagree. If you don't have any source for this, why suggest that it maybe happened? There are people who search iOS code for all sorts of irrelevant things, like clues to new devices, their colors and such. If there was scanning software in iOS, it would have been detected at some point because iOS is too big and attracts too much scrutiny for Apple to be able to hide something like that. If hidden and detected, that would have been worse for Apple in terms of user trust than announcing a much more drastic version of CSAM scanning. Remember thermal throttling and think what would happen in this case.

So no, we fundamentally disagree that this CSAM announcement is Apple's good will and that they could have done without it, because their business model in part relied on users believing Apple guards their privacy. You still believe Apple, which is your right, but I don't.

I don’t see how Apple is backtracking on its privacy stance here: they do a check before you upload files to their servers. That’s it. They don’t want to host illegal stuff on their servers, so they check by comparing photos to a list of known images. They don’t apply AI, they don’t scan the content, there’s no mass surveillance, there’s no bending to government interference … If they did server-side scanning, I would be upset and consider turning off iCloud.
 