If you told your neighbor that you have a telescope pointed at her window just to make sure nobody is going to assault her and she agrees that it's okay, then it's perfectly fine.

Also, she has the ability to close her blinds (aka, turn off iCloud Photos).

Actually it isn’t. Depending on your location, that can be deemed stalking.
But I digress - now back to our topic.
 
What if you got it in writing? Like, you know, she signs an agreement?

After all, we all agree to the terms and conditions when installing software on our devices. That is our consent.
 

Good question and I don’t have an answer.

I have been involved in a few civil cases where what seemed legally apparent turned out not to be after a judge got done with it. Even on appeal. That can vary by location.
 
Regardless of the telescope thing... we all consent to the software that is installed on our devices. If we don't agree with it, we must click decline and cancel the install.

I think everyone's biggest fear about this is that Apple will flip a switch at some point and scan your photos regardless if you're using iCloud or not.

If that were actually a valid point, then couldn't Apple flip a switch and make it so your photos are uploaded to iCloud regardless if you have it enabled or disabled? If you really think they're that nefarious, then they absolutely could do that. They'd get caught pretty quickly though and it would be featured on every tech website.
 

Yes and no. I do think a lot of folks are very concerned over that potential aspect. Apple’s TOS / EULA is a mix of specific and broad. “We can” or “Reserve the right” and other similar phrases leave Apple with a lot of latitude. And that’s assuming the language is clear-cut and a layman can understand it.

The bigger picture for many of us is trying to understand “why” this has to be the solution and “why” on our personal devices. Concern for precedent.

As for scanning our photos, Apple already does. What Apple doesn’t currently do is report the output of that scan to the authorities when it is deemed “possibly illegal”.
 
What I don't get is how this is any more a back door than anything else on the iPhone? Since it's a closed system and Apple controls the software on it, couldn't they do anything with the code and simply not tell us?

Hell, the fact that they're telling us it's there is both a good thing and a bad thing. It's a bad thing because now people think they're being spied on (they're not) and it's a good thing because it shows that they're being transparent and honest with what they're doing.

I'm obviously not against the scanning since it's completely optional and there's no evidence that it will be forced.

I also don't understand how people think having all of their photos being unencrypted and scanned on someone else's servers is somehow BETTER than only unencrypting the very rare false positive. I actually feel safer with the new system because I know the scan data only stays on my device unless there's a match (and I guarantee there won't be), so in that sense, none of my photos will ever be seen by any person.
 

It isn’t a black and white situation, nor are the concerns all black / white.

1. Backdoor - maybe. It could be argued that way. Knowing how it is really built would put that question to rest.
2. Legal backdoor - Did Apple lose its constitutional or legal protection against being forced to build something, now that a method already exists?
3. What in this design prevents Apple from scanning some other target on phone/tablet/laptop/PC and bouncing it against a target file?
4. Is this a standalone item, or does it tie into other phone functions? Is it part of a bigger design?
5. Why does this have to be on device instead of in the cloud like everyone else?
6. Is Apple’s claim “we won’t allow it” good enough? They have caved before.
7. …

There are so many concerns and questions. I would love to see Apple and a bunch of knowledgeable peers plus other experts take some time and dive into this.
 
So in reality, everyone's problem with this is that they just don't know enough about it.

Everyone jumping the gun and switching to other devices and operating systems is just panicking over "who knows what the future holds".

Interesting.
 

IMO I think that is a big part of it.
People are acting based on what we know. Since Apple is remaining silent on this, folks have to go with what they know. Whether you stand pat, switch, or continue to dig, you are taking an action based on what you know.
 
Give me a break.
Unlikely ;)

When you were a young child, and you wanted the toy that another kid was playing with, you had to learn a hard truth—that the other kid’s freedoms sometimes encroached on yours. That inconvenient truth doesn’t magically disappear as the issues get more complex.
That example fails on two points: First, children are not free. They are (or used to be) constrained in their freedom by parents and guardians as they're taught the limits of freedom and the rights and responsibilities attendant upon exercising it. Second, freedom doesn't mean doing whatever you want to do, regardless of the outcome or the impact of your actions upon others.

I’m aware that the tide of opinion is against Apple on this one. Mostly, it seems to come down to an underlying philosophy that the contents of one’s private possessions are sacrosanct. It’s a gross oversimplification in my opinion.
Huh. The Founding Fathers of the U.S. didn't find the concept a gross oversimplification.

Consider the house where you live. It’s your private space, a safe place where no one has any right to invade, right? You are free to enjoy this privacy and safety, and the law protects that right. But let’s say you decide what you really want to do with your freedom is have a slave in your basement.
This example also fails on two points: First, it's a fallacious argument (straw man) because it clearly involves one individual infringing upon another individual's freedom (and in a most egregious way). Second, it's a false equivalence: my not having my images scanned does not infringe on anybody else's freedoms.

I can guess what you’ll say next... That Apple’s technology would be like the government putting 24 hour video surveillance in all our homes.
You guessed wrong :)

No, what Apple wants to do would be more like Chevy wanting to put passenger scanners in all their cars to preempt the possibility of them being used to kidnap children.

Ok, so: I will agree with you that for a society to function there must be limits on freedom. People are imperfect. Many are callous, selfish, have no conscience, etc. Others are simply unable to foresee the consequences of their actions. So we make laws. Laws are prior restraint. Prior restraint is generally considered a bad thing, but it's an unfortunate necessity. Because it is the antithesis of freedom, I submit that all prior restraint must be carefully considered lest we find ourselves throwing the baby out with the bath water, employing a cure worse than the disease. In judicial parlance it's called "strict scrutiny."

In my opinion what Apple proposes to do does not pass the test.

Would you be okay with Apple checking your photos after they are uploaded to iCloud?
No. I'm never "ok" with anybody pawing through my stuff, at any time, or for any reason, without cause. (In legal terms: Probable cause, or at least reasonable articulable suspicion [RAS].)

Even if this means you can never have end-to-end encryption with iCloud?
I don't accept the premise, but, for the sake of the argument I'll go with "yes."

Or do you believe that Apple (and other service providers) should have no duty whatsoever to monitor its servers for CSAM?
If, by "monitoring," you mean "scanning users' private files and data for it": Yes. I believe their duty is to provide products and services to their customers in exchange for fees. My moral and ethical code would demand I report it if I saw it, but I wouldn't go looking for it and I will not impose my moral code on Apple or its other customers.

Btw: There was a similarity between Apple and me while I was still employed as a systems and network administrator. I had managers attempt to get me to poke through employees' files and email looking for things. I outright refused to do it. (This despite an I.T. policy that I had written clearly stating "If it's on the company's systems, or it's on or crosses its network, it belongs to the company.") I did, in one instance, accede to such a request, when the manager in question was able to express RAS. And even then I required both the CEO and HR to sign off on the request, in writing. (Yes: It turned out the requested investigation was justified.)

Here's the difference between Apple and me: I had my employer's and my coworkers' utter trust.

I take privacy very seriously.
 
then couldn't Apple flip a switch and make it so your photos are uploaded to iCloud regardless if you have it enabled or disabled?
They've done that before! I reckon a couple of times Apple have turned on iCloud Photos on me after a major update. I hate that Apple resets settings after a major update. I definitely know that Photo Stream was turned on for me recently after an update when I 100 percent didn't have it on. I will say, though, that it seems much better than it was several years ago.

In saying that, I'm not disagreeing with you. I reckon this is a really complicated issue that is way above my second-rate PI pay grade. Honestly though, I'm more concerned with the fact I'm two months behind on the rent on this lousy, third floor office in a building at the ass end of town. So I pour myself a drink; it's 10:53am, almost midday. The first drink of the day always tastes the best. It's like your mind and body forget you're an alcoholic and you can actually taste the bourbon instead of craving it. I flip cards in a game of solitaire. Neat lines of red and black. Just like my life used to be, all my ducks in a row. But now my life is messy, so I pull the drawer and linger on the only clean thing in this office, my iPhone. I contemplate the easy way out, slam the drawer shut and get an Android, but I got unfinished business. So I grab the iPhone from the drawer, turn it on and marry myself to another day in purgatory.
 
Wait til you find out what your VPN knows 😉

Is there really a need for your all-knowing, snarky responses? I am fully aware of what my VPN provider knows; remember, I'm the one who gives it to them. The question is what do they do with it? My ISP 100%, for sure, sells that info. I do not believe my VPN does; if they do, then the joke is on everyone using their service. But just as you are advocating for "trusting" Apple to do the right thing(s) with this CSAM technology, I do the same with my VPN.
 
That was exactly my point.
 
@zkap: Hey, first of all, thanks for the respectful and thoughtful discussion. I appreciate it.



Again there are really two issues here, the ‘install this on my phone’ part, and the ‘reports to the outside’ part. I’ll start with the latter. When someone uploads content to iCloud, that content is in Apple’s possession. It is illegal to possess CSAM, and the issue is deemed so serious that mandatory reporting of child abuse overrules other rights that a person would ordinarily have, such as doctor-patient confidentiality (at least in some countries). Now I’m no lawyer, so I can’t really say what Apple’s legal responsibilities are here, but they have been very poor reporters up until now, and part of the problem is a technical one. How do you scan for abusive content when it’s all encrypted? This brings us to the first issue. The obvious solution is to assess an image before it gets encrypted. By doing this on the device, Apple can actually improve privacy and security for all its law-abiding customers by introducing end-to-end encryption for iCloud uploads. That’s only speculation at this point, but it seems perfectly reasonable.
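
To make the ‘assess before it gets encrypted’ idea concrete, here is a minimal sketch of the flow. To be clear, this is not Apple’s actual design (NeuralHash and the private set intersection protocol are proprietary and far more sophisticated); a trivial average-hash and a plain set lookup stand in for them, just to show the order of operations: hash on device, record the match result, then encrypt and upload.

```swift
typealias PerceptualHash = UInt64

// Toy stand-in for a perceptual hash: an 8x8 "average hash", where each
// bit records whether a pixel is brighter than the thumbnail's mean.
func averageHash(grayscalePixels: [UInt8]) -> PerceptualHash {
    precondition(grayscalePixels.count == 64, "expects an 8x8 thumbnail")
    let mean = grayscalePixels.reduce(0) { $0 + Int($1) } / 64
    var hash: PerceptualHash = 0
    for (i, pixel) in grayscalePixels.enumerated() where Int(pixel) > mean {
        hash |= 1 << UInt64(i)
    }
    return hash
}

// In the real protocol the device never learns the match result at all;
// it is blinded and only becomes readable server-side past the threshold.
struct SafetyVoucher {
    let imageID: String
    let matched: Bool
}

func prepareForUpload(imageID: String,
                      thumbnail: [UInt8],
                      knownHashes: Set<PerceptualHash>) -> SafetyVoucher {
    let h = averageHash(grayscalePixels: thumbnail)
    // The full-resolution image would be end-to-end encrypted after this
    // point; only the voucher carries the match information.
    return SafetyVoucher(imageID: imageID, matched: knownHashes.contains(h))
}

// Demo with made-up data: a flat gray thumbnail matches nothing.
let knownBad: Set<PerceptualHash> = [0xDEADBEEF]
let thumb = [UInt8](repeating: 128, count: 64)
print(prepareForUpload(imageID: "IMG_0001", thumbnail: thumb, knownHashes: knownBad))
```

The point of the design, if it works as advertised, is that the cloud side never needs plaintext access to non-matching photos, which is what would make end-to-end encrypted iCloud uploads compatible with scanning.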

Look, I know I’m talking technology again, and this is where we get stuck. I simply can’t see how we can discuss the pros and cons of this without talking about the way it’s been designed and why. It’s technology that both threatens and protects our digital privacy—we really can’t get away from that.


That’s really not a new phenomenon. I’ve been around for a long time now, and Apple has always had its issues. Snow Leopard was probably the pinnacle of reliability for me, and I would love to see that restored, but I also acknowledge that their OSes are far more complex than they used to be, and that is going to come with some issues, programmers being people and people being fallible.

It’s a valid concern though. Apple does have its work cut out convincing us all that they have got this one right.


Yes, it is a violation of privacy. You’re quite right. If I knew (and I may never know) that someone at Apple saw 30 of my private photos, I would be unhappy about that. But would my life be significantly impacted by it? No. If I knew that the technology had significantly reduced the spread of CSAM, I would accept that personal cost. (See also my previous comment to another commenter on the same topic.)


Good question. I think if innocent users were regularly getting flagged, that would be a serious failure of the system, and one Apple should be forced to address or else shut it down. I guess we can agree on that?

What would be acceptable to me personally... A single digit figure per year maybe? Ideally though, I think it should be less than one per year on average.

I do think Apple should be accountable and transparent here, and report on false positives if and when they happen.


Well, either way, a human has to get involved at some point, even if you have a layer of AI before that. I don’t think it’s unreasonable for Apple to want to take responsibility for the initial review, but you make a fair argument for them handing it over too.


People need to understand that no hash function is perfect. Some hash collisions are unavoidable. Did you see my earlier comment about that? So you do need to set a threshold for the number of matches. But yes, I agree that 30 seems too high, as I also said earlier. This tells me Apple are erring on the side of privacy, whereas you see a premature admission of failure. A case where they’re damned if they do and damned if they don’t?
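
To illustrate why the threshold, rather than the hash alone, does the heavy lifting, here is a rough sketch. The per-image false-match rate below is an invented number (Apple has only claimed an account-level error of roughly one in a trillion, not a per-image figure); the arithmetic is just the binomial tail.

```swift
import Foundation

// P(X >= threshold) for X ~ Binomial(n, p), computed from the exact log-PMF.
// n = photos in a library, p = per-image false-match probability (assumed).
func probabilityOfFalseFlag(photos n: Int, perImageRate p: Double,
                            threshold: Int) -> Double {
    var tail = 0.0
    for k in threshold...n {
        let logPMF = lgamma(Double(n) + 1) - lgamma(Double(k) + 1) -
                     lgamma(Double(n - k) + 1) +
                     Double(k) * log(p) + Double(n - k) * log1p(-p)
        let term = exp(logPMF)
        tail += term
        if term < tail * 1e-18 { break }  // remaining terms are negligible
    }
    return tail
}

// Hypothetical inputs: 100,000 photos, 1-in-a-million per-image collisions.
print(probabilityOfFalseFlag(photos: 100_000, perImageRate: 1e-6, threshold: 1))
// ~0.095: flag on a single match and roughly 1 in 10 such users gets flagged
print(probabilityOfFalseFlag(photos: 100_000, perImageRate: 1e-6, threshold: 30))
// vanishingly small: raising the threshold, not perfecting the hash,
// is what makes an account-level guarantee possible
```

Different assumed rates move the numbers, but the shape of the trade-off stays the same.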


You’re right. Some emotion on both sides is perfectly understandable. It’s when emotion overtakes rational debate that there is a problem. People set fire to 5G towers because they are angry, but it’s an anger based on fear based on misinformation. I could use many other examples which would take us into the realm of politics, but the moderators will start deleting posts if we go there. (Believe me, I know! 🙂)

Yeah, likewise, man. I think the discussion improves with the back and forth, as you can get a better picture of the other side's arguments after several replies.

About the technical side of things, I think there are two sides to the coin. I agree we obviously can't ignore the technical aspect of the issue, as we're talking about technology. I don't focus on the technical aspect for two reasons: 1. I don't have knowledge in that field so I can't get into it, and 2. any tech is made by people, and this is why some of us worry; we know that Apple can just create a different system for other purposes. I don't object that strongly to this particular scanning system for this particular purpose. I mean, I object to it because I don't like it and I think we can poke holes in it and discuss the many questions that need answering, but that in and of itself is not the core of the problem. For the most part I object to the path we're taking, because I am convinced they won't leave it at that. In my opinion, there is absolutely no way they won't come up with a different scanning system for a different purpose later, and this is an introduction to a new paradigm where iPhone isn't what it was.

They are easing us into this, and it's why people say "slippery slope"; it's why Snowden said, "If they can scan for this today, they can scan for something else tomorrow." I believe the core message is the same. Be it a tweak in the CSAM technology or something else, the conviction is that this will be developed further. Apple even said so themselves; it's in the announcement that this technology will be improved and expanded over time. Imagine how concerning those words can be if you are wary of this tech. This is a discussion to be had. What are they going to expand the technology into? How else can they do it? Does this mean a better system later (which doesn't bode well for their confidence in the current one), or do they plan to include other aspects of the iPhone other than just photos?

I believe the only real chance we have, if there is any at all given that it's Apple we're talking about, of resisting this (at least for some time still) is if we do it now, when they first begin to roll it out. If it goes through now, resisting it becomes hopeless, because next time there will be even less opposition, until it becomes business as usual. When you ask the Chinese what they think of WeChat, they'll say it's perfectly fine if you don't do something illegal. This is why technology, while obviously a big part of it, for me isn't at the core of the discussion, but rather Apple's position and willingness to compromise on this. So, your views on it from a technical standpoint may be completely correct; it may actually be praiseworthy what Apple have come up with in a technical sense, I don't know. Even if you could convince me that the system has as few shortcomings as can reasonably be expected for a piece of tech, it wouldn't calm me on the other aspect of the issue, because I absolutely do not see any way that governments and agencies will resist tampering with the iPhone now.

About bugs, you're right that they're always there, they are an unavoidable part of tech, but you can still have levels to that. One manufacturer will have some bugs that do not impact any significant function while others will have crippling problems. Apple is somewhere in between and I have to say their software performance lately doesn't inspire me with confidence. Allow me to unpack something here which doesn't really concern my main gripe described above, but rather some details of how this may actually be implemented. As @smoking monkey says below, certain settings can be turned on and off with a software update. I've had this (as far as I know) happen with Contacts in iCloud (switched off without me doing anything) and automatic iOS updates (switched on instead of being off as always). You can see where I'm going with this.

What happens when someone realizes their iCloud settings have Photos turned on when they didn't want them turned on? Remember that Apple attempted to calm people by saying you can opt out if you turn off iCloud Photos, as is regularly emphasized in this thread. Reasonably assuming that Apple can't eliminate all bugs regarding iCloud activation, one way I see around this is to make iCloud Photos activation require authentication, biometry or passcode. That way, the system won't allow a bug to upload Photos to iCloud and you'd need to opt in by confirming your identity. But then, what does Apple write in the prompt? Let's assume that Apple does the logical thing and writes the actual reason why the prompt is there to begin with. What paedo would then go ahead with the upload? Any person who possesses CSAM will either cancel the activation, or they are too stupid to use any tech to begin with. Imagine Apple saying "You can opt out of this by choosing not to activate iCloud Photos, unless a software bug changes your setting for you." That's a disaster. On the other hand, include a prompt and you're basically telling paedos who can read that they shouldn't tap on Activate.
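
For what it's worth, the "require authentication to activate" idea is mechanically easy with Apple's existing LocalAuthentication framework. A minimal sketch follows; the enableICloudPhotos() call is invented (Apple exposes no such public API), and the prompt-wording dilemma described above is untouched by it:

```swift
import LocalAuthentication

// Hypothetical: gate iCloud Photos activation behind Face ID / Touch ID /
// passcode so a software bug cannot silently flip the setting on.
func requestICloudPhotosActivation() {
    let context = LAContext()
    var error: NSError?

    // .deviceOwnerAuthentication falls back to the device passcode when
    // biometry is unavailable or fails.
    guard context.canEvaluatePolicy(.deviceOwnerAuthentication, error: &error) else {
        print("Cannot authenticate: \(error?.localizedDescription ?? "unknown")")
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthentication,
                           localizedReason: "Confirm turning on iCloud Photos") { success, _ in
        if success {
            enableICloudPhotos()
        }
    }
}

// Invented placeholder; there is no public API for this.
func enableICloudPhotos() {
    print("iCloud Photos enabled")
}
```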

So this brings me to the next question, and it ties in with your argument about reducing CSAM - who are we catching with this tech? It seems to me we are not catching creators, only users of the content, and not just any users - only the stupidest ones. This is why I don't see a reduction in CSAM, because the more we talk about it, the more traction it gets and the smaller the chance of actually achieving something. To address your point, I agree, I would consent to a reduction in my privacy if I thought the following: 1. there was reasonable chance of reducing CSAM even slightly, and 2. that I knew we aren't going anywhere else with this and that my consent will not open the door for other things. Like I said, you may be right that this is tech-wise an impressive feat by Apple, but everything else concerning this roll-out is most definitely a dumpster fire and it's why I have very low confidence on both points.

About false positives, I'm gonna do an exercise in logic. I may be wrong, but if Apple says you need 30 matches to be flagged... doesn't this mean that you need 30 matches to get an average of 1.00 or more CSAM images? I mean, usually you have these margins for a reason, like the ±4% statistical error in polling, or the ±10 km/h error in measuring driving speed. This is a result of empirical data. So when Apple says 30 matches are needed, this is chilling to me. As far as I can tell, it says one of two things: 1. either Apple is fine with fewer than 30 CSAM images (you have CP, but you did not cross our randomly set threshold where we would think you have A LOT of CP, so we'll let it slide), or 2. they determined through testing that, on average, 30 matches mean someone will have one or more CSAM images. Option no. 1 cannot possibly be true, so while I'm risking a false dichotomy, this leaves us with no. 2. And following through with the logic, doesn't this mean the hashing system cannot be trusted, because this seems like an unreasonably high error rate? Because ultimately that's what it is, 29 images is the error rate. What else could it be (really asking)?
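
Since the question is genuinely asked, one more reading the dichotomy leaves out: the threshold may be set by scale rather than by a measured per-user error rate. With hundreds of millions of accounts, even a tiny per-image collision rate makes lone innocent matches a statistical certainty somewhere. A back-of-the-envelope sketch, with invented numbers since Apple has published none:

```swift
// Invented, order-of-magnitude numbers; Apple has published no such figures.
let users = 1_000_000_000.0      // active iCloud accounts, roughly
let photosPerUser = 10_000.0     // assumed average library size
let perImageRate = 1e-6          // hypothetical per-image false-match probability

// Expected innocent matches across the whole user base:
let expectedInnocentMatches = users * photosPerUser * perImageRate
print(expectedInnocentMatches)   // 10,000,000.0: lone matches prove nothing at scale
```

On that reading, 30 isn't "29 images slip through" but a dial turned until the account-level false-flag probability, rather than the per-image one, becomes negligible (see the binomial sketch above).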

This is already long, most of this isn't even in reply to something you said, just putting thoughts into text because there's a lot to unpack on this subject. Two more things.

Regarding the human reviewer, I read in one of the links posted here (or in the other thread), about the guy who tested the hashing system and found vulnerabilities, that the point of the human reviewer on Apple's side is that it eliminates the need for a warrant on the government's side. Basically, if Apple as a private entity conducts a search while providing a service to the user, it wouldn't be enough that the hashing system turns up a match; a person would have to review the match. If they do it that way (human reviewer on the private entity's side), the government would not need a warrant, as they'd just be verifying the findings of the private entity who made the report (as they should). Otherwise, if Apple just forwards the info that a match has come up, the government would still need a warrant. I have no idea if this is true; they didn't cite a source, and this is a legal question whereas the rest of the posts concerned the technical aspects, so I don't know how reliable it is. I live in the EU so I don't know US law, but if this is actually accurate, it would explain some things.

Lastly, when I read "5G," I thought to myself "Could this be my 5G? Am I going to the same loony school as the 5G nuts, just taking a different class?" I think and hope not, because at least I didn't focus on the technical side of things...

They've done that before! I reckon a couple of times Apple have turned on iCloud Photos on me after a major update. I hate that Apple resets settings after a major update. I definitely know that Photo Stream was turned on for me recently after an update when I 100 percent didn't have it on. I will say, though, that it seems much better than it was several years ago.

In saying that, I'm not disagreeing with you. I reckon this is a really complicated issue that is way above my second-rate PI pay grade. Honestly though, I'm more concerned with the fact I'm two months behind on the rent on this lousy, third floor office in a building at the ass end of town. So I pour myself a drink; it's 10:53am, almost midday. The first drink of the day always tastes the best. It's like your mind and body forget you're an alcoholic and you can actually taste the bourbon instead of craving it. I flip cards in a game of solitaire. Neat lines of red and black. Just like my life used to be, all my ducks in a row. But now my life is messy, so I pull the drawer and linger on the only clean thing in this office, my iPhone. I contemplate the easy way out, slam the drawer shut and get an Android, but I got unfinished business. So I grab the iPhone from the drawer, turn it on and marry myself to another day in purgatory.

I think it's a bug; the settings sometimes change, but rarely.

About the 11 am drink, how did it progress over the years? 5 pm, then 3 pm, then you stop minding the clock or what?
 
I think it's a bug; the settings sometimes change, but rarely.
-------
About the 11 am drink, how did it progress over the years? 5 pm, then 3 pm, then you stop minding the clock or what?
Yeah, I'm sure it is a bug, but it's an unpleasant one.
-------
I stopped drinking when Jess walked back into my life. I vowed to be a better man, desperate to keep a hold of her this go round. But then things got twisted up worse than a licorice stick. So now I sit here in a park looking out at a manmade lake with a fountain, waiting to see it spout. Ain't nothing special, but it's as good a way to pass the day as any. In one hand I got my iPhone 13 Pro Max with iCloud storage turned off, and in the other, a bottle of Kruto vodka I got scammed into overpaying for in Little Kiev. What time is it, you ask? I haven't got Raise to Wake turned on for my Apple Watch, so all I know is it's time for a drink.
 
Bypass as in turn off Backup Photo to iCloud
Okay, gotcha. What you meant was, people can opt out of iCloud Photos and therefore the CSAM hash matching.

That still doesn’t answer the question why now and why this solution.

This technically clever solution doesn’t change that. What’s the ROI?
It’s the same loaded question again. Again I say, why not now? If not now, when would be a good time?

You seem to be assuming that a corporation must only ever act in its own selfish interests and the interests of its shareholders. I have challenged that very narrow view of capitalism on MR before. Apple is run by people, and people do sometimes have social consciences!

We have a lot of guesswork, some of it likely very good, and a lot of questions, with Apple remaining silent.
Apple has published a technical summary and responded to criticism with a FAQ document as well as interview comments. Maybe they haven’t answered your presumptuous ‘why now’ question, but they’ve hardly been ’silent’.

Yes. Most Cloud providers do this today in an effort, I assume, to keep this data off their platforms. When we elect to use a Cloud service, we are opting into it.
Right. And as we just clarified above, this opt-in is still 100% there with the proposed technology.

At this stage, from a legal perspective, we know that these providers have no responsibility to scan for CSAM. That they do so is likely due to pressure from several angles.
I didn’t know that. Maybe they just want to make the world a better place? How naive of me to even suggest that possibility!
 