It’s. Not. About. CSAM.

It’s. About. The. Tool. Being. On. Devices.

Oh my goodness
I think some people are simply trying to not see it from other perspectives at this point.

What tool? Apple doesn't control that tool. It's not tethered to Apple. It doesn't phone home to Cupertino. You're seeing ghosts. Apple will just update that tool's database every now and again when they ship an iOS update that you actively decide to install. Apple has no other relation to that tool until you upload your pics to their servers.
 
Is Tim Cook about to retire, and like a disgruntled employee walking away from a dumpster fire they started, leaving this mess for someone else to inherit?

This isn’t the same Apple I’ve bought from for almost three decades.
As said above, they are up to something, or their hand is being forced. Caring for children? Then why do album covers with near nudity get prominence on the Music app browse page?
Why do people keep reading headlines about caring for children and reinterpreting them? This is about a database of pre-existing underage sexual content. It has nothing to do with what your kids might see in daily life on an Apple device or on the internet itself.
 
No, it's not reducing the system to something worse than what we have now. The only exception would be if you don't use iCloud Photos or backups at all, which I'd wager most customers do use and will continue to use in the near term. Realistically, it can only mean better privacy for those customers.

I don't know. Again, I can see your perspective on this. In my mind, however, my personal device and the cloud are two completely separate entities with separate expectations. They shouldn't overlap in practice or principle on this. My own device should have a higher standard of privacy than my stuff on Apple's servers.

So when you say, "worse than what we have now", I think, "It's not bad now. I have zero problem with how it's set up now. On iCloud, I expect to be scanned, I'm OK with being scanned, and I don't expect full privacy. But on my iPhone, my expectations are the opposite of that: I expect the maximum privacy possible - and I feel like I have that now. Throw this 'path' on my phone, and that dwindles a bit, and I don't like that trade-off just to get higher privacy on the cloud, where I don't expect it anyway."
 
When one really thinks about the actual goal, like another poster here said, the efforts should be aimed at those making those photos in the first place, not at applying blanket scanning to literally everyone in the world. Scanning for some photos after the fact doesn't undo the abuse that has already happened, and will only encourage them to produce newer photos, or even go underground.
I am vehemently opposed to the upcoming system. That being said, I can see the importance of going after the CSAM consumers, too. CSA photos may circulate long after the producers have been prosecuted, and thereby they still contribute to maintaining the market. Also, catching a consumer with a known picture can uncover hitherto unknown CSA photos, and provide leads to the producers. On a more cynical note, I suspect catching consumers this way is attractive from a LEO perspective, as it is relatively low effort (the photo is both the crime and the evidence) with high yield - everyone loves it when the police catch a pedophile.

But Apple's method is reprehensible. The ends do not justify these means.
 
This mechanism is implemented in the image pipeline of iOS, which means any image taken or downloaded/processed is potentially liable.

It's supposedly limited to CSAM hashes for now, but Apple's behaviour in China, Belarus and Saudi Arabia - where its privacy measures aren't enforced - shows clearly that any state actor with legal authority can impose its own baseline for censorship there on a per-device basis.

It is absolutely opaque to the user.

We as a family feel kind of knocked out at the moment, as we just switched back to Apple and invested significantly because of the better long-term support and focus on privacy compared to the alternatives.
 
That's a bit abstract. You're fighting simply to preserve your "expectations", "principles", and your "feelings", but you're not really stating how that preservation is going to realistically benefit you in the near future. Yes, I understand the benefit of not having "a backdoor" via the CSAM system, but most of us are going to continue using iPhones as if the front door remains wide open.

To continue this (now getting ridiculous) analogy, what Apple is possibly doing is closing that wide open front door (assuming my best case scenario) for everyone and (assuming your worst case scenario) installing a small pet door in the back. Sure some people can still get in through the back, but it's not like they can easily take everything you have in the house.
 
Sorry for the long post, but I thought this was an interesting, albeit anecdotal, perspective on all of this.

I just finished a conversation with a friend about this. She's an Apple fan, but doesn't follow tech news and had no idea about any of these recent announcements. She's been in the child protection "realm" for almost 20 years, in various capacities from law enforcement to CPS to the legal side of things, with numerous situations overlapping in the CP/CSAM world as well. I would definitely consider her a "boots on the ground" type of role in this world, literally on the front line multiple times.

Her take? She kind of laughed at all of the drama around this. I'm giving a shortened version, but the conversation essentially went like this.

When I explained to her what Apple is doing (I read her Apple's take, not the naysayers), her response was, "Ooo, that does feel a bit more creepy than the cloud scanning. But it doesn't really matter."

How come?

Because Apple and the other companies that scan for this stuff - that does nothing, NOTHING, to slow down this industry.

What??

Ok, not nothing, but very little.... Listen, is it worth it to scan this stuff? Yes. But all you're doing is catching the pervs on their couches at home downloading this stuff. Sure, you might get a small ring of these creeps from time to time. And I'm not saying that's not worth it - because it is. Saving even one kid from that horror is worth it. But it's not slowing down trafficking. It's barely slowing the spread of those images. It's not getting to the people that really matter in that world. Like I said, it's catching the pervs that sit on their couch and download this stuff. They're potential perps, too. But you're bringing a bucket to the sinking Titanic. The material that gets caught on these servers has already been redistributed so many times... this is more of a CYA move for all of these companies than anything else. It's a move so they can pat themselves on the back. But it's making little difference.

Ok, c'mon now, you're sounding pretty cynical...

Maybe so, but I've seen first hand how this world works. If Apple, or any of these companies, want to truly make a difference, they would take the millions of dollars they pour into this stuff and donate it to the many underfunded law enforcement agencies both at home and abroad. There are SO many cold cases and cases sitting idle and cases involving some real heavy hitters in that world that need funding and resources to push forward (because the other side has no shortage of funds). THAT would make a difference. That would actually help get us closer to cutting off some of the heads of this massive beast. It's worse than drugs. With drugs, you can eventually track what you find on the street straight back to a dealer, and then go on up from there. This problem is so massive at the base level, policing that level makes little to no difference on the over-arching problems and decision makers in that world.

Trying to bring her back to the topic at hand, I asked her about using her iPhone with this new system on it.

Like I said, it's a bit creepier doing it that way, but I'm not going to stop using my iPhone. It does what I need it to do, and I don't have anything to worry about, so I'm not going to take the time to switch. I have more important things to worry about.

So, an interesting take from a non-techie who has some experience in that world... 🤷‍♂️
 
Whether you agree or not with the principle, the most likely reason here is to move the computation cost off their servers onto your devices - less cost for Apple to run their datacenters. At their scale, it's probably a good chunk of change saved.
 

I'll have to think through this as you bring up some good points here.

Agree on the analogy getting out of hand... but I'll continue it anyway.

You call it the front door, I call it the park across the street. It's not mine. I don't expect to go to the park and keep people from talking to me or seeing me or anything along those lines. I'll use the park, but I know it's not as private as my house. But if the city decides to make the park private for me, yet requires the installation of this pet door, that's going to be a bit unnerving. Even though it's highly unlikely, and even though that pet door severely limits what they can or cannot do, they can use it without warning, and it's there whether I want it to be or not.

I don't need or want the park to be private. But I do want my house to be as private as possible. Whether they can or do take anything or not, I simply don't want the pet door there.

Again... analogies. But that's how I think of it within these fallible analogies.
 
Thanks for sharing. And yes, the majority of people wouldn't even be aware of or realize the potential implications. And yes, this is basically a bucket against the actual Titanic, if we are looking specifically at the original intention of stopping child abuse. Thus it begs the question: why implement such a blanket scanning system, baked into the OS, if the known result will be so minimal?
 
So now people, besides seeing ghosts and backdoors that are not there since they can't phone home to Apple, are also role-playing conversations with imaginary friends about this…

Also, what a dumb take, "this is just a bucket in the sea"… First, they are not in a position to know that (and the numbers say otherwise); second, this system has a very specific mission: keeping that crap off Apple's servers. Nothing more, nothing less. Not eradicating child pornography from the face of the Earth with a single tool. Imagine the irony of lamenting, on the one hand, "Apple thinks they're the police now?? They're overreaching!" and on the other, "What Apple is doing is not enough anyway, it's a bucket in the sea" - i.e. bashing Apple for LIMITING the scope of their action against CSAM. Imagine that.
 
This is pretty much the tech companies too and where Apple is headed.

 
If it’s not about the tech then people are getting triggered for something that was already routinely performed server-side before.

Alternatively, if people fail to understand that there's no practical difference between a server-side search and an on-device search that only becomes human-readable once the pics are uploaded to a server AND you cross a threshold of offences, then yes, he's right: people can't wrap their heads around how this works.
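For readers unsure what "on-device matching that only becomes human-readable past a threshold" means mechanically, here is a deliberately simplified Python sketch. Everything in it is a stand-in: SHA-256 replaces the perceptual hash (Apple's is called NeuralHash), a plain set replaces the blinded on-device database, and a boolean flag replaces the encrypted safety voucher. It illustrates the threshold logic only, not Apple's actual PSI protocol.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash; a real perceptual hash tolerates
    # re-encoding and resizing, whereas SHA-256 does not.
    return hashlib.sha256(data).hexdigest()

def vouchers_for_upload(images, known_hashes):
    # Tag each upload with whether its hash appears in the database.
    # In the real protocol this result is hidden inside an encrypted
    # "safety voucher" the server cannot read on its own; here it is a
    # plain flag for illustration only.
    return [{"image": img, "match": image_hash(img) in known_hashes}
            for img in images]

def server_can_review(vouchers, threshold=30):
    # Nothing becomes human-readable until the match count crosses the
    # threshold; below it, individual vouchers reveal nothing.
    return sum(v["match"] for v in vouchers) >= threshold
```

With the default threshold, a single match reveals nothing to the reviewing side; only an account accumulating many matches ever becomes visible.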
Oh yeah, a master key to open all the doors in your house. Guess it is good for you to have.
The tool is there, and it WILL be misused, regardless of its form, shape or otherwise.
Can't wait for headlines saying Apple is forced to implement voice, video and phone activity analysis on all iPhones in a future iOS release. :rolleyes:
You know what Apple could do? Remotely shut down your phone after detecting the use of banned apps or browsing of banned websites. Must be nice to have.
 
Finally Apple is turning the iPhone into a bug with iOS 15. Apple has made clear that this isn't your device; it is Apple's device, and Apple does what Apple wants to do. If it wants to scan your device for p0rn or secrets, it does. If the FBI or China tells Apple that it wants to piggyback on Apple's scanning service, Apple has no way to refuse.

Apple fooled people by telling them that it does not want to create a back door in iOS to unlock phones, because if there were one, the evil people would use it too. So no one needs a fingerprint or PIN anymore to search an iPhone - just use Apple™'s exclusive content scanning service.

With the roll-out of iOS 15, millions or billions of people will lose their privacy. So don't install it.

Waiting for my Linux phone - seems to be the only way to stay private and secure in 2021.
 
Apple will do the same thing that WhatsApp did. If you remember, WhatsApp pushed an update saying it would hand over phone numbers to its parent company Facebook, which had thousands upon thousands of people leaving WhatsApp. Apple will do the same: they will push an update saying Apple has the right to scan all your devices and iCloud, and to agree to the new terms and conditions you must click 'accept', because if you do not, you will not be able to use your device.
 

False.
The key is cryptographically split into an undisclosed number of pieces that can only be put back together if multiple matches occur.

That is the current implementation; to make it work differently, everything about it would have to be dismantled and rebuilt first. Basically fan fiction at this point.

Backdoor truthers spreading fake news out of their *ss as usual.
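The "key split into pieces" described above is threshold secret sharing. Apple's published design describes a threshold secret sharing scheme; the sketch below is generic Shamir secret sharing, not Apple's implementation. The key is the constant term of a random polynomial over a prime field: any `threshold` shares reconstruct it exactly, while fewer shares reveal essentially nothing.

```python
import random

P = 2**127 - 1  # prime modulus; all arithmetic is over the field GF(P)

def make_shares(secret: int, threshold: int, count: int):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret % P] + [random.randrange(P) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 yields the polynomial's constant term.
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

With a threshold of 3 and 5 shares issued, any 3 shares recover the key; 2 shares interpolate to a value that is uniformly random from the server's point of view, which is the property the post above is describing.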
 
Debunking fake news, misconceptions and the comforting warmth of the “let’s be enraged by default, shoot then ask” tendency on social networks is a thankless endeavour.

Won’t make you popular, being enraged and triggered is the easier path.

But somebody’s gotta do it.
 
For all the "not a real problem" people who fail to understand the practice because they only look at the implementation: it's like someone going through your house wearing glasses that only give them vision of the stuff they're supposed to look for. They won't see your (currently) legal private stuff, but they're still constantly sniffing around.
Please show some understanding for people feeling uncomfortable with this. Saying "go somewhere else then and don't complain" is a sad display of absence of empathy.
 