There must be a reason that Apple has been working on this plan to begin with. It's that its platform is being used as a distribution mechanism for CSAM.
It is weird that Apple is even interested in doing this, as they have no reason to get into this topic to begin with as a technology company. It seems like they were contacted by the government and agreed to something. Why else would they get involved in something they have no right to be involved with...
 
And remember that this affects everyone in the world, as the system is baked into iOS itself. Enterprises and high-level politicians should probably be concerned if they use iPhones.
 

The question is, what's the actual reason? And is CP the actual reason, or is it just a ruse to have such a system in place?

I don't have it in front of me, but if you look at the number of CSAM incidents reported in 2020, Facebook and Google were in the hundreds of thousands to millions, while Apple was in the hundreds (under 300, I believe).

Edit: https://www.missingkids.org/content/dam/missingkids/gethelp/2020-reports-by-esp.pdf
 
Very intriguing. Again, it raises the question of why Apple is so gung-ho about implementing a mass scanning system on all iPhones worldwide. Why go to such lengths when their own platform is not significantly linked to the issue itself?
 

Their platform isn’t significantly linked to the issue itself because they haven’t been scanning for it, not because the CSAM material isn’t there (iCloud Photos is one of the larger photo hosting platforms now). Because of their focus on privacy to date, they haven’t even been actively scanning server side for any of this stuff. They thought this method of scanning was a clever workaround where they could catch CSAM photos being uploaded to iCloud Photos without having to scan your entire iCloud Photo Library. That was the communicated reason behind all of this - a more private way of catching this stuff. What they didn’t expect was the massive, and appropriate, pushback to this hash scanning happening on device rather than server side.
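To make the mechanics concrete, here is a minimal sketch in Swift of the general hash-list idea. To be clear, this is not Apple's implementation: the real system uses NeuralHash (a perceptual hash) rather than SHA-256, blinds the database with private set intersection so the device can't read it, and only lets Apple see anything after a threshold of roughly 30 matches. Every name and value below is hypothetical.

import Foundation
import CryptoKit

// Hypothetical sketch only; all names and hash values are made up.
// A stand-in list of known hashes. In Apple's design this list is
// blinded so the device cannot read or enumerate it.
let knownHashes: Set<String> = [
    "0123456789abcdef0123456789abcdef" // hypothetical placeholder entry
]

// Hex-encode a SHA-256 digest of the photo bytes. SHA-256 is an
// exact-match hash used here for simplicity; a perceptual hash like
// NeuralHash maps visually similar images to the same value.
func hashHex(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Check a photo only at iCloud upload time; nothing leaves the device,
// and only a hash comparison happens here.
func matchesKnownList(_ photoData: Data) -> Bool {
    knownHashes.contains(hashHex(of: photoData))
}

The point of that design was that the server never learns anything about non-matching photos, which is the "more private" claim; the controversy is that the matching code runs on your device at all.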
 
Everybody will look for the hashed database now. The only way out for them, as I see it, is to move the scanning entirely to separate servers and keep end-user devices clean of anything, if not abandon the entire idea. The public might expect some formal guarantee/promise that they will "never" do something like this again. This used to be a specialist topic, but it is only now slowly reaching a more general audience. The iPhone event stories will all cover it.

I still don't understand how they could have been surprised by the opposing reaction. This might have cost them billions in brand value.
 
Yes, this is what I expect them to do. Obviously they feel strongly about CSAM and have put a lot of work into it, and I'll support that as long as the code remains on their servers. As far as the apology/guarantee thing goes, I'm not sure they will do that, although I need them to. I'm with you that it's hard to understand why they thought this was OK; perhaps one day inter-department emails will emerge, like they did in the Epic case, that will shine some light on it. Until then we will probably remain in the dark.
 
Don't know if this is a ruse, but it didn't happen suddenly; it has been in the planning and implementation stages for a while. As to the root cause of the trigger, we will likely never know, but I don't think Apple took it upon themselves to implement a system such as this.
 
Not trying to minimize/trivialize the existence of CSAM or the child abuse/exploitation problem in any way, but I’d be genuinely interested to know how many of these reports of “apparent child sexual abuse material” (what the linked report says the numbers are) turn out to be actual, verified CSAM. And of those, how many involve photographs or similar images (what Apple is focusing on). When I look at some of the entities reporting on this list, I wonder what constitutes CSAM in the report vs. just pictures.
 
If you have an iDevice, open up your Photos app and look at the search icon at the bottom right (at least on my iPhone). Upon tapping it, you'll see a prompt to search for <Photos, People, Places...>. Go ahead and search for something. On the same iDevice, from the home screen, swipe down and start typing something: results show up nearly instantly. The fact that you can search for these things is a result of your phone scanning your content and semantically tagging it. This is all built into iOS.
As does every other major operating system. Even your local library does this with books. It’s called Indexing.
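If it helps to see how mundane that is, here is a toy inverted index in Swift; the photo names and tags are made-up sample data, not anything pulled from iOS. The one-time scan builds the index, and every search afterward is just a dictionary lookup rather than a rescan:

// Toy inverted index: tag -> photos, the basic idea behind instant
// on-device search. Sample data is hypothetical.
let photoTags: [String: [String]] = [
    "IMG_0001": ["beach", "dog"],
    "IMG_0002": ["dog", "park"],
    "IMG_0003": ["beach", "sunset"],
]

// One-time pass (the "scan"): build the index once.
var index: [String: Set<String>] = [:]
for (photo, tags) in photoTags {
    for tag in tags {
        index[tag, default: []].insert(photo)
    }
}

// A search is now a lookup, not a rescan of every photo.
print(index["dog"]?.sorted() ?? []) // ["IMG_0001", "IMG_0002"]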
 
Congratulations. I'm on a similar path. I already phased out macOS and replaced it with Linux. It feels really great to have a lean system that I can control the way I want. I've already bought a midrange Android phone to get used to it; I've yet to make the switch. It has some downsides when you try to use it without coupling it to a Google account, but it is possible. Also, it is great to have the possibility to install applications without an app store. I've yet to decide what will replace my iPads, or whether I'll keep them for now.
I've been looking at a Microsoft Surface and replacing Windows with Pop!_OS.

Check out the videos by Eevnos Linux on YouTube. The guy has a video for installing Pop!_OS, and another about running it on the MS Surface.
 
You still don't get it…. Amazing.
Amazing that you don't get what he's saying, in reference to your article.
God, please don't let mine run into you or any others of your ilk. Puke.
Great comeback. Almost a repeat of what I posted.
No. I hope you don't have kids. If you require Apple to spy on your children in order for you to parent, then you have already failed. But this isn't about our parenting skills, so I'll leave it at that. There is also no point in discussing anything with you, because you are dead set on thinking that anyone who opposes this is for child porn, which is completely false and insulting. Having a discussion with someone who plugs their ears and goes la la la la is a waste of everyone's time. If you are truly interested in understanding the big picture and why people are opposed to this, maybe go back and reread the comment section with an open mind, instead of responding with your childish personal attacks (i.e., "what's wrong with you people", "hope you don't have kids", "I hope my children never come into contact with you"), which offer absolutely nothing to the discussion. Why not respond to our points and prove us wrong instead? But I'm sure you already knew that's how civil people debate, because you are the only great parent on this forum.
See, this is where you need to mature. You are arguing, not debating, with someone, but instead of using factual statements in posts, you make up your own. I quoted what a person said, "child safety crap", but you go on to guess that I need Apple to parent for me. I never said that. You can't just make stuff up to fit your argument.

What I'm interested in is people who have child porn on their devices being found, caught, and prosecuted. Again, you make assumptions about me when I'm telling you exactly my point. About halfway down, you use several of my quotes, and I'm OK with them. I stand by them. If you compare getting groceries to children, I don't want you to know my children. A lot of commenters here have stated they just want their privacy, and I'm good with that as well. But I'm also the person who, when I get pulled over at 10 p.m. and tell an officer I have a gun in my car (because I'm allowed to have said gun), and that officer asks me to step out of my car for the safety of both of us, willingly does so. Do I have to? No, but I do, because it's the mature thing to do. I also have LEOs in my family, and I appreciate them being afforded some extra steps for safety.

Please check your last line. As you continue to put words in my mouth, can you let me know when I said I'm the "only great parent on this forum", or did you just make stuff up again to fit your argument? Again, maturity. It will happen one day. Be better until that day, though.
 
Nobody is against saving kids; I'm all on board for more funding and more resources to help catch and convict child predators. But what is not needed is the degradation of everyone's security and privacy. Once you start trading away your individual rights, soon you'll not have any. That is what this protest/fight is about. We've all seen this show before: the "Will anyone think of the children!" outcry has always been code for invading your privacy, taking away your freedom, and censoring. Pick your poison pill; it'll happen. History is littered with it, and littered with people like you who willingly go along with it. For the powers that be, this is not about saving any children; it is about controlling you, served up on a plate of propaganda where, if you oppose it, they can call you all sorts of nasty things.

So, OK, fine, you have no problem with it, but for the rest of us who see the danger, we are being forced into it with no option to refuse.
He’s got no problem because it’s Apple, that’s literally the only reason…
 
Do you really know that for a fact, though? Why is it you can trust the current scanning processes, but not the ones Apple’s put in place for CSAM? Who’s to say Apple hasn’t been receiving metadata about your photos this whole time? I mean, people completely trusted Apple with Siri until it came out that people were actually listening in on your conversations, so…
The biggest factor is the intent behind features. Being able to search your library for cars is helpful; detecting and reporting CSAM is looking to get people in trouble. And unlike at Google, the way most Apple software is developed shows that the people designing it actually use it themselves.


Best I can tell about the whole Siri debacle, people didn't read the update cards after updating their phones. It spelled out clearly that opting in would allow recordings to be listened to by humans. It was easy enough to turn it off and go into settings after the fact to make sure it's off: Settings > Privacy > Analytics & Improvements > Improve Siri & Dictation. The intent is plain as day right below the switch.

Bottom line, we don’t know for sure as the software is closed-source and traffic is encrypted, but there are other indicators to look for.
 
Yeah, it would seem many who pretend not to even understand the debate are either Apple fanboys or so far down the cancel-culture road they have lost the ability to think at all. It's all related; you can almost predict their stance on other things by the way they have responded to this issue. Sad but true.
 