I have not seen it in action, but based on the protests from the trans community at least, I assumed it must be scanning the camera roll, not just incoming text messages. There were several things Apple was rolling out, so we could be talking about two different things.
The iMessage feature blocking nudity is completely separate from the CSAM stuff. Why would the trans community be worried about CSAM detection unless they’re collecting known CSAM content (in which case they deserve what they get)?

In short, no, your camera roll isn’t scanned for nudity, and it won’t notify your parents.
 
Wow dude. A month of articles about this, millions of comments, and you still don't understand that people's beef with this has nothing to do with child safety. People fought for our liberties, and others around the world are losing their lives trying to gain some. It's shocking to me that there are people such as yourself who are so willing to give them up under the guise that this is about child safety.

This is essentially like having someone in your home who pops up every time you bring in some new groceries to tell you that everything is okay, carry on. Sorry, I don't need or want that. You want to video or check everything I do while I am outside my home (i.e. the cloud), fine, I am okay with that. But I am not going to pay money for a home (i.e. my phone) that comes with someone patting down my pockets every time I come in.
Comparing child safety to buying groceries is your first mistake. I hope my children never come into contact with you or any others of your ilk.
 
Here’s what I don’t understand.

Scan in iCloud: completely fine, they can scan anything they want. Even if it’s political memes or anti-vaccine stuff. They can use the exact same software and everything.

Scan on my device: Bad. All of the results stay on your device and your device has no idea whether there’s a match until the voucher is uploaded to iCloud and decoded. Then to further enhance this, the files are checked again using another hashing process in the cloud to make sure it’s a real match and not a false positive. Then if all of that happens with 30 photos, it’s finally seen by humans.

Both of these methods NEED iCloud to function. The code can’t physically do ANYTHING without the second step in the cloud. So in reality, this system can’t actually be any more abused than scanning solely in the cloud.
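To put that two-step point in concrete terms, here's a rough sketch of the flow as I understand it. It's purely illustrative: the names are made up, plain SHA-256 stands in for NeuralHash, and a simple server-side count stands in for the private set intersection and threshold secret sharing Apple actually describes.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in: the real voucher is built with private set
// intersection, so neither the device nor (below threshold) the server
// can read it. Here it just wraps a digest.
struct SafetyVoucher {
    let payload: Data
}

// Step 1 (on device): every photo headed for iCloud gets a voucher, match
// or not, so nothing on the phone indicates whether anything matched.
func makeVoucher(for photo: Data) -> SafetyVoucher {
    let digest = SHA256.hash(data: photo)          // placeholder for a perceptual hash
    return SafetyVoucher(payload: Data(digest))
}

// Step 2 (in iCloud): only after upload, and only once matches cross the
// threshold (30 in Apple's description), does anything become readable;
// matches are then re-checked with a second hash before human review.
func accountCrossesThreshold(vouchers: [SafetyVoucher],
                             knownHashes: Set<Data>,
                             threshold: Int = 30) -> Bool {
    let matchCount = vouchers.filter { knownHashes.contains($0.payload) }.count
    return matchCount >= threshold
}
```

The point the sketch is trying to make is that step 1, on its own, produces nothing the device (or anyone reading it) can act on; everything actionable happens in step 2, on the server.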
Jayson, this article was written specifically for people like you that have been watching all month and still don’t understand…. You’re not alone though, so don’t take offense, just read it: https://www.theatlantic.com/ideas/archive/2021/09/spyware-your-iphone-step-too-far-privacy/619987/
 
Jayson, this article was written specifically for people like you that have been watching all month and still don’t understand…. You’re not alone though, so don’t take offense, just read it: https://www.theatlantic.com/ideas/archive/2021/09/spyware-your-iphone-step-too-far-privacy/619987/
iCloud is still needed. No iCloud, no hashing, no cops. If Apple wants to scan the content that’s being uploaded to their servers, then they can. I won’t ditch my iPhone over this non-issue.
 
I’m just saying, both methods require iCloud to work. So the complaint that the first part technically happens on your device makes no difference. Your phone can’t do anything with that data. It HAS to be uploaded to iCloud for it to work. Therefore it’s the same thing.
So do it all on the cloud and leave me and my device alone.
 
The iMessage feature blocking nudity is completely separate from the CSAM stuff. Why would the trans community be worried about CSAM detection unless they’re collecting known CSAM content (in which case they deserve what they get)?

In short, no, your camera roll isn’t scanned for nudity, and it won’t notify your parents.
I’ll defer; I was not talking about CSAM, and the parent-notification thing was not something that concerned me, so most of my knowledge was based on articles about endangering children through these parental notifications. That was something they were doing that I was not concerned about, but I still thought it was overreach, because it was still on-device scanning using AI to look for objectionable content.
 
  • Like
Reactions: dk001
I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!

By flagging only old images, Apple actually incentivizes the image producers to produce new, non-flagged images on a daily basis. So children will be abused MORE to get daily NEW images, just for the criminals to avoid the flagging system.

Meaning Apple’s move will actually *increase* the daily abuse of children.
 
  • Like
Reactions: ian87w
I personally wouldn’t change the system. I think it’s fine the way it is, but Apple seems to be re-thinking it and perhaps they’ll make it even more secure than it already is so that it pleases the people who think their privacy has been invaded.

I fear it won’t help, though, because people already don’t trust anything Apple does or says anymore, since they wanted to install “spyware” on their iPhones.
Ok, so then what's important for you about this discussion? You've posted so much, dozens if not hundreds of posts.

Clearly there is some part of all this that's important to you. What is it?
 
FROM APPLE:
the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
That's the thing, I don't want ANYTHING on MY phone remotely associated with child porn. Whether it's hashed, encrypted or obfuscated makes little difference to me. Keep this crap out of my phone.
 
That it improves privacy for the end-user?
Stopping the distribution of CSAM was never about improvement of privacy.
That I would like to see. I would like to see how scanner-on-my-device-no-matter-how-implemented improves privacy over no-scanner-at-all. Or how scanner-on-my-device-no-matter-how-implemented improves privacy over scanner-on-cloud. Stipulated: There is no E2EE in either case.


Are you suggesting those who have or plan to exit the Apple ecosystem are doing so based solely on emotionalism?
It could be, I don’t know.
 
I can see your point; I just wonder if you’re also thinking of the redneck down the road who will beat and kill his children because Apple sent him a notification that they were sexting….. that will happen, and the gay or transgender outing thing will happen too and will probably result in suicides. People tend to judge such tools based on their own situations and mental stability, but we live in a messed-up world… I am not finding Apple’s solution very appealing; I'm just questioning if it’s really Apple’s job to bake this into iOS…. There’s an app for that if you think it’s important to spy on your kids.

Using the foster care system as a reference, for every bad story there is a good one and another of hope. It is disheartening what parents will put kids through based on some little thing.
 
That's a big no, because, as I said after listening to a podcast with a tech expert, this is a great way for bad parents to target their queer children. But like everyone else, I'm against child abuse material too, of course, and grooming.
I don’t know that I buy that. When a kid (under 13) is texting with another kid that they have a crush on, it isn’t like the phone is looking for a gay conversation. It is censoring images that a kid maybe shouldn’t be seeing.

I still remember my best friend’s brother showing us his girlfriend’s boobs that she texted him. They were both 12; the little brother didn’t realize that he may have been disseminating child porn.

If I was that girl’s parents, I’d want to know if my kid was sending or receiving tit/dick pics.

And I want everyone to remember, the parent isn’t notified UNLESS the picture is viewed/sent by the child.
 
That said, some do fear that once Apple has opened the door to this kind of on-device scanning and reporting, it could very well choose to build algorithms that would scan for a lot more, but again that's getting into extreme "What if?" scenarios. If you're willing to go down that road, you should have stopped using an iPhone years ago, as Apple "could" do just about anything it wants to behind your back.
The fear is that Apple could be strong-armed into scanning for other stuff. They have no control over the hashes they receive from their sources, and have very little in the way of verifying them. The only verification they get is that once an account is flagged, they can review the matches. Therefore our only protection from government abuse of the system is the people at Apple doing manual verification. Better hope they care about their work like Craig does, perhaps more so.

Apple’s method of detecting material is precisely what makes this dangerous. Doing a server-side-only check on shared files is much safer for us and for Apple. Their AI could scan all shared photos (not the library just being stored, but photos that are explicitly “Shared”) and look to see if any photos contain both children and nudity. If a photo does, hash it and compare it to the database, and if it matches, flag it for human review. This adds a layer of protection in that Apple’s image recognition AI has to detect possible CSAM, then positive hits have to match the database, and then a human at Apple looks at the report. If Apple’s AI doesn’t detect the presence of both nudity and children in the same photo, it doesn’t go any further than that (which currently already occurs). This extra step should help prevent government abuse (internet memes and political crap never have a chance of being checked against the database). This also protects us in that it all happens on-server, and therefore removes the potential for the system to work without iCloud. (A rough sketch of this pipeline follows the list below.)

Not an ideal setup (rather do without any of this and keep photos as-is), but if we must, it has to be:

1. All on-server
2. Rely also on Apple’s image recognition AI (what already runs on our phones to enable image search, but run on-server instead and ONLY on shared photos, not the main library).
3. Be bundled with E2E encryption on ALL stored iCloud data (sharing photos with other people necessarily breaks E2EE on those photos only).
4. Mac/PC cable/Wi-Fi sync and optional iCloud features remain. We must retain the option of using iTunes or Photos to sync pictures/videos instead of iCloud, and we must retain the ability to leave iCloud turned off.
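Here's a rough sketch of that on-server pipeline. Everything in it is hypothetical (the classifier, the hash, and the database are placeholders, not Apple's actual API); the point is just the ordering: classifier gate first, database match second, human review last, with all of it running in iCloud rather than on the phone.

```swift
import Foundation
import CryptoKit

// Hypothetical server-side type.
struct SharedPhoto {
    let id: String
    let pixels: Data
}

// Gate 1: an image-recognition model flags photos that appear to contain
// both a child and nudity. Most photos stop here and are never hashed.
func classifierFlagsChildAndNudity(_ photo: SharedPhoto) -> Bool {
    // Placeholder for a real model.
    return false
}

// Gate 2: only flagged photos are hashed and compared against the known
// database (SHA-256 here is a stand-in for a perceptual hash).
func matchesKnownDatabase(_ photo: SharedPhoto, database: Set<Data>) -> Bool {
    let digest = Data(SHA256.hash(data: photo.pixels))
    return database.contains(digest)
}

// Gate 3: anything that survives both gates is queued for human review.
func photosForHumanReview(shared photos: [SharedPhoto],
                          database: Set<Data>) -> [SharedPhoto] {
    return photos
        .filter(classifierFlagsChildAndNudity)
        .filter { matchesKnownDatabase($0, database: database) }
}
```

Because none of this runs on the device, leaving iCloud turned off (point 4 above) means none of it runs at all.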
 
<devil's advocacy>They'd get their search warrant, enter the property in question, and seize the evidence. Today that evidence is often protected by strong encryption. They're just trying to level the playing field. Same reason they now wear body armor and oft times have true assault rifles close to hand: The criminals have upped their firepower. Law enforcement must needs answer evolving threats.</devil's advocacy>

I see no reason for LEO not to get new/improved tools and methods; HOWEVER, the degradation of my rights isn’t a tool they should have access to.
 
  • Like
Reactions: Pummers
Ok, so then what's important for you about this discussion? You've posted so much, dozens if not hundreds of posts.

Clearly there is some part of all this that's important to you. What is it?
It’s important to me because I don’t like all the misinformation that is going around about this.
 
  • Haha
Reactions: Pummers
That has been an ongoing question: “What is driving this, and why this solution?”
You keep bringing this up, but the truth is only Apple has the answer to that question. Asking us isn’t going to help you find it.
 
I don’t know that I buy that. When a kid (under 13) is texting with another kid that they have a crush on, it isn’t like the phone is looking for a gay conversation. It is censoring images that a kid maybe shouldn’t be seeing.

I still remember my best friend’s brother showing us his girlfriend’s boobs that she texted him. They were both 12; the little brother didn’t realize that he may have been disseminating child porn.

If I was that girl’s parents, I’d want to know if my kid was sending or receiving tit/dick pics.

And I want everyone to remember, the parent isn’t notified UNLESS the picture is viewed/sent by the child.
Or you could just tell them not to do that kind of thing; if they won’t listen to that advice, then no amount of punishment or shaming is going to help… maybe Apple will send a therapist to your house to assist. Point being, if your kids are doing this, you have already lost.
 
Correct. I may exit no further than I have already, but I won't be going back to the degree of immersion in the Apple ecosystem I was at before all this. As somebody else put it: "The toothpaste of distrust is out of the tube. You can't put it back."

I picture a tube of toothpaste in hand, toothpaste all over the bathroom sink, looking at it, and going how the hell do I get it back in there?

What a visual!
 
And the way the US government can’t keep their crap together is not astonishing to you? The Afghanistan debacle. The booster vaccine dose debacle. The border crisis debacle. The masks debacle. The flooding in New England debacle. Are those not astonishing and mind-blowing? We, in the US, have bigger fish to fry right now than this Apple PR disaster.

Sir, this is a Wendy's
 
  • Haha
Reactions: dk001