Can someone explain how this tech works? I find that face scanning and photo-recognition tech gets it wrong a fair amount of the time. Is this going to flag pictures that might be an issue and upload them for someone to look at? People take a lot of nude photos, so is it going to be scanning for nude photos it thinks are of young people and sending them to a person to look at? I don't get how it's supposed to work.
As I understand it, it's not interpreting the contents of the images in your photo library at all. It won't know the difference between a nude and a picture of a sandy valley. Instead, it boils each photo down to a small hash and compares it to a database of known illegal image hashes. If they nearly match, a "safety voucher" is generated. (What doublespeak.) If an image isn't in the database, it probably won't match, even if it happens to be an (unknown) illegal image.

According to another article on MacRumors, Craig Federighi says about 30 vouchers are needed before action is taken. However, the action is indeed your photos (nudes or sandy valleys) that have vouchers attached being decrypted and inspected by Apple.

So they might well get sent to Apple, if the threshold is met.
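To make the flow described above concrete, here is a deliberately simplified sketch. It is NOT Apple's actual system (which uses a perceptual "NeuralHash" plus private set intersection and threshold secret sharing); this toy version uses a plain SHA-256 digest and a made-up database, just to show that nothing about the image content is interpreted:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Reduce image data of arbitrary size to a fixed-size digest."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_library(images, known_bad_hashes, threshold=30):
    """Count matches against the known-hash database ("vouchers").

    A photo either matches a database entry or it doesn't; the scanner
    never looks at what the photo depicts. Only when the voucher count
    crosses the threshold would anything be flagged for human review.
    """
    vouchers = sum(1 for img in images if image_hash(img) in known_bad_hashes)
    return vouchers, vouchers >= threshold

# Example with a hypothetical database containing one known hash:
known_db = {image_hash(b"known-bad-image")}
count, flagged = scan_library([b"known-bad-image", b"sandy-valley.jpg"], known_db)
print(count, flagged)  # 1 False -- one match, but below the ~30-voucher threshold
```

The roughly 30-voucher threshold mentioned above is why a single accidental match would not, on its own, trigger a review.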
 
Spotlight search doesn't report your images to law enforcement.
Honestly, at this point I wouldn't be so sure the Spotlight search of your pictures doesn't index and send a hash in the first place. I have a feeling the rush to do this so quickly, before the launch of iOS 15, means it has probably already been done; it's just a matter of flipping the switch, and off the report goes.
 
Yep, because buying a £50 hard drive and keeping them on there instead is IMPOSSIBLE, I tell you, impossible. You are presuming that this will make paedophilic scum stop once they can't share pics. No, they are more likely to go and do it in person. Be careful what you agree to and wish for.
Sure, anybody can buy an external hard drive and put their photos on there, but unless you're actually shipping it via FedEx (which adds tracking that becomes evidence if a case is brought forward and an investigation is done), you'd have to upload the files somewhere. The more common file-sharing services like OneDrive or Google Drive are still going to scan that content, so the external hard drive is useless at that point. The only option would be to keep it all stored locally, but even then, these people aren't just hoarding for themselves; they're still trying to be part of that messed-up community. I saw somebody arrested for exactly that during a study I took part in, and I was glad, because most of us in that study got some strange, ill vibes from that one guy.
 
How is Apple getting the photos or their signatures to compare against? From law enforcement or some private agency?
 
As per my latest comment, that's not a hash algorithm then; that's a feature-detection algorithm, such as the one you mentioned. That is much worse than simply comparing image hashes, though yes, it is more accurate and harder to spoof. However, it is also more open to abuse.
It IS a hash.

"A hash function is any function that can be used to map data of arbitrary size to fixed-size values."

In this case, the image data is of arbitrary size, and the fixed size value is the computed hash.
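That definition is easy to demonstrate: a hash such as SHA-256 always produces the same fixed-size output (64 hex characters here) no matter how large the input is. A quick sketch:

```python
import hashlib

# Inputs of wildly different sizes...
small = b"x"
large = b"an input of arbitrary size " * 100_000

# ...both map to a fixed-size value.
for data in (small, large):
    digest = hashlib.sha256(data).hexdigest()
    print(len(digest))  # 64 every time
```

Whether the hash is cryptographic (like SHA-256) or perceptual (like NeuralHash), the "arbitrary size in, fixed size out" property is what makes it a hash.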
 

Enough is enough, Apple! It is incredibly distasteful for you to SPY on us as consumers. Apple, it will be in your best interest to give us an opt-out option for the CSAM feature, please.​


Where are you, Craig? You said this yourself at WWDC 2021. What happened?

“At Apple, we believe privacy is a fundamental human right,” said Craig Federighi, Apple’s senior VP of software engineering. “We don’t think you should have to make a tradeoff between great features and privacy. We believe you deserve both.”

This is a punishment and a big slap in the face for anyone who owns an iPhone. Whatever you are trying to accomplish with this, Apple, leave it in the hands of government and law enforcement.

1. How about CSAM scanning Apple executives' iPhones first? No one wants their privacy exposed. Please stop this nonsense and RESPECT our PRIVACY. It is our fundamental human right.

2. How come this CSAM stuff was not mentioned by Apple during WWDC 2021? Apple is up to something. Why now, when we are almost at the release date of iOS 15?


Also, this guy needs to be FIRED from Apple. He is the mastermind behind this CSAM feature. What a joke!


Does anyone here have a game plan on how we can stop this crappy CSAM feature?​

Turn off photos in the cloud? Simple.
 
Good, but sorry it's too late for me to stay with Apple.

I'm done with big tech. Bought a Nokia 5310 (2020) for calls and texts. That'll do.

I also have a wifi-only degoogled android for some apps when I'm at the house.

We'll see how it goes. I may take the degoogled android as the main phone in future, but for now, I'm going low-tech.
Worked out really well for the Neanderthals.
 
I'll share my game plan with you. (It's a work in progress).

I will stop using iCloud - Especially the Photos feature
I will not use my personal ID anymore
I plan on setting up the new iPhone as new

So far, this is all I have come up with :(

Nothing is going to stop me from upgrading to the new Apple iPhone.
They'll still get you by your IP... amongst other ways...
 
Up until now, all of my data has been on a pair of macOS X Servers, and I have only used iCloud to get data from "here to there." Since last week, all my data is on Synology NAS devices, and because of DSM I don't even have iCloud Drive > Documents and Desktop turned on any more. This move to Synology NAS was always going to happen once Apple stopped supporting their Server product in a meaningful way, but it couldn't have been more timely.

Unfortunately privacy is an "all or nothing" game and I thought Apple knew that!

My advice to most people that are truly concerned is to own their own "cloud".
Been doing this same thing for 10 years now. The only bad news is that my Synology is no longer upgradable to new versions of DSM.
 
Well done! You said it! If you're against the surveillance, you must be a pedophile!!! Bravo! But then again, who wants all their pictures to be scanned? Who wants to be watched and monitored all the time? You like that sort of stuff? You like to be watched? That's so sick and perverted. People like that must be reported, preferably automatically.

You do understand this is much bigger than some CSAM images? This is about privacy and an increase in surveillance. It's about misuse of power. It's worth mentioning that Apple doesn't plan to release this feature in Europe, because it would be highly illegal there.
Like I need the world to know I have a fetish for red headed ladies...oops?
 
My takeaway from this as the father of a son: can the algorithms tell the difference between 13-, 16-, and 18-year-old bodies? This is going to affect teenage boys more than anyone else. C'mon, they are hormone factories. I'd pay thousands for a photo of moments with my first G/F in high school (and we were both jailbait, but she was 6 months older), and that was 39 years ago. That's not being gross; we're just human.

If I had a photo from 39 years ago of my lovely first G/F's nubile body, am I now a pedo? So many shades of grey in this...
 
My takeaway from this as the father of a son: can the algorithms tell the difference between 13-, 16-, and 18-year-old bodies? This is going to affect teenage boys more than anyone else. C'mon, they are hormone factories. I'd pay thousands for a photo of moments with my first G/F in high school (and we were both jailbait, but she was 6 months older), and that was 39 years ago. That's not being gross; we're just human.

If I had a photo from 39 years ago of my lovely first G/F's nubile body, am I now a pedo? So many shades of grey in this...
Well, there have been cases where underage kids were trading nude selfies, and they charged BOTH of them with distributing child porn. Think about THAT one... :S

So does it make you a pedo? I don't think so... would it mean you were in possession of child porn? Possibly...

#SickSadWorld
 
I am first in line against this mess.

But then again, I think: how many Americans do not have a Facebook account? And I kind of lose hope, to be honest.
FB is evil. MR is my social media.

It is rather sad how many people blindly click agree/accept without really caring about what they are risking and what they are giving away, all for the drug of FB, IG, Twitter, and other such mindless drivel.
 
Good, but sorry it's too late for me to stay with Apple.

I'm done with big tech. Bought a Nokia 5310 (2020) for calls and texts. That'll do.

I also have a wifi-only degoogled android for some apps when I'm at the house.

We'll see how it goes. I may take the degoogled android as the main phone in future, but for now, I'm going low-tech.
Wait, since when is Nokia not "big tech"?
I'll share my game plan with you. (It's a work in progress).

I will stop using iCloud - Especially the Photos feature
I will not use my personal ID anymore
I plan on setting up the new iPhone as new

So far, this is all I have come up with :(

Nothing is going to stop me from upgrading to the new Apple iPhone.
I am tapping the brakes on my new iPad and MBP plans for later this year.

There's still time to see what happens, but this is stupid for Apple to do.
I don't think Apple explicitly said "Apple employee." Remember that Apple had no problem giving confidential Siri materials to outside contractors, and we know the "integrity" of those contractors...
My company refers to outside contractors as "contract employees". And they are expected to adhere to all the same rules that employees do.
Bottom line: it will be reported to law enforcement. Just imagine the FBI knocking on your door with a search warrant to go through your iPhone because of a CSAM detection.
Worse than this: what if somebody figures out how to "go after" somebody else? You know, like a red-flag report, or "swatting"?

CSAM images are ones that have been pulled from convicted pedophiles' phones, computers, etc. and added to a database as hashes. If these EXACT images are matched via this new system, the image will be flagged. So I have to ask: if you happen to have EXACT matches for CSAM images that were identified on a pedophile's device, how is it you have a problem with that?
A hash is by its very nature, INEXACT.
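The inexactness being argued about here applies to perceptual hashes specifically: unlike cryptographic hashes, they are designed so that visually similar images (say, the same photo after recompression) produce nearby hash values, matched within some tolerance. A toy illustration using Hamming distance; the bit width, hash values, and tolerance are made up for the sketch and are not Apple's actual parameters:

```python
def hamming_distance(a: int, b: int) -> int:
    """Number of bit positions where two hashes differ."""
    return bin(a ^ b).count("1")

# Two 16-bit "perceptual hashes" of the same image before and after
# recompression; one bit has flipped.
original = 0b1011_0110_1100_0101
recompressed = 0b1011_0110_1100_0111

MATCH_TOLERANCE = 4  # arbitrary threshold for this sketch
is_match = hamming_distance(original, recompressed) <= MATCH_TOLERANCE
print(is_match)  # True: close enough to count as the same image
```

So "EXACT match" in the post above really means "within the system's match tolerance," which is what makes both posters partly right.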
To put it simply: I don't want Apple scanning my iPhone, whether it's via AI or hashes. I want my privacy to be protected and to keep to myself. That is my fundamental human right.
I am with you 100%. Privacy is very important to me too, but...well, assuming that you are in the US, your right to privacy is not actually an enumerated right. The Founding Fathers respected privacy and felt that the government doesn't/shouldn't have power over your privacy, but the "right to privacy" is not quite like your freedom of speech, freedom of association, freedom to bear arms to protect yourself and your family, the freedom from being forced to house a military person in your abode, and much much more.

The only reason I pick at this is because for us to defend our actual RIGHTS, we need to know what they are.
No no no… It's not about exact matches. It's about content that can be seen as similar to those images. The AI has been trained with approx. 200,000 images. The information gathered from that will be used to find possible new images. If an image falls within a set range of the neural hash, it will be flagged and manually checked.
Additionally, I say that this technology will be hackable. Hope you don't have any tech-savvy enemies!
 
Actually, it looks like you are taking it way too seriously. You might want to chill a bit and stop all the crazy sensationalist nonsense you are spouting. Apple have held fast to their privacy stances, and I see no indication of that changing.
Fixed it for you.

“Apple have held fast to their privacy stances and we all see a strong indication that they’re changing that.”
 
How can you not see how easily this system is manipulated?
And maybe THAT is why Apple wants to do it. What better way to go after your political or ideological detractors than just having them arrested and jailed?

What, you say, without a trial? Hey, there's already a not-insignificant percentage of the population that is already willing to jail or put to death people who've merely been accused of a crime; for example, "spreading misinformation", even unintentionally.

A long time ago, I heard one person actually say that they would be willing to have somebody just put to death if they were accused of rape. "They don't deserve a trial", she said.

And she could not be convinced to come down off of her stance. The accusation was enough for the death penalty. Somebody else in the group finally spoke up and said, "I accuse you of raping me".

No response, but the expressions around were like this: :eek::oops:😬, with one of these 😏 and one of these🤭 from a couple of the smarter ones present. Then he said, "Now, shall we just take you out back, or would you like to have me prove my accusation of you in a fair proceeding with somebody who won't pick favorites?"

Now, there was no crime committed; in fact, it was just a bunch of young people getting together for beer and bullsh!t, but man oh man, what a way to make a point!

And THAT is my point here. This hash-and-AI thing is ripe for abuse by the next Karen or Chad who gets their panties in a wad over the least perceived social offense. Abuse of our rights, including the right to a prompt and fair trial where we can confront our accuser and have our guilt decided by an unbiased jury of our peers.
 
The whole point of this isn't to find and prosecute child abusers; it is to get a system in place for monitoring everything we own digitally.
Yes. Take a picture of your blind and dexterity-challenged elderly parent's prescription label so that you get the right stuff at your new pharmacy? Now you're dealing in drugs that you have no prescription for.

Take pictures of your new firearm so that you have a visual record of the serial number? Hey, the hash for a handgun or a rifle would be pretty easy to put into a database. Now we know you're a gun owner, so we'll know A) the first address to go to when it's time to take away your rights or B) the first address to go to when it's time to phone in a red flag call.

Take a picture of your girlfriend? Now you're a stalker.

Take a picture of your child's rash so that you can show it to the doctor or a pharmacist? Oh man, now we just need to get you off the planet and put your child into a government re-education camp.
By saying they are going after paedophiles, they have chosen a subject that is very hard to argue against; witness the posts earlier saying you must be a paedo if you are against it.
Yep, that's how it works.
So the public will think it is all a good idea, until their door gets kicked in over a meme about transgender people or whatever their government takes offence at.
But then it's too late to do anything about it. You're going to jail, and your crime is so bad, you don't deserve a trial.
You never know quite how slippery the slope is until you are on it and it is too late, and if you give power to anyone, they will ALWAYS abuse it.
They will ALWAYS abuse it.

I know this gets off topic a bit, but that's why I support term limits for everybody in government at any level. Mailman? 5 years, then you need to move on dude, sorry. Congressman? No more 36 year stints; no way. Time to move on, ma'am...you're going to have to grow your wrinkles in a DIFFERENT profession! Homeowner Association busy-body? Time to move on, sir!

lol!
 
That's not how it works. Have a read up, Apple have put out a technical white paper on it.

Unfortunately this is exactly the way it works. I’ve read the white paper, the leaked memo and analysis from security experts.

You do understand they use a base set of images to teach the AI, as stated in Apple's white paper. Do that 200,000 times and voilà, you have new material falling within a neural hash threshold due to the limited subject matter.
 
My company refers to outside contractors as "contract employees". And they are expected to adhere to all the same rules that employees do.
Ideally, yes. However, it seems like Apple's choice of subcontractors does not always have the integrity we expected, based on prior cases. Thus it begs another question, especially since this can mean the difference between someone being convicted or not. It sends quite a chill, IMO.
 
Well, there have been cases where underage kids were trading nude selfies, and they charged BOTH of them with distributing child porn. Think about THAT one... :S

So does it make you a pedo? I don't think so... would it mean you were in possession of child porn? Possibly...

#SickSadWorld
Seems like it's part of the dehumanizing of humans, and the USA seems to have a perverse view of porn, nudity, etc. Remember the outrage over the wardrobe malfunction?
 
Will the entire Apple ecosystem be made like this, with permanent inspections happening inside all end-user devices?
 