The mother of all Trojans is still part of the new SpyOS!

It will already be implemented, and there will be no way back once you've installed it.

They just want to prevent a huge drop in sales over the holidays. In February they will press the button and start the global mass surveillance of their customers. The first step is pictures, then come voice, text, video, and GPS (if a crime happened close to you and you are not the victim, you are a suspect).
Heh, what always gets me about these thoughts is that Apple could have been doing this already for years.

Consider that when Apple released iOS 10 back in 2016, it started cataloguing all of your photos on your device, recognizing everything from people and pets to firearms, food, and locations. We've only had Apple's word that it hasn't already been sending all of these details to shadowy government agencies for the past five years.
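For anyone curious what that kind of on-device cataloguing looks like at the API level, here's a rough sketch using Apple's public Vision framework. To be clear, this is the developer-facing API, not Photos' private indexing pipeline, and the 0.3 confidence cutoff is just an arbitrary value for illustration:

```swift
import Vision

// Rough sketch: classify a local image entirely on-device with the public
// Vision framework. Photos' own indexing is private, but the idea is the same:
// the labels are generated locally and never have to leave the phone.
func labels(forImageAt url: URL) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()          // built-in multi-label image classifier
    let handler = VNImageRequestHandler(url: url)
    try handler.perform([request])                  // runs locally; no network involved
    let observations = request.results ?? []
    return observations
        .filter { $0.confidence > 0.3 }             // arbitrary cutoff for this sketch
        .map { ($0.identifier, $0.confidence) }     // e.g. ("dog", 0.92), ("outdoor", 0.85)
}
```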
 
What about a compromise? Apple doesn't scan through photos, but people who are doing this get professional help and change those behaviors.
Like Charlie Murphy once famously said after beating the c** out of Rick James: after Eddie felt remorse and said, "Wow, man, Rick really needs help," he was like, "Yo, we just gave him some help. Bust his friggin' a** and s**."
 
I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!
That’s law enforcement’s job, not Apple’s.
 
So - anyone disturbed just a tad that there is a whole group at Apple studying how to identify child porn and how to program a computer to recognize it? That means they have to have examples of it... that means they have to study it, that means they have to develop requirements for this software, that means they have to develop algorithms to figure out whether a picture is child porn vs. a kid taking a bath or in a swimming pool...

Then someone has to review these results to make sure they are correct and meet the requirements of the software product.

What kind of staff are working this task?
By having and reviewing examples of kiddie porn, they are breaking the very same laws.
Who is vetting these Apple employees?
This is making me queasy to think about.
It disturbs me more that people are still so misinformed after all this time. None of this happened. Not one little bit of it.
 
While I agree with what you’ve said about server-side scanning, what you said about people sharing CSAM is not accurate. The initial makers and distributors of CSAM may be savvy enough to avoid being caught, but there are plenty of morons among the consumers who spread it around using Facebook, WhatsApp, Google, etc. It’s a huge problem on all social media and cloud networks, and FB alone has millions of cases to investigate per year.

So hopefully something will be done, just not on device?

Many miss the point that Google, DB, MS, and others “scan” on share. They don’t scan on “store.”
 
I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!
As if this would put a dent in it... child predators are smarter than that, or they'd be caught a lot more often. This only works if you use iCloud Photos anyway.
 
Is this a safer world? Should everyone have unlimited and uncontrolled freedom? A little sacrifice of “freedom” isn’t worth it for a big reward? Just one child saved is not worth it? Do we value our individual “freedoms” more than the safety of children? There are cameras on the streets in many cities for our safety, but at the same time they “spy” on you. Paranoids and molesters are making big noise with a “warning about misuse,” and parents are not heard. American companies are a kind of joke.
 
Hopefully they just remove this spyware from iOS and put it in their cloud infrastructure.

This would make me update to iOS 15 and buy an iPhone 12 Pro :)
 
So - anyone disturbed just a tad that there is a whole group at Apple studying how to identify child porn and how to program a computer to recognize it? That means they have to have examples of it... that means they have to study it, that means they have to develop requirements for this software, that means they have to develop algorithms to figure out whether a picture is child porn vs. a kid taking a bath or in a swimming pool...

Then someone has to review these results to make sure they are correct and meet the requirements of the software product.

What kind of staff are working this task?
By having and reviewing examples of kiddie porn, they are breaking the very same laws.
Who is vetting these Apple employees?
This is making me queasy to think about.
This is not what happened.

Organizations that deal with CSAM use algorithms (complex sums) to create unique identifiers for known CSAM material. The system is clever enough to work even if the images have been slightly edited.

Apple simply runs the same sums against images in a user's iCloud Photo Library. If any have the same identifier as known CSAM, a flag is raised. If enough flags are raised (around 30, if I remember rightly) an investigation is opened. If actual CSAM is found by a human moderator, law enforcement is involved.
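In rough Swift, the matching-and-threshold idea looks something like the sketch below. This is deliberately simplified: the hash values are made up, the hash function is a placeholder, and Apple's actual design layers NeuralHash, private set intersection, and threshold secret sharing on top of this basic idea.

```swift
import Foundation

// Simplified sketch of hash matching with a review threshold (illustrative only).
// Apple's published design uses NeuralHash plus private set intersection and
// threshold secret sharing; this only shows the "match known hashes, then count" idea.

let knownCSAMHashes: Set<String> = ["a3f1…", "9bc0…"]  // stand-ins for identifiers from an NCMEC-style database
let reviewThreshold = 30                               // matches required before any human review

/// Placeholder for a perceptual hash that tolerates small edits (recompression, resizing).
func perceptualHash(of imageData: Data) -> String {
    String(imageData.hashValue, radix: 16)             // a real perceptual hash works on downscaled pixel data
}

func matchCount(in library: [Data]) -> Int {
    library.filter { knownCSAMHashes.contains(perceptualHash(of: $0)) }.count
}

func accountNeedsHumanReview(library: [Data]) -> Bool {
    // Nothing is escalated until the match count crosses the threshold.
    matchCount(in: library) >= reviewThreshold
}
```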
 
While I agree with what you’ve said about server-side scanning, what you said about people sharing CSAM is not accurate. The initial makers and distributors of CSAM may be savvy enough to avoid being caught, but there are plenty of morons among the consumers who spread it around using Facebook, WhatsApp, Google, etc. It’s a huge problem on all social media and cloud networks, and FB alone has millions of cases to investigate per year.

So hopefully something will be done, just not on device?
Exactly. The “dumb” ones can be caught through scanning on corporate infrastructure (the cloud). CSAM detection as currently proposed by Apple (which requires some steps of the process to take place on my phone) is not necessary.
 
The system sounded incredibly secure to me. I understand the concern about nefarious governments demanding that Apple scan for other types of content, but if China (for example) wanted to do that, they would and will do it anyway.
This.

I would be very surprised if somebody in the Chinese government hasn't already asked Apple to share the database of people, objects, and locations that it's been building (locally) on everyone's iPhones since 2016. However, I'm also confident that Apple would have said no to this.

I trust Apple on the basis of its own enlightened self-interest. The risks and massive fallout of getting caught doing something behind customers' backs far outweigh its need to pander to the whims of foreign governments. Just look at the outrage around this CSAM Detection feature, and multiply that by about a million if Apple were actually caught supplying information to governments and law enforcement agencies behind our backs.
 
Is this a safer world? Should everyone have unlimited and uncontrolled freedom? A little sacrifice of “freedom” isn’t worth it for a big reward? Just one child saved is not worth it? Do we value our individual “freedoms” more than the safety of children? There are cameras on the streets in many cities for our safety, but at the same time they “spy” on you. Paranoids and molesters are making big noise with a “warning about misuse,” and parents are not heard. American companies are a kind of joke.
Last week the president ordered a revenge attack in Afghanistan because he didn't want to look weak.
7 kids died.
For me, the privacy of a billion people is more important than the ego of 1 man.
 
People really don't understand this system...

Putting the automated scans of images being sent to children via iMessage to one side:

The on-device scanning for CSAM in iCloud Photo Library was a safer alternative to what every other image host does: Directly scanning uploaded photos on their service to detect known child pornography.

The difference with Apple's system is that instead of directly accessing users' unencrypted data on their servers, scanning was being done privately on-device.

I understand the slippery slope argument of Apple being forced to add other image hashes by oppressive regimes like China and Russia, but the relatively open manner in which they designed this system would have helped prevent that, or at least made it obvious.

Apple isn't wrong to want to keep CSAM off of their servers and services—they have a clear moral obligation to do this. In response, they developed a system that protected user privacy while ensuring that users hosting child pornography on iCloud could be rooted out and reported to the authorities. They even set a high threshold on the number of red flags that would have to arise before an investigation could be opened.

Their main mistake was forgetting that people don't read past the alarmist headlines.
People really need to stop insisting that just because we object to something, we must not understand it. One more time for the people in the back: we understand how it works, and we don’t want our devices spying on us.
 
Oh God! Don’t just delay it. CANCEL THIS, Apple. Can’t you see… people won’t be ordering the new iPhone 13 if you launch this child safety crap.
Nah. I don't care. I will buy some of the next iPhones. I still don't see any issue with this. I read all the arguments. I disagree with most of them. Your government is reading all of your communication anyway. And Europe's communication as well, so they basically know everything about you already. It is nothing new. If you didn't care back then, how come people want to care now?
 
Ok, they "delayed" it, for now.

The question next would be: will Apple announce the implementation of a "revised" CSAM system? Or go completely radio silent and just install it in a point update without telling anybody, much like "batterygate" a few years ago?

And how much effort will they put into "revising" it, and to what extent? Would the attempt be just mitigation, or some sort of overhaul (which I doubt)? Will they change the underlying technology?

Last but not least, will the feature be released worldwide once they think they are ready to push the CSAM system out, or will it still start in the US and slowly roll out to other countries?

Not to mention how many of the issues pointed out by researchers, universities, institutions, advocacy groups, etc. will be addressed or ignored.
 
This.

I would be very surprised if somebody in the Chinese government hasn't already asked Apple to share the database of people, objects, and locations that it's been building (locally) on everyone's iPhones since 2016. However, I'm also confident that Apple would have said no to this.

I trust Apple on the basis of its own enlightened self-interest. The risks and massive fallout of getting caught doing something behind customers' backs far outweigh its need to pander to the whims of foreign governments. Just look at the outrage around this CSAM Detection feature, and multiply that by about a million if Apple were actually caught supplying information to governments and law enforcement agencies behind our backs.
I'm ~96.7% sure the Chinese government already scans iCloud libraries stored locally for Winnie the Pooh photos. The only solution to stop that would be end-to-end cloud encryption, but then we'd be back where we are now, with the masses complaining about "reduced" security. Sadly, if Apple does actually cancel this entirely, end-to-end cloud encryption (and the massive human rights win that would be for places like China, Russia, et al.) would be a non-starter.
 