> The cynic in me thinks the delay is so they don't have to talk about it during the iPhone 13 reveal. Then after, all bets are off!

They're scamming their customers.
> The mother of all Trojans is still part of the new SpyOS! It will already be implemented, and there is no way back once you have installed it.
>
> They just want to prevent a huge drop in sales over the holidays. In February they will press the button and start the global mass surveillance of their customers. First step is pictures, then comes voice, text, video and GPS (if a crime happened close to you and you are not the victim, you are a suspect).

Heh, what always gets me about these thoughts is that Apple could have been doing this already for years.
> What about a compromise? Apple doesn't scan through photos, but people who are doing this get professional help and change those behaviors.

Like Charlie Murphy once famously said after beating the c** out of Rick James, when Eddie got remorseful and said, "Wow, man, Rick really needs help," he was like, "Yo, we just gave him some help. Bust his friggin' a** and s**."
> I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!

That’s law enforcement’s job, not Apple’s.
> So - anyone disturbed just a tad that there is a whole group at Apple studying how to identify child porn and how to program a computer to recognize it? That means they have to have examples of it... that means they have to study it, develop requirements for this software, and develop algorithms to figure out that a given picture is child porn vs. a kid taking a bath or in a swimming pool.
>
> Then someone has to review these results to make sure they are correct and meet the requirements of the software product. What kind of staff are working this task? By having and reviewing examples of the kiddie porn, they are breaking the very same laws. Who is vetting these Apple employees? This is making me queasy to think about.

It disturbs me more that people are still so misinformed after all this time. None of this happened. Not one little bit of it.
> I understand why this is a slippery slope, but I don’t like the idea of child predators breathing a sigh of relief.

I do not consider myself a predator, and I’m breathing a sigh of relief…
> It'll be cancelled, they just can't admit failure. They'll quietly close it down like they've done with a bunch of other failures.

Close it quietly? After so much trust in Apple has been destroyed?
While I agree with what you’ve said about server-side scanning, what you said about people sharing CSAM is not accurate. The initial makers and distributors of CSAM may be savvy enough to avoid being caught, but there are plenty of morons among the consumers who spread it around using Facebook, WhatsApp, Google, etc. It’s a huge problem on all social media and cloud networks, and FB alone has millions of cases to investigate per year.
So hopefully something will be done, just not on device?
> I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!

As if this would put a dent in it... child predators are smarter than that, or they'd be caught a lot more often. This only works if you use iCloud Photos anyway.
> I’m breathing a sigh of relief…

Why? They haven't announced what they're going to do.
> So - anyone disturbed just a tad that there is a whole group at Apple studying how to identify child porn and how to program a computer to recognize it? That means they have to have examples of it... that means they have to study it, develop requirements for this software, and develop algorithms to figure out that a given picture is child porn vs. a kid taking a bath or in a swimming pool.
>
> Then someone has to review these results to make sure they are correct and meet the requirements of the software product. What kind of staff are working this task? By having and reviewing examples of the kiddie porn, they are breaking the very same laws. Who is vetting these Apple employees? This is making me queasy to think about.

This is not what happened.
> While I agree with what you’ve said about server-side scanning, what you said about people sharing CSAM is not accurate. The initial makers and distributors of CSAM may be savvy enough to avoid being caught, but there are plenty of morons among the consumers who spread it around using Facebook, WhatsApp, Google, etc. It’s a huge problem on all social media and cloud networks, and FB alone has millions of cases to investigate per year.
>
> So hopefully something will be done, just not on device?

Exactly. The “dumb” ones can be caught through scanning on corporate infrastructure (the cloud). CSAM detection as currently proposed by Apple (requiring some steps of the process to take place on my phone) is not necessary.
> The system sounded incredibly secure to me. I understand the concern about nefarious governments demanding that Apple scan for other types of content, but if China (for example) wanted to do that, they would and will do it anyway.

This.
> Is this a safer world? Should everyone have unlimited and uncontrolled freedom? Isn’t a little sacrifice of “freedom” worth it for a big reward? Is even one child saved not worth it? Do we value our individual “freedoms” over the safety of children? There are cameras on the streets in many cities for our safety, but at the same time they “spy” on you. Paranoids and molesters are making big noise with a “warning about misuse” while parents go unheard. American companies are a kind of joke.

Last week the president ordered a revenge attack in Afghanistan because he didn't want to look weak.
> People really don't understand this system...
>
> Putting the automated scans of images being sent to children via iMessage to one side: the on-device scanning for CSAM in iCloud Photo Library was a safer alternative to what every other image host does, which is directly scanning uploaded photos on their service to detect known child pornography. The difference with Apple's system is that instead of Apple directly accessing users' unencrypted data on its servers, scanning was being done privately on-device.
>
> I understand the slippery-slope argument of Apple being forced to add other image hashes by oppressive regimes like China and Russia, but the relatively open manner in which they designed this system would have helped prevent that, or at least made it obvious.
>
> Apple isn't wrong to want to keep CSAM off of its servers and services; they have a clear moral obligation to do this. In response, they developed a system that protected user privacy while ensuring that users hosting child pornography on iCloud could be rooted out and reported to the authorities. They even set a high threshold on the number of red flags that would have to arise before an investigation could be opened.
>
> Their main mistake was forgetting that people don't read past the alarmist headlines.

People really need to stop insisting that just because we object to something, we must not understand it. One more time for the people in the back: we understand how it works, and we don’t want our devices spying on us.
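For anyone unsure what the "high threshold on the number of red flags" in the quoted comment means in practice, here is a toy Python sketch. It is an illustration only: the names `KNOWN_CSAM_HASHES` and `REPORT_THRESHOLD` are made up, and Apple's actual design used NeuralHash perceptual hashes with private set intersection so the device never learned which (or how many) photos matched. Only the threshold idea itself is modeled here.

```python
# Toy illustration of threshold-based hash matching. Hypothetical names;
# the real system used cryptographic techniques, not plain set lookups.

KNOWN_CSAM_HASHES = {"hash_a", "hash_b", "hash_c"}  # stand-in for the hash database
REPORT_THRESHOLD = 30  # Apple publicly cited a threshold on the order of 30 matches

def count_matches(photo_hashes):
    """Count how many of a user's photo hashes appear in the known database."""
    return sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)

def should_flag(photo_hashes):
    """Open a human review only once the match count crosses the threshold."""
    return count_matches(photo_hashes) >= REPORT_THRESHOLD
```

The point of the threshold is that a single false positive (a bath photo that happens to collide with a database hash) can never trigger review on its own; dozens of independent matches are required first.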
> Apple has delayed the rollout of the Child Safety Features that it announced last month following negative feedback, the company announced today.

An ugly case of wrong versus wrong. There's no satisfactory solution. As surveillance states keep growing across the globe, I hope Apple errs on the side of privacy.
> Oh God! Don’t just delay it. CANCEL THIS, Apple. Can’t you see… people won’t be ordering the new iPhone 13 if you launch this child safety crap.

Nah, I don't care; I will buy one of the next iPhones. I still don't see any issue with this. I read all the arguments and disagree with most of them. Your government is reading all of your communication anyway, and Europe's communication as well, so they basically know everything about you already. It is nothing new. If you didn't care back then, why do people want to care now?
> This.

I'm ~96.7% sure the Chinese government already scans iCloud libraries stored locally for Winnie-the-Pooh photos. The only solution to stop that would be end-to-end cloud encryption, but then we'd be back where we are now, with the masses complaining about "reduced" security. Sadly, if Apple does actually cancel this entirely, end-to-end cloud encryption (and the massive human-rights win that would be for places like China, Russia, et al.) would be a non-starter.
I would be very surprised if somebody in the Chinese government hasn't already asked Apple to share the database of people, objects, and locations that it's been building (locally) on everyone's iPhones since 2016. However, I'm also confident that Apple would have said no to this.
I trust Apple on the basis of its own enlightened self-interest. The risks and massive fallout of getting caught doing something behind customers' backs far outweigh its need to pander to the whims of foreign governments. Just look at the outrage around this CSAM detection feature, and multiply that by about a million if Apple were actually caught supplying information to governments and law enforcement agencies behind our backs.