> No. Based on frequent reports of pedo grooming activity, where they fabricate a false identity for themselves. As I said.

So, based on your opinion then?
> i am one of those who believe this is a sop to the government because apple is rolling out e2ee and they need to “look” like they are doing something to combat child porn since so much data has gone dark, i would be surprised if they don’t roll out e2ee in a couple of weeks, it makes perfect sense to explain why this thing is happening

No, I believe Apple is being forced to create a backdoor. CSAM is just the excuse. On-phone scanning can be expanded to cover literally anything. In the USA, re: terrorist threats, pictures of your massive gun stockpiles? In Muslim countries, photos of women not in burkas? If it can scan images, text is a breeze.

Apple has been a champion of privacy. So why are naked images of children suddenly so important to eradicate now? It’s important to listen to what Apple is saying. But even more important is what Apple is NOT saying... If Apple was totally on board with this, we’d have known years ago. It wouldn’t be a shocking 180° turnaround from their core value of privacy, which they heavily advertise worldwide.

Once that hash scanning has been installed into iOS, the NSA can get anything on any iPhone anywhere. And the NSA doesn’t talk about all the once-illegal, still-unconstitutional things they do.
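Since "hash scanning" keeps coming up in this thread, here is roughly what perceptual image hashing looks like in general. This is a toy difference-hash (dHash) sketch in Python; Apple's NeuralHash is a neural-network variant, so this is not its actual algorithm, and the file names at the bottom are just placeholders:

```python
# Toy difference hash (dHash): a common perceptual-hash recipe, shown only to
# illustrate the general idea of fuzzy image matching. NOT Apple's NeuralHash.
from PIL import Image  # pip install Pillow

def dhash(path: str, size: int = 8) -> int:
    """Shrink to (size+1) x size grayscale, then record whether each pixel
    is brighter than its right-hand neighbour: size*size bits total."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Count differing bits; a small distance means visually similar images."""
    return bin(a ^ b).count("1")

# Resaving, resizing, or lightly editing a photo barely moves its dHash,
# so matching uses a distance threshold rather than exact equality:
# hamming(dhash("original.jpg"), dhash("edited_copy.jpg")) < 10  # "same" image
```

The takeaway: this kind of hash is deliberately robust to edits, which is what makes it useful for finding known images, and also what makes the matching fuzzy rather than byte-exact.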
> What?

It’s all fun and games when they’re coming for the pedophiles, because I’m not a pedophile.
It’s all fun and games when they’re coming for the atheists, because I’m not an atheist.
It’s all fun and games when they’re coming for the foreigners, because I’m not a foreigner.
It’s no longer fun and games, because I said something they didn’t like, and now they’re coming for me.
> Ridiculous! They either are unaware of this or have long deactivated their iCloud Libraries.

I understand why this is a slippery slope, but I don’t like the idea of child predators breathing a sigh of relief.
> no, they are matching hashes of known csam against hashes of photos you upload to icloud. apple says they are accurate to 1 in a trillion false positives, so the naked-kids-in-the-bath pictures or romping-through-the-sprinklers shots will not get flagged. and then you have to have 30 of these images before they will have a human look. but if indeed they are matching to a 1 in 1,000,000,000,000 accuracy, there is no way you will ever get in front of a human reviewer unless you have actual csam that exists in at least 2 databases in 2 different legal jurisdictions

No, they are scanning for manipulated versions of known CSAM; it's scanning for pictures that "seem to be" those CSAM pictures. Which means your family photos can be false positives. That's why they have the 30-picture threshold: the more pictures you have in iCloud, the more likely you are to end up falsely flagged.

Once you've been flagged and your family photos are getting reviewed, pray it's something an American reviewer deems culturally appropriate in their country, or else...
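For what it's worth, the threshold arithmetic is easy to sanity-check. Here's a back-of-the-envelope Python sketch; the per-image false-positive rate is an assumed, made-up number (Apple has never published one), so read the output as showing how a threshold compounds the odds, not as Apple's actual figures:

```python
from math import log, lgamma

# ASSUMED per-image false-positive rate -- not a number Apple has published.
p = 1e-6          # one-in-a-million chance that a photo falsely matches
n = 20_000        # photos uploaded to iCloud
t = 30            # matches required before human review

# Chance that at least one photo in the library falsely matches:
p_any = 1 - (1 - p) ** n
print(f"P(>=1 false match)    ~ {p_any:.2%}")          # about 2%

# Chance of reaching the 30-match threshold. With lam = n*p expected false
# matches, the k = t Poisson term dominates the tail; compute it in log
# space to avoid floating-point underflow.
lam = n * p
log10_tail = (-lam + t * log(lam) - lgamma(t + 1)) / log(10)
print(f"P(>=30 false matches) ~ 10^{log10_tail:.0f}")  # about 10^-83
```

The catch, and it's basically this post's point, is that the math assumes every photo's match is independent. Thirty near-identical shots that all trip the same hash are not thirty independent events, so the real-world tail can be fatter than the model suggests.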
> that’s not what i have read. icloud is apparently used for the distribution of csam regularly and widely. it makes sense if apple has only reported less than 300 people and facebook has reported 20 million. if i were distributing csam, i would be looking at icloud as a good distribution method

No child predators are uploading their illegal content to iCloud, let’s be serious. This feature is just a huge misstep and a slippery slope for everyone ELSE’s privacy. They were never going to catch child predators with this feature…
Are you okay with everyone having to use a breathalyzer every time they start their car, because we need to make sure we get all those drunk drivers?

Are you okay with your car calling the police on you when you go over the speed limit, because speed kills, you know?
> These three examples are not the same. Because when it comes to CP, it's 100 percent wrong, 100 percent of the time.

Are you okay with your new TV having a camera looking for drugs in your room, because we need to get rid of all those drug addicts?
> That's not a fact. That's a what-if. You are only guessing. We can think about worst-case scenarios, and that's important, but presenting it like it's a fact does everybody a disservice.

Are you okay a year down the road when Apple expands the CSAM scanning to other illegal activities our government doesn't like, because those will be good causes too? Just wondering what your line in the sand is.
> You are still only sourcing yourself. So it's still just your opinion.

No. Based on frequent reports of pedo grooming activity, where they fabricate a false identity for themselves. As I said.
> Do you really think it could stop them?

I understand why this is a slippery slope, but I don’t like the idea of child predators breathing a sigh of relief.
> From Apple.com…

Sorry. There was no glossing over. Your info was factually wrong.
And I would even ask about the bolded section. I believe you may be incorrect about the procedure there as well in terms of the photos being sent from your iPhone to Apple.
FROM APPLE:
Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC.
------------
It does not mention that it accesses the photos on your phone. Please provide literature where that is clearly stated; otherwise you are making it sound like Apple is directly accessing your phone, when that is seemingly not the case.
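For anyone wondering what "threshold secret sharing" means in the quoted text: it's standard cryptography, usually built on Shamir's scheme. Below is a toy Python sketch of the generic t-of-n construction (emphatically not Apple's code, just the textbook idea), showing why holding fewer than the threshold number of shares reveals nothing:

```python
# Toy Shamir t-of-n secret sharing over a prime field. A textbook sketch of
# the general "threshold secret sharing" idea, NOT Apple's implementation.
import random

PRIME = 2**127 - 1  # a Mersenne prime, plenty large for a demo

def make_shares(secret: int, threshold: int, n_shares: int):
    """Hide `secret` in a random degree-(threshold-1) polynomial and hand
    out one point per share; fewer than `threshold` points say nothing."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret, but only if you
    hold at least `threshold` distinct shares."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Mirror the described setup: a 30-share threshold.
shares = make_shares(secret=42, threshold=30, n_shares=100)
print(reconstruct(shares[:30]))  # 42 -- threshold met
print(reconstruct(shares[:29]))  # garbage -- below threshold, zero information
```

In Apple's description, each matching image's safety voucher carries a share of the key protecting the voucher contents, so below 30 matches the decryption is mathematically impossible rather than just policy-gated, at least as the system is described.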
You are still only sourcing yourself. So it's still just your opinion.
> They are probably already implementing a switch to activate it in the backend, with no user being able to opt out of criminal scanning.

Just going to wait until everyone forgets and do a quiet release of a .1 update with some security enhancements, etc.
One swallow doesn’t make a summer.

News and opinion: the latest news and opinion on our work and activities across the UK, Channel Islands and Isle of Man (www.nspcc.org.uk)
> They won’t cancel it like the other comment says; they will wait till after the iPhone release so they don’t hurt early sales.

They will end up CANCELLING IT. 🤫

It’s been a hot mess. Even good ol’ Craig (Apple executive) admitted it. He knew Apple’s approach was wrong.
Craig Federighi Acknowledges Confusion Around Apple Child Safety Features and Explains New Details About Safeguards (www.macrumors.com)
From Apple.com…
... This voucher is uploaded to iCloud Photos along with the image.
> Where I come from, a little drunk driving is not okay. Regardless, these were just some examples off the top of my head. I am sure you will agree that there are numerous good causes that require policing. So why, out of the blue, has it become Apple’s obligation to police just this one thing?

These three examples are not the same. Because when it comes to CP, it's 100 percent wrong, 100 percent of the time.
You know what though, I would like cars to have a breathalyser. That would dramatically reduce the carnage from drink drivers.
Also, some countries, like Australia, have far less speeding because they have cameras everywhere.
And there are plenty of things that can be done to reduce drug usage.
So there are things that can be done to reduce these issues. Is the way Apple is doing it the right way? Probably not.
That's not a fact. That's a what-if. You are only guessing. We can think about worst-case scenarios, and that's important, but presenting it like it's a fact does everybody a disservice.
> Apple is not. NCMEC is the one. Apple is just doing what’s required by the law.

The main point being that Apple isn’t (and shouldn’t be) in a position to decide what’s safe or not for people.
Apple is not. NCMEC is the one. Apple is just doing what’s required by the law.
Pedophiles and sex traffickers everywhere are cheering their victory in swaying the opinion of the cowardly sheep that didn’t understand this useful technology. Apple just got sick of the incessant bleating.