> Given how accurate Google's photo Captchas are at knowing whether you've clicked on the correct picture of, say, a train, I can easily see this failing pretty hard.

You know that you're the one doing the training, right? Those are borderline cases within their datasets that need better definition.
First of all, this system does nothing to prevent transmission of CSAM.
I said in a prior post that this system should prevent transmission of CSAM instead of blurring it. Just block it from being sent or uploaded.
Even better, the CSAM system should destroy the perverted image.
Even better than that, if the CSAM system detects a big library of these disgusting filthy images then the phone should explode and kill the nonce.
> Helluva way for this to be the 'year of the Linux desktop', ain't it?

Good luck getting full support of Microsoft and Adobe for 365 and CC for Linux. 😁
LibreOffice has me covered just fine, thank you. No subscription needed.
Lordy, I hate LibreOffice. The UI really sucks. I'll stick with iWork. No sub needed either. Works great for running my business.
No sub, sure, but you might pay a much larger price if this CSAM thing goes through.
If me having to make this small concession in privacy protects even one adolescent from sexual exploitation, then it will have been worth it.
I don't think 'woke' has anything to do with Apple's move. Being authoritarian does. Shame on Apple.
Understanding this subject takes a lot of study, which you apparently have done and I have not. I simply don’t fully understand it; it’s all new to me. The tech has just been announced and is far from a rollout, so I’ll just wait and see. If there are better solutions and they change course, or if the EU won’t allow it at all, I will have worried about nothing. I will keep an eye on it, but it won’t affect my sleep. Not completely oblivious, but also not burning my iPhone.
You're creating something out of your head that doesn't exist, so let that nonsense go. We're talking about an office suite.
Office apps for me became more irrelevant once I graduated. The only 'app' I use for documents is a PDF viewer, and plenty of those exist for Linux.
Not just a step too far, this is another huge step toward private companies taking over the job of law enforcement, applying arbitrary rules to accuse any citizen on any arbitrary ground without the need for a warrant, while the accused has zero ground to fight back. It’s 100% checkmate. And you know what? Because so many companies copy Apple no end, the entire Android world will be plagued with inferior versions of this thing that will only ruin more people’s lives faster.
Apple, commit all you want. I will try my best to live long enough to see what this hell will look like in 10 years.
This also doesn’t solve the problem. I want to see the CREATORS in jail or worse. While someone randomly finding these pics and saving them is sick, how about we focus on the creators? And criminals are smart. Now that Apple has talked about this LOUDLY, everyone knows turning off iCloud Photos will prevent this, or people just won’t use iPhones for this thing. But we as innocent individuals are harmed by being scanned without a warrant.
> You misunderstand what chances are. To win something in a lottery you need at least two to three correct numbers, and the odds of that are not one in trillions, but only one in hundreds. In Apple's case one picture is enough to trigger an alert, and even though they claim they would not report that, it's a lie, because one or a hundred child porn pictures makes no difference: both are a crime you have to report. They would get into serious legal problems by allowing their users to keep even one picture of this kind, so there will be zero tolerance.

It's interesting, isn't it... that a "crime" would be reported even if the picture that was syncing was a completely innocent picture of my baby daughter crawling around without anything on!
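The lottery odds in the exchange above can be checked with a quick back-of-the-envelope calculation. This is a rough sketch assuming a generic 6-of-49 draw (the comment doesn't specify which lottery); the exact numbers vary by game, but the "one in hundreds, not one in trillions" point holds either way.

```python
from math import comb

def match_prob(k, picks=6, pool=49):
    """Probability of matching exactly k of your picks in a 6-of-49 style draw."""
    return comb(picks, k) * comb(pool - picks, picks - k) / comb(pool, picks)

# Matching just two or three numbers really is a "one in hundreds" event:
print(f"2 matches: 1 in {1 / match_prob(2):.0f}")  # about 1 in 8
print(f"3 matches: 1 in {1 / match_prob(3):.0f}")  # about 1 in 57
```

For contrast, Apple's published claim for its CSAM system was a less-than-one-in-one-trillion chance per year of incorrectly flagging a given account, which is the figure the lottery comparison is pushing back against.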