One of the biggest bait-and-switch moves in history
Yep, this is a classic Trojan horse: an attractive gift, stopping child sexual abuse material, with a hidden stinger, an on-device general surveillance tool. And don't worry, all those doors in the belly of the horse have locks on the inside so nobody can exit.

Apple said it would consider different countries' requests on a per-country basis. This means the system can be customized. This is not about CSAM. This is about a mass scanning system being rolled out globally.

And do you think Tim Cook or Craig will live forever to keep their pinky promise? Apple is a publicly traded company; even the CEO can be replaced. And when management changes, policy will also change. The fact that such a system with huge abuse potential is being implemented at all is the big problem. Apple's pinky promise to scan only for CSAM is irrelevant.
I’m not clear about this.
Will the scanning of images start on a certain day or does it require the iOS device to be upgraded to iOS 15?
Audits won’t mean anything when a state changes the law and tells Apple they have to use whatever means available to them to report on images that may be homosexual in nature.

That's the thing: this system is much more auditable than any server-side approach (the standard in the industry). I would prefer no scanning at all, but between this implementation and an implementation on iCloud, I prefer this one.
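To make the auditability claim concrete: because the hash database ships inside the OS image rather than living on a server, anyone can fingerprint their local copy and compare it with other devices and regions. A minimal Swift sketch, assuming a hypothetical databaseURL (the real file location and on-disk format are not public):

```swift
import CryptoKit
import Foundation

// Hypothetical sketch: `databaseURL` is an invented path, and the real
// on-disk format of the (blinded) hash database is not public.
// The point: a database that ships inside the OS can be fingerprinted and
// compared across devices and regions; a server-side database cannot.
func fingerprintOfLocalHashDatabase(at databaseURL: URL) throws -> String {
    let bytes = try Data(contentsOf: databaseURL)
    let digest = SHA256.hash(data: bytes)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// If every user computes the same fingerprint, everyone received the same
// database -- a per-user or per-region swap would be immediately visible.
```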
Audits won’t mean anything when a state changes the law and tells Apple they have to use whatever means available to them to report on images that may be homosexual in nature.
I think many are missing the point still: we don’t care about anything companies or law enforcement scan that we upload to the internet; it’s putting it on our hardware we object to. It’s just a deal breaker for some of us…. Obviously it does not matter to many, as witnessed by the dozens of threads talking about it….. however some of us will not accept it, period…. Whether we can just cut off iCloud to feel safe or leave Apple altogether is a question unanswered, but it’s one many of us are watching…. Yesterday I decided not to upgrade to iOS 15; this morning I discover they have already put the spyware on my phone

I can't believe all this misunderstanding and baseless paranoia! Does he not understand that Apple, Google, Microsoft, etc. are already scanning for CSAM? So if they wanted to search for other types of images instead, they could already do that. The only thing this new method does is make things MORE private, by hiding all scanning data from Apple except that related to a sizable collection of CSAM being uploaded to their servers. If people are still paranoid about that and don't trust Apple, then they should immediately disable iCloud for photos.
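On the "sizable collection" point: Apple's published design only makes an account reviewable once the number of matches crosses a preset threshold (its documentation describes a figure around 30). Below is a deliberately naive Swift sketch of that idea; the real system enforces it with private set intersection and threshold secret sharing, not a plain counter:

```swift
// Toy illustration of threshold reporting -- NOT Apple's actual cryptography.
struct MatchCounter {
    let threshold: Int            // Apple's documents describe a threshold around 30
    private(set) var matches = 0

    // Returns true only once the account crosses the threshold;
    // below it, individual matches reveal nothing reviewable.
    mutating func record(isMatch: Bool) -> Bool {
        if isMatch { matches += 1 }
        return matches >= threshold
    }
}

var counter = MatchCounter(threshold: 30)
print(counter.record(isMatch: true))   // false -- a single match is far below threshold
```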
This. Also, so many people are ignoring the opportunities for abuse here. Once this system is in place, any government can order Apple to scan for literally anything — a subversive poster or phrase, a person who is a political dissident… the opportunities are endless. Apple is turning iPhone into a mass surveillance tool under the classic, age-old excuse of “protecting the children.”

Because you can disable the above features and you have the ability to use iCloud Photos without it.

This will be baked into iOS 15 and cannot be disabled unless you opt not to use the features that make the Apple ecosystem what it is, features that have been part of why people use Apple products.
They just have to be ordered to report them when they see them, via a national security letter for instance. (yes, I agree the US security services aren't going to request that.)

And what means do they have available to do that?
So you're saying they break the law? Because the law says they have to report it if they find it. Or are you claiming there's virtually no CSAM on iCloud? (Personally I find that as plausible as Apple's privacy claims, i.e. not at all.)
Absolutely nothing wrong with having a rethink.

Yes, I agree with the sentiment, but if it is indeed already a legal requirement that all images stored on corporate servers be scanned for CSAM, then perhaps this is the best way of going about it? I don't know for sure. I'm still against the whole idea in general, but I accept there may be more to it than I first gave it credit for. And reconsidering a position isn't the same as changing your mind 👍
I think people (politicians) who do not like this are people who like child sexual abuse material.
Member of the German parliament Manuel Höferlin, who serves as the chairman of the Digital Agenda committee in Germany, has penned a letter to Apple CEO Tim Cook, pleading with Apple to abandon its plan to scan iPhone users' photo libraries for CSAM (child sexual abuse material) images later this year.
In the two-page letter (via iFun), Höferlin said that he applauds Apple's efforts to address the dangers posed by child sexual abuse and violence but noted that he believes Apple's approach to remedying the issue is not the right one. Höferlin went on to say that the approach Apple has chosen violates one of the "most important principles of the modern information society – secure and confidential communication."
Höferlin notably called Apple's CSAM approach "the biggest opening of the floodgates for communication confidentiality since the birth of the internet." The letter speaks out against Apple's plans to scan images in a user's iCloud Photo Library for CSAM by checking image hashes against a database of known child sexual abuse material.
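Mechanically, the matching the letter objects to is a hash lookup. Here is a toy Swift sketch of the concept; note that the real system uses a perceptual hash (NeuralHash) and a blinded table the device cannot read, so the exact-match lookup and loader below are invented purely for illustration:

```swift
import Foundation

// Toy illustration of "checking hashes against a database". The real system
// uses a perceptual hash (NeuralHash) and a blinded table the device cannot
// read; the exact-match lookup and the loader below are invented for clarity.
func loadKnownHashDatabase() -> Set<Data> {
    // Hypothetical stand-in for the table that ships with the OS.
    [Data([0xDE, 0xAD, 0xBE, 0xEF])]
}

let knownHashes = loadKnownHashDatabase()

// A match says only "this hash appears in the table" -- nothing else about
// the image is revealed by the lookup itself.
func isKnownImage(hash: Data) -> Bool {
    knownHashes.contains(hash)
}

print(isKnownImage(hash: Data([0xDE, 0xAD, 0xBE, 0xEF])))  // true
```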
That feature is entirely different from another feature rolling out later this year, in which iOS will use on-device image analysis to detect possibly sexually explicit images in the Messages app and ask users under the age of 13 whether they wish to see the photo. While Höferlin referenced some legitimate concerns over CSAM scanning, he continued that the feature destroys "some of the trust users place in not having their communications secretly monitored." Neither CSAM scanning nor the Child Safety Features in Messages, however, monitors any communication.
Apple's senior vice president of software engineering, Craig Federighi, admitted in a recent interview that the conjoined announcement of CSAM detection and improved safety for children within the Messages app has caused confusion. Nonetheless, Höferlin continued in his letter by stating that while he wishes he could believe Apple's reassurance that it will not allow government interference into CSAM detection, he is unable to take the company by its word.
Höferlin concluded his letter by pleading with Cook for Apple to abandon its CSAM scanning plans and asking that the company stay on the side of a free and private internet.
Since its announcement earlier this month, Apple’s plans have received criticism, and in response, the company has continued its attempt to address concerns by publishing additional documents and an FAQ page. CSAM scanning and Child Safety Features within the Messages app are still on track to be released later this year.
Article Link: German Politician Asks Apple CEO Tim Cook to Abandon CSAM Scanning Plans
This. Also, so many people are ignoring the opportunities for abuse here. Once this system is in place, any government can order Apple to scan for literally anything — a subversive poster or phrase, a person who is a political dissident… the opportunities are endless. Apple is turning iPhone into a mass surveillance tool under the classic, age-old excuse of “protecting the children.”
They just have to be ordered to report them when they see them, via a national security letter. (yes, I agree that's not going to happen in this case)
Sure. iPhone 13 will have a 14-day, no-questions-asked refund.

Will Saudis or Hungarians or Russians get a refund if they don't like the policy in their country after they've bought an iPhone 13? No, I didn't think so.
I think many are missing the point still: we don’t care about anything companies or law enforcement scan that we upload to the internet; it’s putting it on our hardware we object to. It’s just a deal breaker for some of us…. Obviously it does not matter to many, as witnessed by the dozens of threads talking about it….. however some of us will not accept it, period…. Whether we can just cut off iCloud to feel safe or leave Apple altogether is a question unanswered, but it’s one many of us are watching…. Yesterday I decided not to upgrade to iOS 15; this morning I discover they have already put the spyware on my phone
The only REAL solution is to open up the phone so people can run whatever software they like. Make the phone more like a Mac. Yes, Apple will argue about malware on the phone, but this is not a big problem with Macs. This could be done by law too.
Abiding by laws presumably includes abiding by laws in China, where the state insists data be kept in China, and where, no doubt, if Apple did not comply its products would not be on sale there. These sorts of pressures can easily sway a company.
Everyone has to make their own decisions, you can choose to accept it, I choose not to

No, we simply think the "point" you're referring to is irrational. If you're this paranoid and consistent about it, then you're going to have to go offline completely. Good luck.
The reality is that by moving the scanning process to your phone, your privacy has been nothing but enhanced. Spyware is installed without your knowledge and used for nefarious purposes; neither of those things is true with iOS 15. The comparison is, frankly, absurd.
Yes, the part you’re ignoring is the part where Apple has to comply with a change in the law. That’s what the fella in the German parliament was concerned about.

Yes they do. Apple still needs a source dataset to perform the match on the server, so adding non-CSAM hashes to the dataset, and possible government overreach, still apply. The only difference is that the server is a black box for security researchers, which makes it easier to carry out any kind of surveillance.
Regarding the bold part: people who say this have no idea how software development works. The code implementation would need to be ported to other parts of the OS and implemented on the server side. Plus, they would need another source dataset, which would need to be negotiated with other agencies. This is not a toggle they can flip in settings. It requires coordination between multiple teams; they would need to implement it basically from scratch.

And all that effort for what? For a "surveillance software" that needs a dataset and performs a hash algorithm locally. It would take a couple of hours between a software update and someone raising a red flag.
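To illustrate why this isn't a toggle: as publicly described, the on-device step runs only in the iCloud Photos upload path and emits an opaque voucher rather than a report. A rough Swift sketch of that shape, with every name invented (none of these are Apple's real APIs):

```swift
import Foundation

// Invented names throughout -- a sketch of the publicly described design,
// not Apple's actual implementation.
struct SafetyVoucher { let payload: Data }   // opaque below the match threshold

// Runs only when a photo is being uploaded to iCloud Photos.
func prepareForUpload(image: Data,
                      neuralHash: (Data) -> Data,     // stand-in for the perceptual hasher
                      blindedMatch: (Data) -> Data) -> SafetyVoucher {
    let hash = neuralHash(image)
    // The device cannot tell whether `hash` was a hit: the result is
    // encrypted against the blinded table and only becomes readable
    // server-side once an account crosses the threshold.
    return SafetyVoucher(payload: blindedMatch(hash))
}
```

Repurposing that pipeline for, say, scanning Messages would mean new hooks in a different part of the OS, a different dataset, and new server-side handling, which is the poster's point about it not being a settings flip.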
Because they find that CSAM whilst looking for other things as part of a request from police for instance?

Again, what ARE you on about? Why would Apple randomly choose to report some users with CSAM and not others? Does that make any logical sense to you? Instead of immediately assuming they're breaking the law and covering up illegal material they found, would not a more logical conclusion be, for example, that iCloud isn't as popular a platform as others for uploading such material?
Why are you so eager to protect CSAM collectors who stashed their collection on iCloud?

And beyond that, you're totally missing my point. The fact is they ARE scanning images already (how else would they make the reports they did if they weren't?), and thus they already have the capability to search for other types of images if they wanted to. So is this German politician not wanting them to scan AT ALL - on the cloud OR on devices? If so, he's out to lunch. By uploading your content to the cloud, you're voluntarily surrendering a large degree of privacy, so people already have the choice NOT to do that. But if they continue to do so, the scanning process is now more private. Sounds like a win-win to me.
And what means do they have available to do that?
Everyone has to make their own decisions, you can choose to accept it, I choose not to
Are you truly incapable of seeing the difference between a person being filmed while out in a public place versus a person's private images being scanned on their phone?

Might as well ask Cook to also remove all the cameras from the phone so that people can avoid being filmed during altercations (altercations they themselves usually cause), with someone threatening to put the footage on YouTube or Facebook, which they almost certainly do regardless. Their video is subsequently shared many, many times. So much respect for people’s privacy! Do Facebook and YouTube care about people's privacy? No, not while it's entertainment. Yet the source comes from somebody's phone and violates the rights of others.

Or maybe remove the ability to record conversations via Voice Memos or third-party apps, just in case a conversation is recorded secretly without other people knowing.
I can see through your multiple posts that you are a strong supporter of the CSAM implementation. However, you are simplifying things. It is not true that you have control: your images are scanned one way or the other, as the hash library is built into iOS, and that is the core of the issue. We all (most of us…) understand that using any cloud service comes with a price. The thing is, it's not that many oppose Apple scanning the iCloud library for CSAM on their servers. We oppose the code on our iPhones, and that we find a privacy invasion.

Which Apple is releasing on a country-by-country basis, to abide by local laws first.