One of the biggest bait-and-switch moves in history

 
Apple said they would consider different countries' requests on a per-country basis. This means the system can be customized. This is not about CP. This is about a mass scanning system that is being rolled out globally.

And do you think Tim Cook or Craig will live forever to keep their pinky promise? Apple is a publicly traded company. Even the CEO can be replaced. And when management changes, policy will also change. The fact that such a system, with huge abuse potential, is being implemented at all is the big problem. Apple’s pinky promise to only scan for CP is irrelevant.
Yep, this is a classic Trojan horse: an attractive gift (stop kiddie porn) with a hidden stinger (an on-device general surveillance tool). And don't worry, all those doors in the belly of the horse have locks on the inside so nobody can exit.
 
I’m not clear about this.
Will the scanning of images start on a certain day or does it require the iOS device to be upgraded to iOS 15?

There seems to be some chatter that it's already in 14.3, but the plan is to release it in iOS 15.

That's the thing: this system is much more auditable than any server-side approach (which is the standard in the industry). I would prefer to have no scanning, but between this implementation and one on iCloud's servers, I prefer this one.
Audits won’t mean anything when a state changes the law and tells Apple they have to use whatever means are available to them to report on images that may be homosexual in nature.
 
I can't believe all this misunderstanding and baseless paranoia! Does he not understand that Apple, Google, Microsoft, etc. are already scanning for CSAM? So if they wanted to search for other types of images instead, they could already do that. The only thing this new method does is make things MORE private by hiding all scanning data from Apple except that related to a sizable collection of CSAM being uploaded to their servers. If people are still paranoid about that and don't trust Apple, then they should immediately disable iCloud for photos.
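For what it's worth, the "sizable collection" part is doing a lot of work there: the design only surfaces anything to Apple once an account crosses a match threshold (Apple has publicly cited roughly 30 matches). Below is a minimal sketch of that threshold idea, with hypothetical names and a plain counter; the real system uses encrypted safety vouchers and threshold secret sharing, so even the count is not visible to Apple below the threshold.

```swift
// Hypothetical illustration of the match-threshold logic only. Apple's real system
// stores encrypted "safety vouchers" and uses threshold secret sharing, so nothing
// is readable by Apple until the threshold is crossed; this plain counter just
// shows why a single stray match reveals nothing.
struct AccountMatchState {
    private(set) var matchCount = 0
    let reportingThreshold = 30   // Apple has publicly cited roughly 30 matches

    // Returns true only once the account has accumulated a "sizable collection"
    // of matches against the known-CSAM hash database.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= reportingThreshold
    }
}
```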
I think many are still missing the point: we don’t care what companies or law enforcement scan once we upload it to the internet; it’s putting it on our hardware that we object to. It’s just a deal breaker for some of us. Obviously it does not matter to many, as witnessed by the dozens of threads talking about it, but some of us will not accept it, period. Whether we can just cut off iCloud to feel safe or have to leave Apple altogether is an unanswered question, but it’s one many of us are watching. Yesterday I decided not to upgrade to iOS 15; this morning I discover they have already put the spyware on my phone.
 
Because you can disable the above features and you have the ability to use iCloud photos without it. This will be baked into iOS 15 and cannot be disabled unless you opt not to use the features that make the Apple ecosystem what it is and have been part of why people use Apple products.
This. Also, so many people are ignoring the opportunities for abuse here. Once this system is in place, any government can order Apple to scan for literally anything — a subversive poster or phrase, a person who is a political dissident… the opportunities are endless. Apple is turning iPhone into a mass surveillance tool under the classic, age-old excuse of “protecting the children.”
 
And what means do they have available to do that?
They just have to be ordered to report them when they see them, via a national security letter for instance. (Yes, I agree the US security services aren't going to request that.)
 
So you're saying they break the law? Because the law says they have to report it if they find it. Or are you claiming there's virtually no CSAM on iCloud? (Personally I find that about as plausible as Apple's privacy claims, i.e. not at all.)

Again, what ARE you on about? Why would Apple randomly choose to report some users with CSAM and not others? Does that make any logical sense to you? Instead of immediately assuming they're breaking the law and covering up illegal material they found, would not a more logical conclusion, for example, be that perhaps iCloud isn't as popular a platform as others for uploading such material?

And beyond that, you're totally missing my point. The fact is they ARE scanning images already (how else would they make the reports they did if they weren't?) and thus they already have the capability to search for other types of images if they wanted to. So is this German politician not wanting them to scan AT ALL - on the cloud OR on devices? If so, he's out to lunch. By uploading your content to the cloud, you're voluntarily surrendering a large degree of privacy, so people already have the choice NOT to do that. But if they continue to do so, the scanning process is now more private. Sounds like a win-win to me.
 
Yes, I agree with the sentiment, but if it is indeed already a legal requirement that all images stored on corporate servers must be scanned for CSAM, then perhaps this is the best way of going about it? I don't know for sure. I'm still against the whole idea in general, but I accept there may be more to it than I first gave it credit for. And reconsidering a position isn't the same as changing your mind 👍
Absolutely nothing wrong with having a rethink.
 


Manuel Höferlin, a member of the German parliament who serves as the chairman of the Digital Agenda committee in Germany, has penned a letter to Apple CEO Tim Cook, pleading with Apple to abandon its plan to scan iPhone users' photo libraries for CSAM (child sexual abuse material) images later this year.


In the two-page letter (via iFun), Höferlin said that he first applauds Apple's efforts to address the dangers posed by child sexual abuse and violence but notes that he believes Apple's approach to remedying the issue is not the right one. Höferlin continued to say that the approach Apple has chosen violates one of the "most important principles of the modern information society – secure and confidential communication."
Höferlin notably called Apple's CSAM approach "the biggest opening of the floodgates for communication confidentiality since the birth of the internet." The letter speaks out against Apple's plans to scan images in a user's iCloud Photo Library for CSAM by checking the hashes of images against a database of known child sexual abuse material.
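At its core, the check described above is a comparison of image-derived fingerprints against a list of known fingerprints. Here is a rough sketch of that idea, with assumed names and a plain SHA-256 lookup standing in for Apple's actual mechanism (which uses a perceptual NeuralHash and a blinded private set intersection protocol rather than a simple hash comparison):

```swift
import CryptoKit
import Foundation

// Illustrative only: Apple's system derives a perceptual "NeuralHash" on-device and
// matches it against a blinded database via private set intersection. This sketch
// substitutes a plain cryptographic hash lookup to show the general shape of
// fingerprint-versus-database matching. The entries below are placeholders.
let knownHashes: Set<String> = ["placeholder-hash-1", "placeholder-hash-2"]

func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)                 // fingerprint of the image bytes
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)                          // membership test against known hashes
}
```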

That feature is entirely different from another feature rolling out later this year, in which iOS will use on-device image analysis to detect possible sexually explicit images in the Messages app and ask users under the age of 13 if they wish to see the photo. While Höferlin referenced some legitimate concerns over CSAM scanning, he went on to say that the feature destroys "some of the trust users place in not having their communications secretly monitored." Neither CSAM scanning nor the Child Safety Features in Messages, however, are monitoring any communication.

Apple's senior vice president of software engineering, Craig Federighi, admitted in a recent interview that the conjoined announcement of CSAM detection and improved safety for children within the Messages app has caused confusion. Nonetheless, Höferlin continued in his letter by stating that while he wishes he could believe Apple's reassurance that it will not allow government interference into CSAM detection, he is unable to take the company by its word.
Höferlin concluded his letter by pleading with Cook for Apple to abandon its CSAM scanning plans and asking that the company stay on the side of a free and private internet.
Since its announcement earlier this month, Apple’s plans have received criticism, and in response, the company has continued its attempt to address concerns by publishing additional documents and an FAQ page. CSAM scanning and Child Safety Features within the Messages app are still on track to be released later this year.

Article Link: German Politician Asks Apple CEO Tim Cook to Abandon CSAM Scanning Plans
I think the people (politicians) who do not like this are people who like child sexual abuse material.
 
This. Also, so many people are ignoring the opportunities for abuse here. Once this system is in place, any government can order Apple to scan for literally anything — a subversive poster or phrase, a person who is a political dissident… the opportunities are endless. Apple is turning iPhone into a mass surveillance tool under the classic, age-old excuse of “protecting the children.”

How's that different if Apple implemented this on the server?

They just have to be ordered to report them when they see them, via a national security letter. (Yes, I agree that's not going to happen in this case.)

I'm talking about Apple, not the government. There are only two approaches: they would need to change this system completely and use a classification AI, or they would need to add a lot of non-CSAM material to the CSAM dataset. Either would be noticed within a couple of hours of a software update hitting.
I realise that's a concern, but if Apple or any other company were to comply with such a demand, they would do it on the server, away from people's prying eyes and away from any possible audit. Using this system to comply with some government overreach is much more risky for them.
 
I think many are still missing the point: we don’t care what companies or law enforcement scan once we upload it to the internet; it’s putting it on our hardware that we object to. It’s just a deal breaker for some of us. Obviously it does not matter to many, as witnessed by the dozens of threads talking about it, but some of us will not accept it, period. Whether we can just cut off iCloud to feel safe or have to leave Apple altogether is an unanswered question, but it’s one many of us are watching. Yesterday I decided not to upgrade to iOS 15; this morning I discover they have already put the spyware on my phone.

No, we simply think the "point" you're referring to is irrational. If you're this paranoid and consistent about it, then you're going to have to go offline completely. Good luck.

The reality is that by moving the scanning process to your phone, your privacy has been nothing but enhanced. Spyware is installed without your knowledge and is used for nefarious purposes. Neither of those things is true of iOS 15. The comparison is, frankly, absurd.
 
The only REAL solution is to open up the phone so people can run whatever software they like. Make the phone more like a Mac. Yes, Apple will argue about malware on the phone, but this is not a big problem with Macs. This could be done by law too.

Lol that's even worse for privacy.
 
Abiding by laws presumably includes abiding by laws in China, where it is insisted data be kept in China, and where, no doubt, if Apple did not comply it would not be on sale in China. These sorts of pressures can easily sway a company.

Is Apple supposed to fight the Chinese government? If not, why blame Apple?

I don't see how Apple is at fault here.
 
One of the biggest bait-and-switch moves in history


People who keep posting this simply prove they don't even understand the most basic thing about the CSAM detection on iOS 15. Let me try to simplify this for you:

iPhone <-- this is your iPhone
iCloud <-- this is not your iPhone (it's Apple's servers)

So if you have 1,000,000 CSAM images on your iPhone and don't enable iCloud for photos, then everything that happens on your iPhone does indeed stay on your iPhone. However, if you enable iCloud for photos, you are now moving things OFF your iPhone and ONTO Apple's servers (remember, Apple's servers are NOT your iPhone).
 
No, we simply think the "point" you're referring to is irrational. If you're this paranoid and consistent about it, then you're going to have to go offline completely. Good luck.

The reality is that by moving the scanning process to your phone, your privacy has been nothing but enhanced. Spyware is installed without your knowledge and is used for nefarious purposes. Neither of those things is true of iOS 15. The comparison is, frankly, absurd.
Everyone has to make their own decisions, you can choose to accept it, I choose not to
 
Yes they do. Apple still needs to have a source dataset to perform the match on the server. So adding non-CSAM hashes to the dataset and possible government overreach still apply. The only difference is that the server is a black box for security researchers -- which makes it easier to do any kind of surveillance.

Regarding the bold part: people who say this have no idea how software development works. The code implementation would need to be ported to other parts of the OS and implemented on the server side. Plus, they would need another source dataset, which would need to be discussed with other agencies. This is not a toggle they can flip in settings. This requires coordination between multiple teams. They would need to implement this basically from scratch.

And all that effort for what? For "surveillance software" that needs a dataset and performs a hash algorithm locally. It would only take a couple of hours between a software update and someone raising a red flag.
Yes, the part you’re ignoring is the part where Apple has to comply with a change in the law. That’s what the fella in the German parliament was concerned about.

What these governments know is that Apple has a file scanner built into the OS, and it can be set up to work with multiple databases in different countries. If they change the law, then Apple will comply, especially when its supply chain and sales are threatened.

Oh, and I’ve been a software developer for over 25 years. Code is a lot more modular than you think, these days. I reckon it would take less time to implement this change than to move all Chinese iCloud accounts to Chinese servers and hand over control to a Chinese cloud service company – which Apple did at China’s request.
 
Again, what ARE you on about? Why would Apple randomly choose to report some users with CSAM and not others? Does that make any logical sense to you? Instead of immediately assuming they're breaking the law and covering up illegal material they found, would not a more logical conclusion, for example, be that perhaps iCloud isn't as popular a platform as others for uploading such material?
Because they find that CSAM whilst looking for other things as part of a request from police for instance?
And beyond that, you're totally missing my point. The fact is they ARE scanning images already (how else would they make the reports they did if they weren't?)

Looking at the NCMEC figures, that's not the case.
and thus they already have the capability to search for other types of images if they wanted to. So is this German politician not wanting them to scan AT ALL - on the cloud OR on devices? If so, he's out to lunch. By uploading your content to the cloud, you're voluntarily surrendering a large degree of privacy, so people already have the choice NOT to do that. But if they continue to do so, the scanning process is now more private. Sounds like a win-win to me.
Why are you so eager to protect CSAM collectors who stashed their collection on iCloud?
 
Everyone has to make their own decisions, you can choose to accept it, I choose not to

That's stating the obvious. I'm simply saying I've seen no rational reasons for "choosing not to." They're all based on paranoid slippery-slope fallacies or conspiracy theories.
 
Might as well ask Cook to also remove all the cameras from the phone so that people can avoid being filmed during altercations (altercations that THEY themselves usually cause), with threats to put the footage on YouTube or Facebook, which they almost certainly do regardless. The video is subsequently shared many, many times. So much respect for people's privacy! Do Facebook and YouTube care about people's privacy? No, not while it's entertainment. Yet the source is somebody's phone, and it violates the rights of others.

Or maybe remove the ability to record conversations via Voice Memos, or third-party apps just in case a conversation is recorded secretly without other people knowing.
Are you truly incapable of seeing the difference between a person being filmed while out in a public place versus a person's private images being scanned on their phone?

This is an apples-to-orangutans comparison. Not even close.
 
Which Apple is releasing on a country-by-country basis to abide by local laws first.
I can see through your multiple posts that you are a strong supporter of the CSAM implementation. However, you are simplifying things. It is not true that you have control. Your images are scanned one way or another, as the hash library is built into iOS, and that is the core of the issue. We all (most of us…) understand that using any cloud service comes with a price. The thing is, it's not that many oppose Apple scanning the iCloud library for CSAM on their servers. We oppose the code on our iPhones, and that we find a privacy invasion.
 
No sh*t. Politicians and other notable groups need to stop donating to Apple as well. Hit them where it hurts.
 