I guess my issue with this is as follows, and someone feel free to point out how I am wrong, but won’t this just embolden hackers?

“Hi, I am hacker-x. I now have access to your iPhoto library. Unless you transfer $500 to this account in the next hour, I will be uploading kiddie fiddler images to your photo library that will get you arrested. If you don’t pay, have fun losing your livelihood, your freedom and all the money you will spend on attorneys fighting these charges. Oh, and have fun getting shived in prison because they treat child molesters great there!”

I know people have concerns about this being abused by an overly watchful government, and rightfully so, but the possible hacker aspect concerns me more.

I imagine Apple would be able to determine that the images had been uploaded from a device the user did not own. And if the hacker in question wants to add an additional charge by admitting to having child pornography in their possession, then he deserves the stupid prize as well.

Could it be done? Nothing is impossible in this world.

Would it be done? Again, I think back to all the fanciful doomsday scenarios being suggested here, and I suspect that if the government or some random hacker really wanted to do a person in, there are probably already far easier methods of doing so that don’t require them to jump through the various hoops that abusing the CSAM detection system entails.

I can’t help but think back to the announcement of the iPhone 5s and Touch ID and how people were going around claiming that robbers would be chopping off the fingers of 5s users to unlock their phones. Fast forward to today: I am not seeing this phenomenon happening, so it just feels like a whole lot of concern trolling to me.
 
Here’s a great podcast on the subject. René Ritchie is an Apple apologist, so he won’t say anything negative about them; I think it’s because he has contacts on the inside and doesn’t want to lose those. Georgia Dow, on the other hand, is a privacy advocate and explains what they are doing without any BS.

That was a great one.
 
Never said it was, or that I was worried about what info they would gain from me, I'm as boring as they come. It's the precedent. They said they were for privacy, yet this is the biggest privacy violation that can happen.

My reaction is that it’s the opposite.

Apple doesn’t scan iCloud (or at least scans it only minimally), Photos content is scanned on-device and that information never leaves your phone, and with this proposed CSAM detection, my photos are scanned only once for child pornography (and as non-invasively as possible), at the point of uploading to iCloud, and for nothing else. Plus, in the event of false positives, there are additional safeguards, so Apple has clearly thought this through.
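For what it’s worth, the threshold-gated matching flow being described can be sketched in a few lines. This is purely an illustrative toy, not Apple’s actual implementation: the real system uses a perceptual NeuralHash plus private set intersection and encrypted safety vouchers, none of which is reproduced here, and the `perceptual_hash` stand-in and function names are hypothetical. Apple did publicly state an initial match threshold of around 30.

```python
# Toy sketch of threshold-gated hash matching, loosely modeled on the
# publicly described CSAM detection flow. Illustrative only.

MATCH_THRESHOLD = 30  # Apple stated an initial threshold of roughly 30 matches


def perceptual_hash(image_bytes: bytes) -> int:
    # Stand-in for a perceptual hash. A real perceptual hash is robust to
    # resizing, cropping, and re-encoding; Python's built-in hash is not.
    return hash(image_bytes) & 0xFFFFFFFF


def count_matches(library: list[bytes], known_hashes: set[int]) -> int:
    # Count how many images in the upload queue match the known-hash database.
    return sum(1 for img in library if perceptual_hash(img) in known_hashes)


def should_flag(library: list[bytes], known_hashes: set[int]) -> bool:
    # Only once the count crosses the threshold would the matched "safety
    # vouchers" become decryptable and eligible for human review; a handful
    # of accidental matches below the threshold reveals nothing.
    return count_matches(library, known_hashes) >= MATCH_THRESHOLD
```

The point of the threshold is exactly the "additional safeguards" mentioned above: a single false positive cannot, by design, expose an account.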

This to me is far less invasive than what Facebook and Google are currently doing, but I acknowledge that for it to work, this entails a great amount of trust from users that Apple will use said technology only for its current intended purpose and not modify it for other uses further down the road.

Which brings me back to my initial point about trust. I currently still trust Apple in this regard, so I am not disturbed by their plans to implement said feature.

I am not going to lose sleep over a hypothetical that may or (more likely) may not happen in the future. If it does, then I will just deal with it as it happens.

If the trust isn’t there, then I don’t think there is anything Apple or anyone else for that matter can say to assuage your concerns, and it may be better to switch to an android phone for that peace of mind.

That’s just what I feel at least.
 
Most people in China use WeChat, which has implemented OCR in chat to extract all text from images, along with image filtering and reporting.

China doesn't need Apple at all.

Also, iCloud backup and iCloud Photo Library are available to Apple, and thus to law enforcement, in the US, Europe and China.

That is one thing that makes little sense to me. Messages is heavily used in the USA; in the rest of the world, not so much.
So if this is a globally affecting change, either they plan to expand this or they have another, very different reason for putting this functionality into play.
 
I am late to the game, but they scan photos on iCloud? Can I just turn that off and use something else to sync photos, like pCloud, Mega, or others?
 
Seems I missed an announcement. Where did Apple say they don't need those keys anymore?
Also: do you think the FBI and politicians are going to be happy if Apple says, "all the existing CSAM collections in iCloud are safe with us"?

Where did you get that idea?

Say I am a person with problem photos and I want them on iCloud.
Okay, so I turn Backup Photos to iCloud off. CSAM stuff is inactive.
Now I manually upload these to my iCloud account.
Apple is not currently scanning iCloud and has no plans going forward unless issued a warrant / subpoena / other.

So what exactly does this change?
 
Don’t you think, instead of publicly announcing an overly complicated, privacy-oriented, single-use scanning system, they would secretly just change the code for the more robust OS scanning systems already in place, like Spotlight? It would be far easier, far more effective and far more clandestine.
And far more destructive if caught. Apple thought they would be safe by hiding behind kids. They were wrong.
 
My reaction is that it’s the opposite.
I know, I've heard all of your side's arguments and read all the Apple info. That doesn't change things for me though. I'm not trying to convince you to do anything different, though I am trying to convince Apple.
Apple doesn’t scan iCloud (or at least scans it only minimally)
You actually don't know how much they scan iCloud, but anyway, as I've said before, scanning iCloud is okay by me, it's their servers and their responsibility. I assume iCloud has been scanned all along, just like Google, and Microsoft, and a host of other online players.

my photos are scanned only once for child pornography (and as non-invasively as possible), at the point of uploading to iCloud, and for nothing else. Plus, in the event of false positives, there are additional safeguards, so Apple has clearly thought this through.
I think you're forgetting that photos can end up on iCloud in more ways than from your Apple devices, and that ability to get there via other means, like web access, or Windows, or ..., means they're probably scanned on the server side as well, and probably on some schedule. At least I assume so, and I can't see how Apple could explain away bad stuff on their servers because they only scan once, on an Apple device!

But whatever, this kind of argument would just force me to the never scanning side, rather than allowing it on my device, just so they wouldn't scan it on theirs!

Which brings me back to my initial point about trust. I currently still trust Apple in this regard, so I am not disturbed by their plans to implement said feature.
I'm happy for you; it must be nice. I'm a pretty trusting person for most things, but when I feel betrayed, I don't forget, and it will be a very long time before I could trust Apple again, if ever. It wouldn't be quite so bad if half the things that came out of their mouth weren't about trust; that only makes the betrayal worse.

If the trust isn’t there, then I don’t think there is anything Apple or anyone else for that matter can say to assuage your concerns, and it may be better to switch to an android phone for that peace of mind.
You're absolutely correct, it's gone, and so is my spending on Apple products. And FWIW, I already have a Samsung Flip 3 on order, and I'm typing this from my Windows PC.
 
Apple is on a suicide drive and apparently doesn't give a damn about public opinion on this. It seems extremely foolhardy to me; what makes them think it is so absolutely essential to implement this PR blow of a feature? As I have stated so many times, the amount of actual good this will do is next to none, as the people creating and consuming CSAM are resourceful enough to simply bypass the new implementations. All this does is erode Apple's built-up image as the privacy champion of the big-tech world. I am bemused that they have not called off their plans for this. Perhaps this is the beginning of a scenario in which the Apple killer is Apple itself...
 
First CSAM, then Confederate flags. Next, blocking access to web sites that Tim Cook does not like.

A few comments for those of you who are for this:
1. This is all the same issue as the govt wanting a key to bypass encryption. All the same issues and the same conclusion: it cannot be done.
2. If you're for this, you're for periodic searches of your home by the police to make sure you're doing nothing wrong.
 
Does anyone actually trust any piece of software developed by Apple? Does any of it actually work as promised?

As far as the Android alternative, there are supposedly 'clean' versions of Android you can install to help with the privacy issues there (I have no experience with this).

PinePhone maybe? Not quite ready for prime time I think.
 
Where did you get that idea?

Say I am a person with problem photos and I want them on iCloud.
Okay, so I turn Backup Photos to iCloud off. CSAM stuff is inactive.
Now I manually upload these to my iCloud account.
Apple is not currently scanning iCloud and has no plans going forward unless issued a warrant / subpoena / other.

So what exactly does this change?
There’s a difference between scanning iCloud and scanning your iPhone. That is what changes. Yes, I understand Apple says that turning iCloud Photos off turns off the scanning feature. It doesn’t turn off the capability to scan your iPhone. For example, if you live in China and the Chinese government tells Apple, "we want to scan this guy's phone," then Apple will just turn this “feature” on to scan your device.
 
There’s a difference between scanning iCloud and scanning your iPhone. That is what changes. Yes, I understand Apple says that turning iCloud Photos off turns off the scanning feature. It doesn’t turn off the capability to scan your iPhone. For example, if you live in China and the Chinese government tells Apple, "we want to scan this guy's phone," then Apple will just turn this “feature” on to scan your device.

I understand that. With it being stupidly easy to bypass the scan for now, why even build this?
 
Just sold my iPhone 12 for £560 and got £72 back for my remaining AppleCare.

Selling all this **** actually made me realise how much of my time and money I had burned on a few conveniences and marketing.

Thanks Apple for pissing me off enough to jump through hoops to a better life.

What did you get?
 
I understand that. With it being stupidly easy to bypass the scan for now, why even build this?
Because governments, including the US government, have been pressuring Apple to do this. It’s not good when you have your subjects using a method of communication that’s very difficult for you to intercept. The less democratic the government, the worse it is.

Apple isn’t worried about financial loss because, other than a short-lived outrage, there’s nothing consumers will or can do. Apple knows this. There is no other privacy-focused alternative. And in a sad way I think it’s funny that all these people who were worshiping Apple like it was some sort of charity now realize it’s just a corporation designed to make money. I love Apple products, but Apple as a corporation and Tim as a CEO aren’t any different from Google or Facebook ethically.


If you haven’t seen this video, it’s a good watch. Just wait till René and Georgia finish a little bit of chit-chat; she straight up goes off on Apple. René is just sitting there looking like, oh crap, there goes my friendly relationship with Apple.
 
That is one thing that makes little sense to me. Messages is heavily used in the USA; in the rest of the world, not so much.
So if this is a globally affecting change, either they plan to expand this or they have another, very different reason for putting this functionality into play.
My guess is that Apple is doing this for iMessage because that's the messaging app that they own and control.

They are also reportedly going to offer the same API for other messaging services to adopt, but then the ball will be in their court.
 
or at least scans it only minimally

How does that work? Like quickly glancing over your 89,000 pics?

that may or (more likely) may not happen in the future. If it does, then I will just deal with it as it happens.

Well, that's exactly the point. You will deal with it as it happens? Really? I hope you do understand what this means in some countries, or in some future situations.
My whole point is not the child porn. My point in being so very, very strongly against it is that this can easily (or less easily) be turned into anything a government wants, or, for that matter, anything some idiots at Apple think they need to govern as well.
A government is called a "government" because it is supposed to govern.

Apple is a computer maker, and some other stuff.

Soon I will get a fine because I crossed a red light, because Apple's iPhone is connected to the traffic light system and Apple thinks it's a good idea to make sure nobody crosses red lights?

Especially Tim Cook, who is gay, should know very well, or at least realise, that gays are being killed in some countries for being gay.
They should realise people are being killed for listening to music - anyone seen images of Afghanistan lately?
They should realise that relatively more Jewish people were killed during WWII in the Netherlands than anywhere else because of very good administrative records. Yes, that was some years before the iPhone came out, but the idea is still the very same: information can kill people in the end.

Anyone who doesn't understand these issues is blind to what has happened in the world, what is happening in the world, and, regrettably, what is going to happen in the world. History repeats itself, as we know.

We have to be very careful with information.
That's why privacy is so important.

This idea simply kills all the effort they have made in the last ten years of working towards a more private phone.

I will not buy an iPhone anymore if Apple does not take this back.
 
Currently, although they are able to, Apple only scans iCloud on request of law enforcement.

Honestly, this most likely isn’t the case. Apple has had iCloud scanning in its privacy policy since May 2019. They must either do CSAM scanning as part of their normal routine procedure or have law enforcement present a search warrant. If law enforcement merely asked for a CSAM scan, Apple would have conducted a warrantless search. Fourth Amendment protections apply to searches conducted by private parties who act as “agents” of a government. How do you know if a private citizen or company acts as a government agent? The legal definitions and tests vary by court, but generally, a court will consider the degree of control that the government exercised over the private party’s search and whether the private party had an independent reason, unrelated to law enforcement, to conduct the search (such as a business justification).

This issue is arising increasingly in criminal cases in which online service providers turn over information that the government ultimately seeks to use as evidence of a crime. If a service provider is found to have conducted a warrantless search as a government agent, the criminal defendant may be able to prevent the court from considering not only the evidence that the service provider gave to the government, but any subsequent discoveries due to that initial evidence.
 
I imagine Apple would be able to determine that the images had been uploaded from a device the user did not own. And if the hacker in question wants to add an additional charge by admitting to having child pornography in their possession, then he deserves the stupid prize as well.

Could it be done? Nothing is impossible in this world.

Would it be done? Again, I think back to all the fanciful doomsday scenarios being suggested here, and I suspect that if the government or some random hacker really wanted to do a person in, there are probably already far easier methods of doing so that don’t require them to jump through the various hoops that abusing the CSAM detection system entails.
I don't think it's fanciful at all. Surveillance states like the U.S. are always looking for more advanced ways to conduct mass surveillance more easily and efficiently. Apple is serving it to them on a golden platter. Of course they will use and abuse this as quickly as possible.
I can’t help but think back to the announcement of the iPhone 5s and Touch ID and how people were going around claiming that robbers would be chopping off the fingers of 5s users to unlock their phones. Fast forward to today: I am not seeing this phenomenon happening, so it just feels like a whole lot of concern trolling to me.
On the contrary, with the courts ruling that law enforcement can compel your biometric password (i.e. force your finger onto the Touch ID sensor), it is absolutely a reality that your finger (or face) can be used involuntarily to access your phone.
 