The rate of error does not concern me. I don't want them going through my photos, or any other data for that matter.
if you use iCloud for photos or device backups, they already are going through your photos.

if you don't use iCloud at all, you're in the minority, and Apple has to make the tough choice of catering to the majority of their customers.
 
It is harmful for two reasons: it will give people a false belief that their data is private, plus it will mean we’ll have to start all over again campaigning for E2EE without embedded spyware in the client (which was hard enough when it was Facebook doing it, and they dropped the privacy claims rather than fixed the issues).

People already have the false belief that their iCloud photos are private, which they are currently not. This system doesn't change that assuming the user is still using iCloud photos.


Apple made a big point about ensuring that data would never leave the device on which it was created, except as part of an encrypted backup, which is why neither the ML models nor the match locations are shared between iOS and MacOS Photos.

I was responding to the quote: "under no circumstances should it be preemptively scanned."

This is a circumstance in which it is preemptively scanned.
 
When someone builds a nuclear warhead, however it works doesn’t matter. It kills people.
Apple creates this advanced scanning tool. However fancy it is doesn’t matter. What matters is that it SCANS private photos.

I doubt it, based on your responses.

Do you know the meaning of the word “proactive”? Do you know what a “proactive approach” means? Do you know what “Pandora’s box” means? Why is it that on every recent (2020 to 2021) Facebook-related post there is always a group of people saying “who uses Facebook nowadays” or just bashing Facebook one way or another?
“Proactive” means “acting based on history and a reasonable expectation of the consequences”. History shows privacy and safety are a hard trade-off: more privacy means less safety, and vice versa. Machine-learning-based tools have proven their potential to surpass humans at the tasks they are designed to excel at, such as chess. Apple is playing the dangerous game of letting machine learning protect child safety online, which means privacy will be sacrificed along the way. For Apple as a “privacy advocate”, such a move is bound to receive lots of criticism, and it certainly did over the past week.
Then, based on past observation of the power of machine learning and the trade-off between privacy and safety, people reasonably assume Apple’s move to bring machine learning into this can have untold consequences. So they voice their concern and address its potential to be misused. Sure, this machine learning tool can become extremely good at finding CSAM as time goes on, but it can be just as good at finding other images, since that’s what it’s designed to do. What will stop Apple from changing policies to allow other countries to search for their own images of concern? When we talk about trusting Apple, what we trust is the people who run Apple not to put the tool into the hands of bad guys. Do you trust a bunch of strangers thousands of miles away, who will probably never meet you in person, to care about your privacy and safety at the same time? If you do, then good for you, because we here don’t.
Remember we’re talking about hashes here, not photos. It’s not quite right to say it scans photos: in a sense it does, but it really doesn’t. It’s a bit complicated, but this approach was chosen because of the privacy it affords along with the accuracy it can deliver.
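To make the hash-versus-photo distinction concrete, here is a minimal sketch of the idea; the hash values, threshold, and function names below are all hypothetical, illustrative only, and not Apple's actual NeuralHash/PSI protocol:

```python
# Illustrative sketch only -- not Apple's actual NeuralHash/PSI system.
# The idea: the device derives a hash per photo and checks it against a
# database of known hashes; photo content is never inspected directly.

# Hypothetical database of known-bad hashes.
KNOWN_HASHES = {"a3f1c2", "9be407", "77d0aa"}

# Hypothetical threshold: nothing is flagged below this many matches.
THRESHOLD = 2

def count_matches(photo_hashes, known_hashes):
    """Count how many of the photo hashes appear in the known database."""
    return sum(1 for h in photo_hashes if h in known_hashes)

def is_flagged(photo_hashes, known_hashes, threshold=THRESHOLD):
    """An account is flagged only once the match count reaches the threshold."""
    return count_matches(photo_hashes, known_hashes) >= threshold
```

The point of the design is that a single coincidental match does nothing; only accumulating enough matches to cross the threshold triggers any further step.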

Your nuclear warhead comparison is a poor one. If anything, a nuclear warhead is incredibly effective at doing exactly what it's been designed and told to do.

This feature they are introducing doesn’t kill people, nor does it have fallout or attack people indiscriminately. In fact, the only way it will go off IS if you are doing something bad. The issue here is that you don’t have trust. I can respect that. I think that is really where the core of the discussion is for many.

Privacy is not a trade-off for safety in every case. When I keep something private, let’s say my SSN, I’m also safe from malicious attacks. What we have in this situation is an attempt to find a very narrow path where privacy is maintained for everyone except people who are harboring these illegal images.

I don’t have a crystal ball, so I can’t predict how this could be corrupted. I have tried, and I have listened to many of the arguments people have posted. Most are pure speculation and fear-mongering. That doesn’t mean a way doesn’t exist, politically or technically. Politically I feel pretty safe, as Apple has been very public about not bending to those demands. Technically, I’m sure there will be bad actors that will try. However, I also know there are ethical hackers, and I believe Apple will address those issues.

All of this is because I trust them. I’m not saying you have to. I think you should voice your concerns. Ask the hard questions. Make good points. Hell, even bring up historical comparisons, as unfair as some have made them. Not saying you have. Since Apple seems like they are going through with it no matter what, hold them accountable. You don’t have to use their products. It’s people like you that, if Apple is being honest, make sure that it doesn’t get out of line.
 
Just why do you even think this is relevant to the discussion? Spotlight is for our benefit and doesn't report to the government if it finds something it thinks is objectionable. And FWIW, I don't use Spotlight either, but I don't bother disabling it; I simply don't care that it's there, as it's not useful to me.
The quote I was responding to was "any scanning or review of our private photos is a total invasion of privacy."

I'm showing how that statement is false. There is some scanning that is *not* a total invasion of privacy. On device scanning is one.
 
Nope. It wouldn't bother me, and in fact, I think they will have to anyway, even with the backdoor in iOS now -- as iOS isn't the only way to store things in iCloud, any old browser can do it.

Anecdotal.

And guess what, I'm going to wager that the majority of iPhone customers complaining about privacy are going to continue using iCloud for their photos, where their entire photo libraries can be viewed by a human unknown to the owner of the account. For as long as that statement is true, asking for privacy is moot, considering they'll continue to use this "public" iCloud.

And guess what, iCloud is Apple's, it's public, and they pay for the scanning -- it would be more efficient power-wise as well.

Customers ultimately pay for scanning as costs get passed to them.
Power efficiency is debatable. We're talking about thousands of servers in several data centers running 24/7 whether there are photos to scan or not. Server-side scanning likely consumes more energy than the overhead of using the user's device to scan.
 
Apple's statements about the odds of erroneous flagging are meaningless without independent verification. Since the details of the technology are proprietary, that's impossible to do. They should open-source the scanning/flagging code so independent experts can validate it.

They did have experts validate it. They don't mention specifically that it's 1 in a trillion, but one expert confirms: "Harmless users should experience minimal to no loss of privacy, because visual derivatives are revealed only if there are enough matches to CSAM pictures, and only for the images that match known CSAM pictures. The accuracy of the matching system, combined with the threshold, makes it very unlikely that pictures that are not known CSAM pictures will be revealed."
 
That "1 in a trillion" figure is just PR bafflegab unless it's independently verified.

The threshold can be modified by one line of code to reach 1 in a trillion. What would be a realistic reason for Apple to lie about it? Do you really think Apple prefers to have accounts falsely flagged left and right?
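For intuition on why a threshold matters so much, here is a rough back-of-envelope sketch (all numbers are hypothetical, not Apple's published parameters): if each image independently false-matches with some small probability, the chance of an account accumulating enough false matches to cross the threshold collapses very quickly as the threshold rises.

```python
# Illustrative math only: how a match threshold suppresses false positives.
# The per-image rate and threshold used below are hypothetical numbers.

def prob_false_flag(n_photos, per_image_fp, threshold):
    """P(at least `threshold` false matches) across a library of n_photos,
    modeled as Binomial(n_photos, per_image_fp), summed term by term."""
    term = (1.0 - per_image_fp) ** n_photos  # P(exactly 0 matches)
    total = 0.0
    for k in range(1, n_photos + 1):
        # Recurrence: P(k) = P(k-1) * (n-k+1)/k * p/(1-p)
        term *= (n_photos - k + 1) / k * per_image_fp / (1.0 - per_image_fp)
        if k >= threshold:
            total += term
        if term < 1e-320:  # remaining terms are negligible
            break
    return total

# With a hypothetical 1-in-a-million per-image rate, a 10,000-photo library,
# and a threshold of 10 matches, a false flag is astronomically unlikely.
p = prob_false_flag(10_000, 1e-6, 10)
```

Under these toy numbers the result is far below 1 in a trillion, which is the sense in which raising or lowering a single threshold constant swings the combined false-positive odds by many orders of magnitude.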

Seems like people assume the worst and treat everything as "us vs them".
 
Exactly. It incentivizes the creation of *new* child porn and the exploitation of more children in order to produce images whose hashes are not in the current database. It makes the problem *worse* and hurts *more* children. Why would anyone want to do that????


That's ridiculous.

So instead of turning off iCloud (which would turn off CSAM detection), they're going through the risky route of courting new children to have sex with just to have the convenience of iCloud sync?
 
Increasing demand for new abuse content doesn't make sense to me either.

It doesn't. People would just turn off iCloud (which would turn off CSAM detection) instead of risking prison time by having sex with new children just to have more content.

Again, you're making ZERO sense here.
 
A bit funny how someone types /fin in this thread yet keeps coming back to continue arguing, only to say bye again.

Going to say bye to that particular person 👋 (but I remain open to conversing with those who can actually have a proper discussion).
 
It doesn't. People would just turn off iCloud (which would turn off CSAM detection) instead of risking prison time by having sex with new children just to have more content.

Again, you're making ZERO sense here.
It does. If you make only new content safe to have, then more kids need to be abused to produce more content.
 
No.

They've been scanning iCloud photos since 2019 https://www.imore.com/apple-says-its-scanning-photos-uploaded-icloud-weed-out-child-abusers

So whatever supposed "increase in demand" already took place a couple of years ago.

It's more likely those people turned off iCloud, in which case this new CSAM detection changes absolutely nothing for them.
According to you, they must not be making any changes at all, because that's the only way demand wouldn't go up.

Otherwise, you are thinking about it as a PUTT'M IN JAIL / CAPT'L PUNISHMENT ROX approach and not thinking about the abuse industry.
 
The quote I was responding to was "any scanning or review of our private photos is a total invasion of privacy."

I'm showing how that statement is false. There is some scanning that is *not* a total invasion of privacy. On device scanning is one.
I totally disagree -- it's the exact opposite. If it's only for my benefit and doesn't get reported, it's okay. If it's not for my benefit, like the CSAM scanning, and it scans my files, it's not okay. That's the whole argument, basically, and why I ask why you equate photo scanning and CSAM scanning -- one is for my benefit, one is for the government's benefit and on my dime.
 
And guess what, I'm going to wager that the majority of iPhone customers complaining about privacy are going to continue using iCloud for their photos, where their entire photo libraries can be viewed by a human unknown to the owner of the account. For as long as that statement is true, asking for privacy is moot, considering they'll continue to use this "public" iCloud.
I don't care what the majority do, never did -- the majority don't even watch the news. What is right and wrong is what's important to me.
 
Customers ultimately pay for scanning as costs get passed to them.
Power efficiency is debatable. We're talking about thousands of servers in several data centers running 24/7 whether there are photos to scan or not. Server-side scanning likely consumes more energy than the overhead of using the user's device to scan.
That cost gets passed to me when I buy a new phone in 2 or 3 years, and that's fine -- they have a right to set that price and I have a choice whether I pay it or not. As for server-side scanning costing more, no way in heck that's true (or haven't you ever worked on the server side?).
 
I've bolded what I'm responding to...

Good point. The reason for staying within the Apple ecosystem for many in spite of these moves by Apple boil down to three...

1) 10+ years of indoctrination that "Google is evil". This is so ingrained in people that they believe that Android is a free and unfettered pipeline of personal information to every nefarious corner of the internet. Even if Apple ends up where Google currently is, they are still not Google and so... Apple is better. (The same happened with Microsoft)

2) They don't mind these things when Apple does them. This is the flipside of #1. "Apple has a conscience" is similarly ingrained in people's minds. This results in a default assumption of positive motives when Apple does a thing. Even if that action is the same as what other companies do, the fact that it is Apple causes a different response.

3) A heavy investment in the Apple ecosystem. This has been a long time in the making. Computers, smartphones, tablets, smartwatches, speakers, TV set-top boxes, digital media, software, and more. There are many who are so financially locked-in to Apple's ecosystem that it is prohibitive for them to get out of it.

. . .

I would add one more to this list. The Apple hardware and software I currently have does what I need it to. Sometimes I'm tempted by Android or Windows. Maybe they'd do what I need them to as well. Maybe they'd do it better. But I'd need to purchase new hardware, research and find alternative software, and learn how to use it, and that would take time. In the end it might work better, but it might work worse. And I have limited time. And so since what I have now works well enough, while I'm tempted, I hesitate to try something else. If I knew someone who was already doing what I do on a different platform and I could copy their setup, I might give it a try.
 
That's ridiculous.

So instead of turning off iCloud (which would turn off CSAM detection), they're going through the risky route of courting new children to have sex with just to have the convenience of iCloud sync?
There are so many things wrong with your statement I honestly don't know where to begin.
 
According to you, they must not be making any changes at all, because that's the only way demand wouldn't go up.

Otherwise, you are thinking about it as a PUTT'M IN JAIL / CAPT'L PUNISHMENT ROX approach and not thinking about the abuse industry.

Your comment made no sense.

For those who turned off iCloud two years ago, this new system doesn't change anything because IT ONLY SCANS WHEN ICLOUD IS TURNED ON.

If they didn't turn off iCloud, then your supposed "increase in demand of filming new child porn" already took place, and it's actually more difficult to catch predators through on-device CSAM detection, because there's a high threshold that must be met before anyone is alerted, as opposed to all-you-can-scan server-side iCloud scanning.
 
There are so many things wrong with your statement I honestly don't know where to begin.

don't bother. we're done. you're deliberately avoiding the fact that it's easier for people to turn off iCloud and continue viewing child porn the old-fashioned way, without repercussions, as opposed to filming new content.

have a good one.
 
if it's only for my benefit and doesn't get reported, it's okay.

that would be a scan that is not a "total invasion of privacy" which counters the statement i responded to: "any scanning or review of our private photos is a total invasion of privacy."
 
That cost gets passed to me when I buy a new phone in 2 or 3 years, and that's fine -- they have a right to set that price and I have a choice whether I pay it or not. As for server-side scanning costing more, no way in heck that's true (or haven't you ever worked on the server side?).

I'm a full stack engineer and have set up image processing pipelines on colocated servers as well as on AWS. I am without a doubt 100% sure it costs more.

Scanning every single photo uploaded by over a billion customers across many data centers, as opposed to having a billion CPUs, already paid for and managed by users, do it? It's an easy answer. Server-side scanning is more costly.
 
See my post here: https://forums.macrumors.com/thread...eatures-later-this-year.2307130/post-30168363

Still not 100% sold on this, but my perspective is shifting a bit.


Alright, I believe it's my fault that I have contributed to the overuse of analogies, to the point where we've all made them useless by trying to fix and adjust the analogies to suit our arguments. Analogies are meant to simplify the situation, but instead we've all made them complex and they no longer fit the argument.

I was going to try and introduce another analogy to the "police station" analogy like a bill counter/counterfeit detector, but I can see how that'll add some confusion.

Let's just leave it at that, but glad to see someone properly discussing this as opposed to some of the other people I've talked to on here. Cheers.
 