Monitor web sites and infiltrate sex offender networks online, which is what governments already do.
And they’re GROSSLY and sadly behind the curve. How often do you read about a major sting or takedown vs. just a single arrest?

That’s the issue. How often do authorities (not governments, unless it’s international) make single arrests? Not often at all. They wait for a major bust, then the big news, then a major increase in budget.

All the while, far too many youths are continually abused; if they survive long enough, then they may be freed by said major sting.

The best camera and video camera is the one in your pocket, proven by how many pictures are uploaded across the entire web and file sharing/storage services (check the EXIF data). And the best TV or viewing source for any video is the one in front of you most of the time: again, the smartphone. We’ve already seen this trend with youth on social media platforms, prime playgrounds for sickos to lure non-street-smart kids.

I’m willing to bet more pedophiles are caught wanking off in public than there are arrests for child sexual abuse/trafficking on a weekly, monthly, or yearly basis.

So come up with a better solution rather than restating what’s already in place; that was my original challenge to those complaining and weighing this against their own privacy vs. the safety of the children in their society. “It’s not my problem” or “the government has a solution” isn’t a great answer.

Personally: I registered my son and stepdaughter before each turned 6 in case they were ever abducted (not likely, but I can’t be with them every minute of the day). I’ve gone to neighborhood watch meetings and petitioned for lights in nearby and large parks. My place is a safe haven for kids in need, with access to a phone whenever they need it.

I grew up near a VERY large park in Toronto, ON, where (after 40 years) there are STILL no lights, and a close friend’s sister was raped or molested on 3 different occasions somewhere in that park! The why and how don’t matter; it’s still horrible and wrong.

So what are YOU doing to solve or prevent the issue?!
 
Only if you have multiple wrongly flagged photos.
Basically impossible.
One isn’t enough.
Just why do you think that threshold won't become 1 when they never hit a match with the current threshold (whatever that threshold is; we don't know it, for a reason!)? The government is going to scream at them for not doing anything, and Apple will knuckle under. Having it there on the phone almost ensures that will happen.
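For the sake of argument, here's a minimal sketch (in Python) of what threshold-based flagging boils down to. The constant is made up, since Apple hasn't published the real value, and the real system uses threshold secret sharing rather than a plain counter:

```python
# Hypothetical sketch of threshold-based flagging -- the constant below is
# invented; Apple has not disclosed the real threshold, and the production
# system uses threshold secret sharing, not a simple counter.
MATCH_THRESHOLD = 30  # hypothetical value

def account_flagged_for_review(match_count: int) -> bool:
    """A single hash match does nothing; human review is only triggered
    once the number of matched images reaches the threshold."""
    return match_count >= MATCH_THRESHOLD

print(account_flagged_for_review(1))   # False: one match isn't enough
print(account_flagged_for_review(30))  # True: threshold reached
# The worry voiced above: nothing technical prevents MATCH_THRESHOLD
# from being lowered to 1 in a future update.
```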
 
The irony of this is that in order to claim they can do this while not accessing your photos, they've essentially created a way for others (and themselves, if they needed) to access and eventually, if need be, see things that were formerly protected by E2E. In the process of saying, "Hey, we can do this without seeing your encrypted stuff," they've weakened one of the primary benefits of E2E.
The problem with this "logic" that has been repeated over and over in this thread and others is that people think that adding this method of analyzing photos somehow provides a backdoor for them or others to search photos.

It already existed...this is not some new tech. There are many other, easier ways for anyone to get info off of someone's phone, including rogue apps, clickbait in messages, etc.

Apple controls it since it is part of iOS, not an app. They could have always added this in...why would they suddenly use this to do something nefarious?

Governments may or may not be able to force Apple to do their bidding (depending on the country)...how has that changed with this?

Hackers can fairly easily get into an individual user's iCloud account to see everything...why create some complicated hack that alters an on-device analysis to see what they want to see when they can look at it all once it is in iCloud? How could they get it off the device if the user doesn't use iCloud?

People watch too much TV and movies where someone hacks into a phone and the info is magically transmitted to the hacker. Even if that can be done, it isn't easy, and this on-phone analysis doesn't help.
 
Just why do you think that threshold won't become 1 when they never hit a match with the current threshold (whatever that threshold is; we don't know it, for a reason!)? The government is going to scream at them for not doing anything, and Apple will knuckle under. Having it there on the phone almost ensures that will happen.

Why? Because I don’t think in terms of “stuff could happen” memes, buzzwords and simplifications like a sloppy slippery sloper.
 
Google does not install spyware on your device that sifts through your files. Everybody understands that if you choose to upload data to the cloud (i.e. someone else's computer) without first encrypting it, they can access it and have the right to keep illegal material off of their servers.

Yes, they have.

Google has deployed to Android phones a service which "scans" and analyzes every URL used by the system. Not only URLs you type in your browser, but every URL that every app uses. And some of them are, under certain conditions, sent to Google.

Google has also deployed a tool which scans part of the file system on Android and deletes certain files under certain conditions, probably reporting back to Google what it did.

How about anti-virus software? Built into Windows, it scans every local file.

Yet, very few people care.
 
So, how are AAPL stonks doing these last 2 days?

Have they already fallen to zero?

Or is it that most investors couldn’t care less about the FUD-drama?
 
So, how are AAPL stonks doing these last 2 days?

Have they already fallen to zero?

Or is it that most investors couldn’t care less about the FUD-drama?
“Surprisingly,” this hasn’t broken through to the major news networks in the US. It’s like no one outside of Apple forums either knows or cares about this… 🤣
 
While that authority is on the phone...

"So...about this tool you have to scan users' data on their phone and look for matches against a database...."

Have you considered that Apple already has a much better tool for this? The Photos app!

If a government wants Apple to scan for "pictures showing activity A" or "pictures containing objects of type B," then the CSAM detection system is poorly suited for the job. The Photos app, on the other hand, would excel at this.

So what's stopping someone from forcing Apple to use the Photos app?
 
You don't seem to get it. If a threshold is reached, a person at Apple will review the images. Moreover, how is any perceptual algorithm going to classify an image as child porn without assessing the amount of skin exposed as a feature?

The system doesn't classify at all. It just compares images, via advanced hashing, to material which has already been determined to be CSAM.

The algorithm used cannot detect CSAM, nudity, or anything at all.
 
Why? Because I don’t think in terms of “stuff could happen” memes, buzzwords and simplifications like a sloppy slippery sloper.
So you never plan ahead for different eventualities? That's VERY different from me, and by your definition it's no slippery slope -- it's planning ahead! You don't do backups of your data/OSes? How about maintenance for your car? This is no different.
 
Apple is selling out to government overreach (not just in the US; I see this quickly spreading to places like China and Russia and incorporating other types of material), and it's been painfully obvious since the San Bernardino shooting/attack that they don't know how to navigate these waters very well.

Also, this method of "protecting" us does a complete end run around any end-to-end encryption, as the check happens on the phone after decryption has occurred. And of course if someone were to write a utility to re-encrypt or otherwise foil this invasion of privacy, they would kick it from the App Store. Anyone seeing Epic's side of the story yet? Because this is one of the things it addresses, even if unintentionally.
 
Have you considered that Apple already has a much better tool for this? The Photos app!

If a government wants Apple to scan for "pictures showing activity A" or "pictures containing objects of type B," then the CSAM detection system is poorly suited for the job. The Photos app, on the other hand, would excel at this.

So what's stopping someone from forcing Apple to use the Photos app?
Nothing stops that...but that's not as sexy as some rogue hacker adding hashes to flag photos of rainbows that are somehow magically pulled from the phone through the private network run by billionaires... :rolleyes:
 
So you never plan ahead for different eventualities? That's VERY different from me, and by your definition it's no slippery slope -- it's planning ahead! You don't do backups of your data/OSes? How about maintenance for your car? This is no different.
I plan ahead based on the probability of different possible outcomes, not based on “anything could happen”.
Bringing a fridge on a trip to the North Pole is not planning ahead.
 
Google has deployed to Android phones a service which "scans" and analyzes every URL used by the system. Not only URLs you type in your browser, but every URL that every app uses. And some of them are, under certain conditions, sent to Google.
URLs are public, even mine.
Google has also deployed a tool which scans part of the file system on Android and deletes certain files under certain conditions, probably reporting back to Google what it did.
But not my photos. And the data it deletes is all OS and cache stuff, and I imagine it does report back if you have it set to report performance statistics. (I don't.)

How about anti-virus software? Built into Windows, it scans every local file.
Yep, and it doesn't report what it finds to the government either, and the AV company hasn't said they'd turn you in if they found something.

Yet, very few people care.
You're right, I don't care about this stuff; it's no threat. When Google starts scanning my data on-device and reporting to the government if it finds anything, I'll complain just as much about them as I am about Apple. And since I have an Android phone too, I have standing to complain (and worry!).
 
I plan ahead based on the probability of different possible outcomes, not based on “anything could happen”.
Bringing a fridge on a trip to the North Pole is not planning ahead.
Same as me, except this is one of those things that needs planning ahead.
 
You still don't get it...what a surprise, since I've only said it 10 times and am repeating exactly what Apple says... The "one in a trillion" odds do NOT apply to a single picture match.

Winning the lottery (Powerball, Mega Millions, etc.) one time is about a 1 in 257 million chance.

Do you know what the odds are of winning the lottery twice? (Hint: it's NOT 1 in 500 million.)

EDIT: An even more apples-to-apples comparison...what are the odds of winning multiple different lotteries in the same week?


Having one hashed picture match a single hashed picture in the database is not one in a trillion...but multiple different pictures on your phone happening to match multiple different pictures in the database when they are NOT true matches?? One in a trillion seems like pretty realistic odds.
I don't think you've thought your response through. It sounds like you are treating false positives as statistically independent events, as though the contents of a person's photos are randomly related to each other. Yet people often take pictures in a series, so if one picture in a series matches as a false positive, the other pictures in the series have a high chance of doing so as well. And again, the odds of a false positive grow with the number of pictures*, even if the pictures are unrelated. So I have no idea how Apple has come up with this one-in-a-trillion figure.

*To be precise, if the pictures are statistically independent: P(at least one false positive) = 1 - (1 - p)^n, where p is the false-positive probability per picture and n is the number of pictures scanned. Perhaps Apple feels the false-positive probability per picture is so close to 0 that the number of pictures doesn't matter. Or perhaps Apple is assuming statistical independence of pictures rather than using real statistics derived from people's photo libraries.
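A quick sketch of that footnote's arithmetic in Python; both numbers are hypothetical, since Apple has published neither a per-picture false-positive rate nor assumed library sizes:

```python
# Chance of at least one false positive across a whole photo library,
# assuming each picture is an independent trial (the footnote's formula).
# Both numbers below are hypothetical -- Apple has published neither.
p_per_picture = 1e-9   # hypothetical per-picture false-positive rate
n_pictures = 20_000    # hypothetical library size

p_at_least_one = 1 - (1 - p_per_picture) ** n_pictures
print(f"{p_at_least_one:.2e}")  # ~2.00e-05 -- small, but it grows with n

# If pictures in a burst or series are correlated, one false positive makes
# others more likely, so the independence assumption is optimistic.
```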
 
The problem with this "logic" that has been repeated over and over in this thread and others is that people think that adding this method of analyzing photos somehow provides a backdoor for them or others to search photos.

I don't think that at all. But it does provide a path that was not previously there. It's a path to "nothing" - I get that - but it's still a path that...
  • didn't exist before
  • is there without my permission
  • is being used to detect something that may or may not be there
  • and that "something" has the potential of being passed to a human for review and then the authorities
  • and did I mention without my permission?
I 110% understand that the likelihood of anything going wrong with this is next to nothing if someone doesn't have any CSAM on their device. But that's not the point.

It already existed...this is not some new tech. There are many other, easier ways for anyone to get info off of someone's phone, including rogue apps, clickbait in messages, etc.

It's a (mostly) new methodology, though.

Apple controls it since it is part of iOS, not an app. They could have always added this in...why would they suddenly use this to do something nefarious?

Governments may or may not be able to force Apple to do their bidding (depending on the country)...how has that changed with this?

I agree. I'm not worried about Apple doing something nefarious. And I know full well that governments can play hardball and Apple will likely have to appease them (which should actually make this implementation more concerning, no?).


My main point was this - if I'm a criminal, it's causing me to pause and say, "maybe the encryption wasn't as much protection as I thought it was." If I'm a privacy-minded (obsessed? ;)) individual, and I use Apple for that reason alone, it's causing me to rethink things a bit as well.

That could be the fault of a general public misunderstanding or thought process that having E2E encryption means your data is bulletproof... who knows. But people who previously thought, "Hey nobody can see this stuff, it's protected by E2E encryption!" are now saying, "well wait, so even though they still can't see it, there IS a way to access it and potentially unwrap it to see it? I didn't think that was possible?"
 
Consider that maybe you're not the target. Review the Pegasus platform and understand that law enforcement can already root your phone with a text. If they want to create a reason to get a warrant, a way to trigger an NCMEC report is a very nice tool to have.

If they can already root your phone with a text message and gain access to everything on the device, why do they need a search warrant to get the data? Can't they just take all the data when they gain access in the first place?

Seems very convoluted.
 
And they’re GROSSLY and sadly behind the curve. [...] A close friend’s sister was raped or molested on 3 different occasions somewhere in that park! [...] So what are YOU doing to solve or prevent the issue?!
First, the process apparently only matches known examples of content already deemed illegal, so it won't stop new material from being generated on somebody's iPhone. As I said in my first post, I appreciate Apple's sentiment. However, this is a search without probable cause, and therefore I think this idea lacks balance, to put it mildly. We can stop the vast majority of all sexual assaults by locking up all men. We can protect the rights of all men by making sure they are never locked up. The point is to strike a balance between these two extremes. I believe Apple has got the balance way wrong: they should have either probable cause or my permission to scan private photographs; I believe Apple are not properly estimating the likelihood of false positives; I am uncomfortable with the human review they propose after a picture matches a hash; and I know that their system could be altered just ever-so-slightly for purposes far less noble than detecting child porn. If an autocratic government gets hold of this, you can kiss goodbye to any political opposition and say hello to all sorts of systematised oppression.

It's simple: the pictures I take are mine (indeed, in the UK the taker of a picture has an inalienable copyright). They are none of Apple's business unless they first suspect a crime or that I have otherwise violated their T&Cs. Apple should have probable cause before they do anything so invasive as scanning my photos. Seriously, what's next? Scanning text? Tracking web traffic? Monitoring audio calls? Law enforcement is already doing that, and we don't need Apple piling on.

I am sorry to hear about your friend's sister. My family has been affected by sexual abuse, but nothing Apple proposes to do would have changed that. Better law enforcement would. If Apple wants to help, let it donate funds to law enforcement.
 
The system doesn't classify at all. It just compares images, via advanced hashing, to material which has already been determined to be CSAM.

The algorithm used cannot detect CSAM, nudity, or anything at all.
If the algorithm uses exact matching, then it will lead to an arms race between pedophiles editing images and Apple adding new templates until the system bogs down and becomes untenable. If the system uses approximate matching, then the hashing process will represent perceptual features of the image in summarised form, however implicitly. The use of a threshold suggests that this is basically a template-matching system that will use perceptual similarity.
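For illustration, here's a toy approximate-matching scheme in Python: an average hash compared by Hamming distance. This is a generic example of perceptual hashing, not Apple's NeuralHash (whose internals aren't public); it just shows why approximate matching necessarily encodes perceptual features:

```python
# Toy perceptual hashing: an 8x8 "average hash" compared by Hamming
# distance. Generic illustration only -- NOT Apple's NeuralHash.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size grayscale; each bit records whether a pixel
    is at or above the mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

MAX_DISTANCE = 5  # illustrative threshold for "close enough"

def is_match(path_a: str, path_b: str) -> bool:
    # Small edits (crop, recompress, recolor) flip a few bits, not all of
    # them, so a distance threshold still matches.
    return hamming(average_hash(path_a), average_hash(path_b)) <= MAX_DISTANCE
```

By contrast, exact matching (say, comparing SHA-256 digests) breaks on a single changed pixel, which is exactly the arms race described above.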
 
If they can already root your phone with a text message and gain access to everything on the device, why do they need a search warrant to get the data? Can't they just take all the data when they gain access in the first place?

Seems very convoluted.
Law enforcement can't *legally* use Pegasus without a warrant in place. But bad actors are not bound by that. The risk vector is not getting data off the phone; it is placing data on the phone that will trigger detection and referral to law enforcement. It's fine if you don't think that's a risk and don't see any issues with having your data locally scanned.
 
I don't think that at all. But it does provide a path that was not previously there. [...] But people who previously thought, "Hey nobody can see this stuff, it's protected by E2E encryption!" are now saying, "well wait, so even though they still can't see it, there IS a way to access it and potentially unwrap it to see it? I didn't think that was possible?"
Again...the path did exist and was always there...just used in a different way. Apple is taking that path and using it for a different purpose so they can report illegal images while keeping the privacy of the innocent safe. People may have a hard time with that, but considering what they have done and currently do with my data on my phone, this addition is the least of my worries.
 
Again...the path did exist and was always there...just used in a different way. Apple is taking that path and using it for a different purpose so they can report illegal images while keeping the privacy of the innocent safe. People may have a hard time with that, but considering what they have done and currently do with my data on my phone, this addition is the least of my worries.

They do a lot with data on our phones already, but most of those things have a toggle to opt out. And most of those things aren't built with the intention of catching bad guys. I believe the intentionality is the fundamental difference. It's Apple dipping its toes into a role that the authorities would normally have.

On the "path", I think we're disagreeing on a semantic issue. And I guess it ultimately boils down to your last sentence and where people fall in that regard. Ultimately I do trust Apple is trying to do the right thing with this - some might call that naive. I just don't like that they're choosing to use that path in a different way (to use your verbiage), without me signing off on it, and with the potential of that path allowing a "reach out" from my device without me knowing it.
 
Also, after going back and forth on this for the past few days, my brain keeps coming back to the idea that this might not help at all. It's essentially an opt-out "feature" for those who do keep CSAM on their devices. Turn off iCloud Photos, and you have nothing to worry about. So I don't fully understand the heaps of praise that the NCMEC, Ashton Kutcher, etc., are placing on Apple. Honestly, I'm a little surprised they haven't seen more "it's not enough" type messages.
 