
Mac4Mat

Suspended
May 12, 2021
168
466
Already did. But from the other end: as the images circulate and proliferate across the web, they get added to the database. The biggest distributors of child porn are kids themselves. The content they entertain each other with is adults' problem to live in the world with, as we try to “protect” them from their own sexuality. Ridiculous. Hopefully this backfires in an appropriately extreme way.
The argument that Apple even has access to the full database is erroneous. They would not be allowed full access, as that in itself would be a breach of privacy. They certainly don't have access to UK data on the matter. It's firefighting after the event for a really poor decision.
 

laff

macrumors member
Aug 1, 2009
34
8
Apple should fully disclose how they are “hashing” these images, what constitutes a “match”, and the score or threshold at which they hand your information over to some “nonprofit organization” that will then build a criminal complaint against you.

If Apple isn’t lying here (i.e., they never see the actual images on your phone), then they are creating a “fingerprint” of sorts and comparing the print of your image to a print of known child pornography. Is the system really going to be so unsophisticated that it only detects 1:1 matches for previously identified material? If so, then they need to be crystal clear about that. Or… is this system going to be ML-based and detect certain body positions, parts, postures, exposed skin, facial expressions, etc?

I am highly skeptical this is going to be a binary flag for each image (good/bad). Your entire library is going to be scored and, based on that score, it will be handed over to law enforcement who will use that score as gospel to secure a search warrant.
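
For illustration, here is a minimal sketch of the kind of threshold-based hash comparison I'm describing - the hash size, the distance metric, and the cutoff are all assumptions on my part, not Apple's actual NeuralHash pipeline or its real matching rule:

```swift
// Minimal sketch, assuming a 64-bit perceptual hash and a small bit-distance
// cutoff. These names and numbers are illustrative guesses, not Apple's
// actual NeuralHash design.

func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

/// Fuzzy match: true if the image's hash is within `maxBitsApart` bits of any
/// hash in the known database. An exact 1:1 match is just the special case
/// where maxBitsApart == 0.
func matchesKnownDatabase(imageHash: UInt64,
                          knownHashes: [UInt64],
                          maxBitsApart: Int = 4) -> Bool {
    knownHashes.contains { hammingDistance($0, imageHash) <= maxBitsApart }
}
```

Whether the real system is closer to that kind of fuzzy hash lookup or to a full ML classifier scoring body positions and exposed skin is exactly the detail Apple needs to spell out.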



Let me make this point (as unpolitically as possible): at least in the USA, cultural norms and morals have been changing very rapidly in the last 10-20 years, and technology has played a large role in that. What was once cherished is now looked down upon; what was once reprehensible is now embraced as sacred and unquestionable. Right and wrong are evolving so quickly that a socially-accepted, non-serious tweet from 10-15 years ago will cost you future opportunities. There is no room for social or historical context, only bitter argument: you wrote that with yesterday’s standards and I’m judging it with today’s standards. So what happens when our technological overlords change their standards? How can I know that my thoughts, attitudes, and behaviors today will be acceptable to a future unknown standard?

I’ll leave you with this, and I tell it to my children every time they use the computer:
The internet never forgets, and it seldom forgives.
You are correct. Now think if someone sends you a forwarded picture of a sunrise, but with the hashes they are looking for in the search added... you and everyone with that picture are screwed. Things are not black and white.
 

triton100

macrumors 6502a
Dec 15, 2010
780
1,311
The moon
I have no idea what these hashes are or how this works, but it feels gross thinking that my new OS may contain data from a database of child porn to scan against. Just weird.
 

BigMcGuire

Cancelled
Jan 10, 2012
9,832
14,025
But that's just it. I don't have a big problem using Gmail or Google Photos. I know Google scans them. But there are 2 major differences:

#1 Apple is doing it on my phone. If this were only on their servers, I could “trust” the scope. But now Apple is writing loopholes within the OS. It breaks all the trust I have in the security of the underlying OS if they are willing to do that.

#2. It is hypocritical. Google is an ad company. I expect them to do ad company things. Apple touts that privacy is a human right. I bought in. Then they do the same thing Google is doing.

Understood, I do not like the idea of my photos being scanned locally - despite this only happening if I have iCloud Photos checked.

As for #2 - Google sells data for ads - sells your behavior and usage patterns and anything else they can monetize. Let's be clear here, Apple isn't doing that - they aren't doing the same thing Google is doing. So I agree with #1 but have a problem with #2 - I see the point you're trying to make.

The key word is “cloud”. Again. I have no issue with Apple looking on server side. I have huge issue with device scan.
Right. Offloading the scanning to users' devices - that's going to upset a lot of people.

Edit: @hagar gives a good explanation below of why the scanning is local - so data remains encrypted on the servers. But most people aren't going to understand that.
 

hagar

macrumors 68000
Jan 19, 2008
1,976
4,951
Apple has very good reason to target accounts with massive CSAM content. Obviously they're the ones that should be stopped. Owning one single photo can be a fluke, an accident, hardly an offence you can be convicted for. Typically these guys have massive libraries.

Enough with this already. Apple, STOP trying to brainwash us every day now. Apple, STOP feeding us with the bull. I have a compromise for you. Since you did not even bother to ask what the consumer thinks about this CSAM feature and you are bringing it this fall...

At this point, I am willing to pay $99 per year to keep my privacy to myself. It's a win/win. Go ahead and start charging consumers to keep their privacy to themselves. We are talking billions of dollars in profit.

$99 Per Year (No iCloud Photo/Messages Scanning, Keep Your Data to Yourself and 100% Privacy)

That’s iPhone.



I will wait for your response...
Stop being outraged about something you don't understand. Apple has found a way to ban CSAM content from their servers while leaving your files encrypted on their servers. That's a massive achievement. Because they want your data to remain your data. Nothing is leaving your device because they do CLIENT-SIDE verifications.
 
Last edited:

hagar

macrumors 68000
Jan 19, 2008
1,976
4,951
Understood, I do not like the idea of my photos being scanned locally - despite this only happening if I have iCloud Photos checked.

As for #2 - Google sells data for ads - sells your behavior and usage patterns and anything else they can monetize. Let's be clear here, Apple isn't doing that - they aren't doing the same thing Google is doing. So I agree with #1 but have a problem with #2 - I see the point you're trying to make. The idea is that there isn't much effort required to do more.


Right. Offloading the scanning to users' devices - does that really save them that much server processing and money? What was the motivation for device scanning vs. iCloud Photos scanning server-side? How did they not see that this would upset so many people? Shocking. Agreed.
They do it client-side so no data leaves your device and photos can remain encrypted on the servers. Because they wanted a privacy-friendly solution. It has nothing to do with the cost of server-side scanning.
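
To make the ordering concrete, here's a toy sketch of that flow - the hash type, the "voucher" byte, and the stand-in XOR "encryption" are all my own illustrative assumptions, not Apple's real protocol (which uses private set intersection and a match threshold):

```swift
import Foundation

// Toy sketch: the match check runs on the device, the photo is encrypted
// before upload, and the server only ever receives ciphertext plus an opaque
// marker. Everything here is simplified for illustration.

struct UploadBundle {
    let encryptedPhoto: Data   // server cannot read this without the user's key
    let voucher: Data          // opaque match evidence, useless on its own
}

// Placeholder "encryption" (XOR) just to keep the sketch self-contained;
// a real client would use proper authenticated encryption via CryptoKit.
func toyEncrypt(_ data: Data, key: UInt8) -> Data {
    Data(data.map { $0 ^ key })
}

func prepareUpload(photo: Data,
                   photoHash: UInt64,
                   knownHashes: Set<UInt64>,
                   userKey: UInt8) -> UploadBundle {
    // 1. The comparison against the known-hash database happens on the device.
    let matched = knownHashes.contains(photoHash)

    // 2. The photo is encrypted before it ever leaves the device.
    let encryptedPhoto = toyEncrypt(photo, key: userKey)

    // 3. Only ciphertext plus a marker go up; nothing readable has to sit on
    //    the server for the check to have happened.
    let voucherByte: UInt8 = matched ? 1 : 0
    return UploadBundle(encryptedPhoto: encryptedPhoto,
                        voucher: Data([voucherByte]))
}
```

The whole point is that step 1 happens before step 2 - the check doesn't require a readable copy of your photo on Apple's servers.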
 
  • Like
Reactions: BigMcGuire

4jasontv

Suspended
Jul 31, 2011
6,272
7,548
Stop being outraged about something you don't understand. Apple has found a way to ban CSAM content from their servers while leaving your files encrypted on their servers. That's a massive achievement. Because they want your data to remain your data. Nothing is leaving your device because they do CLIENT-SIDE verifications.
This is a contradiction. You cannot know what something is and also not know it. It can't be both my private data and searchable. It's not a massive achievement; it's hand-waving and side talk.
 
Apple has very good reason to target accounts with massive CSAM content. Obviously they're the ones that should be stopped. Owning one single photo can be a fluke, an accident, hardly an offence you can be convicted for. Typically these guys have massive libraries.


Stop being outraged about something you don't understand. Apple has found a way to ban CSAM content from their servers while leaving your files encrypted on their servers. That's a massive achievement. Because they want your data to remain your data. Nothing is leaving your device because they do CLIENT-SIDE verifications.
You wouldn't be talking like that if Apple got hold of your wife's pictures.

Dude, I'm trying to save you. Don't fall for it. Don't get yourself PLAYED.
 

ian87w

macrumors G3
Feb 22, 2020
8,704
12,636
Indonesia
They are scanning for metadata of known pictures, not adult pornography etc. Again, how about working on stopping those making the media? So they arrest people that had it - sure, that's bad - but it won't stop it.

Those that view it may get caught, but we can question why they have it all day. Does it lead them to make it? Per the DOJ, recidivism is under 4%. Can we instead use the time and money to focus on stopping the abusers? Because if there is no media made, then there is no media to be had.
Thus it begs the question when something is done "for the children" without actually addressing the root of the problem.

As you pointed out, the real solution is to go after the real abusers.

But now the system is (will be) in place: a system where Apple, a private company, can collaborate with another party with a black-box database to scan for hashes that Apple codes into iOS. With Apple's own privacy head having the mindset of "well, don't do anything illegal," I'm sure some countries will want to have a discussion with Apple to "collaborate" on scanning for "illegal" materials, I mean hashes.
 

rafark

macrumors 68000
Sep 1, 2017
1,743
2,937
OK but this doesn’t change anything.
People who are overly paranoid about this feature are still going to be overly paranoid about this feature.
But I think the funniest thing is, the place where I've seen the most paranoia about this feature is Facebook.
If you use Facebook, this feature shouldn't even slightly concern you, because your privacy is already gone.
But when you post on Facebook you know you're sending the data to a third-party server. In this case it's on your own device, AFAIK. That's awful.
 
  • Like
Reactions: hagar and Mydel

ececlv

macrumors regular
Sep 26, 2014
130
386
Understood, I do not like the idea of my photos being scanned locally - despite this only happening if I have iCloud Photos checked.

As for #2 - Google sells data for ads - sells your behavior and usage patterns and anything else they can monetize. Let's be clear here, Apple isn't doing that - they aren't doing the same thing Google is doing. So I agree with #1 but have a problem with #2 - I see the point you're trying to make.


Right. Offloading the scanning to users' devices - that's going to upset a lot of people.

Edit: @hagar gives a good explanation below of why the scanning is local - so data remains encrypted on the servers. But most people aren't going to understand that.
#2. Google does not say privacy is a human right. Apple does. Apple is proposing to violate privacy. It is hypocritical.
 

4jasontv

Suspended
Jul 31, 2011
6,272
7,548
As you pointed out, the real solution is to go after the real abusers.
History has taught us, over and over and over, that this means going after those that profit from the industry. Target the distributors, not the customers. People who feel they need this content need help. The people exploiting them are the ones that need to be in prison.

Apple would have made a much more compelling argument if they had stated that the content wouldn't be handed over to the police, but rather used to identify creators and distributors.
 
  • Like
Reactions: lifeinhd and ian87w

sashavegas

macrumors regular
Jul 11, 2018
114
80
Even if Apple introduced a new feature where every iPhone owner must dial a special number at 9 am every morning and say
"Hello Apple, here I am (Last name, First name), let me use the phone," everybody would still be using it, thinking of it as a small obstacle to a great technology. As of today, Apple can do whatever they want with their user base, knowing that nobody will stop using it.
 

ian87w

macrumors G3
Feb 22, 2020
8,704
12,636
Indonesia
You are correct. Now think if someone sends you a forwarded picture of a sunrise, but with the hashes they are looking for in the search added... you and everyone with that picture are screwed. Things are not black and white.
Thus usually a system like this is balanced with transparency for audits and public scrutiny, and a proper appeal process.

The problem here is, it's totally opaque, a complete black box. And it's not comforting when a black-box system is used to judge the morality of others, no matter how noble the initial cause is.
 

Mydel

macrumors 6502a
Apr 8, 2006
804
664
Sometimes here mostly there
Right. Offloading the scanning to users' devices - that's going to upset a lot of people.

Edit: @hagar gives a good explanation below of why the scanning is local - so data remains encrypted on the servers. But most people aren't going to understand that.
Remember that Apple holds the keys to the encryption. That's why, when served a warrant, they provide law enforcement with the whole iCloud library decrypted. So I doubt it's an issue…
 
  • Like
Reactions: BigMcGuire

BigMcGuire

Cancelled
Jan 10, 2012
9,832
14,025
You wouldn't be talking like that if Apple got hold of your wife's pictures.

Dude, I'm trying to save you. Don't fall for it. Don't get yourself PLAYED.
There's some layer of trust in everything, right? You trust your car not to fall apart while you drive it; you trust your (insert computer used here) not to constantly take photos of you while you use it and distribute them to everyone you know.

We have to trust that Apple is only comparing hashes to previously identified CP (child-****) images and sending those up for review. We have to.

If we don't agree to it, simple - turn iCloud Photos off. End of story, right? (According to what I'm reading, turning off iCloud Photos disables device photo scanning.)
 

BigMcGuire

Cancelled
Jan 10, 2012
9,832
14,025
Remember that Apple holds the keys to the encryption. That's why, when served a warrant, they provide law enforcement with the whole iCloud library decrypted. So I doubt it's an issue…
Right, it involves trust. Agreed.

It isn't true encryption unless you're the only one that knows the key.
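
For what it's worth, here's a minimal sketch of that distinction, assuming CryptoKit and a key that lives only on the device - if the provider also keeps a copy of the key (which is exactly what the quoted post describes), the ciphertext is only as private as their policy:

```swift
import Foundation
import CryptoKit

// "True" encryption in this sense: the key is generated locally and never
// uploaded, so only this device can decrypt. This is a sketch of the concept,
// not how iCloud actually manages keys.

let deviceOnlyKey = SymmetricKey(size: .bits256)   // never leaves the device here

func encryptForUpload(_ plaintext: Data) throws -> Data {
    // AES-GCM ciphertext; without deviceOnlyKey the server just stores opaque bytes.
    try AES.GCM.seal(plaintext, using: deviceOnlyKey).combined!
}

func decryptLocally(_ ciphertext: Data) throws -> Data {
    let box = try AES.GCM.SealedBox(combined: ciphertext)
    return try AES.GCM.open(box, using: deviceOnlyKey)
}
```

The moment a second copy of that key sits on someone else's servers, "encrypted" stops meaning that only you can read it.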
 
  • Like
Reactions: Philip_S