> I paid for my cloud storage. It may be their hardware, but it's my space.

The saying goes, "if it's free, you are the product." Well, if we're paying for iCloud storage, it's not free. So I agree with you.
Already did. But from the other end: as the images circulate and proliferate across the web, they get added to the database. The biggest distributors of child porn are kids themselves. The content they entertain each other with is the adults' problem to live in the world with, as we try to "protect" them from their own sexuality. Ridiculous. Hopefully this backfires in an appropriately extreme way.
> Already did. But from the other end: as the images circulate and proliferate across the web, they get added to the database. The biggest distributors of child porn are kids themselves. The content they entertain each other with is the adults' problem to live in the world with, as we try to "protect" them from their own sexuality. Ridiculous. Hopefully this backfires in an appropriately extreme way.

The argument that Apple even has access to the full database is erroneous. They would not be allowed full access, as that in itself would be a breach of privacy. They certainly don't have access to UK data on the matter. It's firefighting after the event for a really poor decision.
> Please let me know of a real example. What is "abusing data"? I am not saying you are wrong. Just provide an example from the real world.

Germany, circa 1942.
> Apple should fully disclose how they are "hashing" these images, what constitutes a "match", and what the score or threshold is at which point they hand your information over to some "nonprofit organization" who will then build a criminal complaint against you.
>
> If Apple isn't lying here (i.e., they never see the actual images on your phone), then they are creating a "fingerprint" of sorts and comparing the print of your image to a print of known child pornography. Is the system really going to be so unsophisticated that it only detects 1:1 matches for previously identified material? If so, they need to be crystal clear about that. Or is this system going to be ML-based and detect certain body positions, parts, postures, exposed skin, facial expressions, etc.?
>
> I am highly skeptical that this is going to be a binary flag for each image (good/bad). Your entire library is going to be scored and, based on that score, handed over to law enforcement, who will use that score as gospel to secure a search warrant.
>
> Let me make this point (as apolitically as possible): at least in the USA, cultural norms and morals have changed very rapidly in the last 10-20 years, and technology has played a large role in that. What was once cherished is now looked down upon; what was once reprehensible is now embraced as sacred and unquestionable. Right and wrong are evolving so quickly that a socially accepted, non-serious tweet from 10-15 years ago will cost you future opportunities. There is no room for social or historical context, only bitter argument: you wrote that with yesterday's standards and I'm judging it with today's standards. So what happens when our technological overlords change their standards? How can I know that my thoughts, attitudes, and behaviors today will be acceptable to some unknown future standard?
>
> I'll leave you with this, and I tell it to my children every time they use the computer: the internet never forgets, and it seldom forgives.

You are correct. Now think: if someone sends you a forwarded picture of a sunrise that has been doctored to match the hashes they are searching for, you and everyone who has that picture are screwed. Things are not black and white.
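For what it's worth, the "fingerprint" being debated above is typically a perceptual hash: the image is reduced to a short bit string, and two images "match" when the Hamming distance between their bit strings falls under a threshold, so recompressed or slightly edited copies still match without anyone comparing actual pixels. A minimal sketch of that idea (an 8x8 average hash with an arbitrary threshold of 10 - an illustration only, not Apple's NeuralHash):

```python
# Illustrative perceptual hashing - NOT Apple's NeuralHash.
# Each image collapses to a 64-bit fingerprint; a "match" is a small
# Hamming distance, so near-duplicates (recompressed, brightened)
# still match even though exact pixels are never compared.

def average_hash(pixels):
    """pixels: 64 grayscale values, i.e. an image downscaled to 8x8."""
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches(h1, h2, threshold=10):
    return hamming(h1, h2) <= threshold

original     = [10 * i % 256 for i in range(64)]         # stand-in "photo"
recompressed = [min(255, p + 3) for p in original]       # brightened copy
unrelated    = [(255 - 7 * i) % 256 for i in range(64)]  # different image

h = average_hash(original)
print(matches(h, average_hash(recompressed)))  # True  - still matches
print(matches(h, average_hash(unrelated)))     # False - far apart
```

The open question in the post above is exactly where the real threshold sits and how the real hash behaves: a looser threshold catches more variants of known material and more innocent collisions at the same time.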
But that's just it. I don't have a big problem using Gmail or Google Photos. I know Google scans them. But there are two major differences:

#1: Apple is doing it on my phone. If this were only on their server, I could "trust" the scope. But now Apple is writing loopholes into the OS. It breaks all the trust I have in the security of the underlying OS if they are willing to do that.

#2: It is hypocritical. Google is an ad company; I expect them to do ad-company things. Apple touts that privacy is a human right. I bought in. Then they do the same thing Google is doing.
> The key word is "cloud". Again: I have no issue with Apple scanning on the server side. I have a huge issue with an on-device scan.

Right. Offloading the scanning to users' devices - that's going to upset a lot of people.
> Enough with this already. Apple, STOP trying to brainwash us every day now. Apple, STOP feeding us the bull. I have a compromise for you. Since you did not even bother to ask what consumers think about CSAM scanning, and you are bringing the CSAM feature this fall... At this point, I am willing to pay $99 per year to keep my privacy to myself. It's a win/win. Go ahead and start charging consumers to keep their privacy to themselves. We are talking billions of dollars in profit.
>
> $99 Per Year (No iCloud Photo/Messages Scanning, Keep Your Data to Yourself and 100% Privacy)
>
> That's iPhone.
>
> I will wait for your response...

Stop being outraged about something you don't understand. Apple has found a way to ban CSAM content from their servers while leaving your files encrypted on their servers. That's a massive achievement, because they want your data to remain your data. Nothing is leaving your device, because they do CLIENT-SIDE verifications.
> Understood, I do not like the idea of my photos being scanned locally - despite this only happening if I have iCloud Photos checked.
>
> As for #2 - Google sells data for ads: your behavior, your usage patterns, and anything else they can monetize. Let's be clear here: Apple isn't doing that; they aren't doing the same thing Google is doing. So I agree with #1 but have a problem with #2 - though I see the point you're trying to make. The idea is that not much more effort is required to do more.
>
> Right. Offloading the scanning to users' devices - does that really save them that much server processing and money? What was the motivation for device scanning vs. server-side iCloud Photos scanning? How did they not see that this would upset so many people? Shocking. Agreed.

They do it client-side so no data leaves your device and photos can remain encrypted in the cloud, because they wanted a privacy-friendly solution. It has nothing to do with the cost of server-side scanning.
> I paid for my cloud storage. It may be their hardware, but it's my space.

You would think so, but in reality nothing in the cloud is private. That is why I never use cloud services.
> Stop being outraged about something you don't understand. Apple has found a way to ban CSAM content from their servers while leaving your files encrypted on their servers. That's a massive achievement, because they want your data to remain your data. Nothing is leaving your device, because they do CLIENT-SIDE verifications.

This is a contradiction. You cannot both know what something is and not know. It can't be both my private data and searchable. It's not a massive achievement; it's hand-waving and side talk.
> Apple has very good reason to target accounts with massive amounts of CSAM content. Obviously they're the ones that should be stopped. Owning one single photo can be a fluke, an accident, hardly an offence you can be convicted of. Typically these guys have massive libraries.

You wouldn't be talking like that if Apple got hold of your wife's pictures.
Stop being outraged about something you don't understand. Apple has found a way to ban CSAM content from their servers while leaving your files encrypted on their servers. That's a massive achievement, because they want your data to remain your data. Nothing is leaving your device, because they do CLIENT-SIDE verifications.
> They are scanning for metadata of known pictures, not adult pornography, etc. Again, how about working on stopping those who make the media? Arresting people who had it - sure, bad - but it won't stop it. Those who view it may get caught, and we can question why they have it all day long, but does viewing lead them to make it? Per the DOJ, recidivism is under 4%. Can we instead use the time and money to focus on stopping the abusers? Because if no media is made, then there is no media to be had.

Thus it begs the question of what is accomplished when something is done "for the children" without actually addressing the root of the problem.
> OK, but this doesn't change anything.
>
> People who are overly paranoid about this feature are still going to be overly paranoid about this feature.
>
> But I think the funniest thing is, the place where I've seen the most paranoia about this feature is Facebook. If you use Facebook, this feature shouldn't even slightly concern you, because your privacy is already gone.

But when you post on Facebook, you know you're sending the data to a third-party server. In this case it's on your own device, AFAIK. That's awful.
> Understood, I do not like the idea of my photos being scanned locally - despite this only happening if I have iCloud Photos checked.
>
> As for #2 - Google sells data for ads: your behavior, your usage patterns, and anything else they can monetize. Let's be clear here: Apple isn't doing that; they aren't doing the same thing Google is doing. So I agree with #1 but have a problem with #2 - though I see the point you're trying to make.
>
> Right. Offloading the scanning to users' devices - that's going to upset a lot of people.
>
> Edit: @hagar goes into a good explanation below as to why the scanning is local - so data remains encrypted on the servers. But most people aren't going to understand that.

#2: Google does not say privacy is a human right. Apple does. Apple is proposing to violate privacy. It is hypocritical.
> #2: Google does not say privacy is a human right. Apple does. Apple is proposing to violate privacy. It is hypocritical.

Apple has been saying that to gain consumers' trust.
> As you pointed out, the real solution is to go after the real abusers.

History has taught us, over and over and over, that this means going after those who profit from the industry. Target the distributors, not the customers. People who feel they need this content need help. The people exploiting them are the ones who need to be in prison.
> You are correct. Now think: if someone sends you a forwarded picture of a sunrise that has been doctored to match the hashes they are searching for, you and everyone who has that picture are screwed. Things are not black and white.

Thus a system like this is usually balanced with transparency for audits and public scrutiny, and a proper appeal process.
> Right. Offloading the scanning to users' devices - that's going to upset a lot of people.
>
> Edit: @hagar goes into a good explanation below as to why the scanning is local - so data remains encrypted on the servers. But most people aren't going to understand that.

Remember that Apple holds the keys to iCloud encryption. That's why, when served a warrant, they provide law enforcement with the entire iCloud library decrypted. So I doubt it's an issue...
> You wouldn't be talking like that if Apple got hold of your wife's pictures. Dude, I'm trying to save you. Don't fall for it. Don't get yourself PLAYED.

There's some layer of trust in everything, right? You trust your car not to fall apart while you drive it; you trust your (insert computer used here) not to constantly take photos of you while you use it and distribute them to everyone you know.
> Remember that Apple holds the keys to iCloud encryption. That's why, when served a warrant, they provide law enforcement with the entire iCloud library decrypted. So I doubt it's an issue...

Right, it involves trust. Agreed.