You have proof of this?


You think? What would be a more reasonable number in your estimation?
I think over time the proof will be out there. Many CEOs are actually against this.


 
That’s a dumb take. Locally, files and photos were already indexed, sliced, diced, and searched for faces, cats, dogs, trees, and sunsets. That wasn’t principle-offending? That wasn’t a slippery slope? That couldn’t be pressured by Xi?

Basically, to be coherent you should ask them to:
- ban any file indexing
- ban any hashing, don’t take fingerprints of my stuff for any purpose because then you could be pressured by dictators to do more with that!
- ban any metadata
- ban any AI deep analysis to look for cats and dogs in my pics

Basically people have no clue about what OSes already do locally once they’re logged in.

Apple should be commended for doing as much as possible of this stuff locally.
Well, I don't use any cloud services precisely so I'm not subjected to slicing and dicing. And I would like to avoid that on my device as well. I understand that using "Apple servers" comes with strings attached (same for Google, Microsoft, etc.), but I would like my device to be "snooping free". That is my main point here. Apart from that, I do not believe it will have any measurable impact on child pornography; it only creates tools for further privacy invasion.
 
I think over time the proof will be out there. Many CEOs are actually against this.



Shock as CEOs jump on this to help push their own platforms.
 
I am privacy minded to the point of paranoia, and even I don't have a problem with step 1 of this process (scanning files uploaded to the cloud against a CSAM database). I trust Apple to do the right thing here, because Apple's incentives are aligned with doing the right thing.

However, I know from experience this is just step one.

Once the technology is implemented and people have accepted it, we'll move on to the same tech scanning files on your phone or laptop to check for materials that Apple has the copyright to, like if you have a web rip of Space Force on your phone. The way the technology is set up, it's easy to expand the database to include not just CSAM hashes, but also hashes of documents exposing governments, web rips of popular movies or TV shows, or memes critical of the local government.
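To make the point concrete, here's a toy sketch (placeholder hashes, nothing to do with Apple's actual code) of why the scope lives entirely in the database, not in the matching code:

```python
# Toy sketch, not Apple's code: a hash-set matcher has no idea what
# its database "means". Swap the database and the exact same code
# hunts for movies or memes instead of CSAM.
from typing import Set

def find_matches(local_hashes: Set[str], database: Set[str]) -> Set[str]:
    """Return the device hashes that also appear in the database."""
    return local_hashes & database

csam_db = {"hashA", "hashB"}                 # placeholder fingerprints
expanded_db = csam_db | {"hashSpaceForce"}   # one line widens the dragnet

device = {"hashB", "hashSpaceForce", "hashHoliday"}
print(find_matches(device, csam_db))      # {'hashB'}
print(find_matches(device, expanded_db))  # {'hashB', 'hashSpaceForce'} (order may vary)
```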
 
I think over time the proof will be out there. Many CEOs are actually against this.



Every CEO mentioned has a previous grudge against Apple. Hah.

Every CEO mentioned already does way worse with your personal data.
 
Apple fans are hilarious. First they bash the competition because their products spy on them, have no privacy protection, etc… Now they defend Apple because “the competition is also doing it”.

I miss the good ol' days when we argued about market share... or the "premium" phone market versus cheap burner phones... or how many years a platform provides software updates... or Android tablets thrown in a drawer.

You know... the classics.

:p
 
I think over time the proof will be out there. Many CEOs are actually against this.


Again, you think.

A lot of ifs, buts, and thinks about all this...
 
The way to remediate this would be to have NCMEC release the list of hashes for open audit and review; it’s not like you can reconstruct the images from the hashes. If people found that non-CSAM images were being added to the database, it would be uncovered very quickly. The biggest issue with this system is that the hash list is presumed to contain only illegal images, yet it is controlled by government-funded entities. Apple’s response to this is that they check images before reporting to law enforcement, and that response isn’t good enough.
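To illustrate what such an open audit could look like, here's a rough sketch; the file names are hypothetical, and SHA-256 stands in for the perceptual hash (a real audit would need the same NeuralHash function Apple uses):

```python
# Hypothetical audit sketch: hash a corpus of known-benign images and
# look for collisions with the published list. SHA-256 is only a
# stand-in; a real audit would use the actual perceptual hash.
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

# assumed file layout, for illustration only
published_list = set(Path("ncmec_hashes.txt").read_text().split())

for image in sorted(Path("benign_corpus").glob("*.jpg")):
    if fingerprint(image) in published_list:
        print(f"Non-CSAM image present in the database: {image.name}")
```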
 
I miss the good ol' days when we argued about market share... or the "premium phone" market... or how many years a platform provides software updates.

You know... the classics.

:p

Prediction… Apple has one of its best iPhone sales years ever, and iOS 15 adoption hits the typical 90+% in mere months, despite them “invading my privacy”.
 
Interesting that the entire database of CSAM hashes will be stored on every iPhone. I get that it's just text. But how much space would we estimate this to require? Any guesstimates?
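Rough guess only. Apple's technical summary describes the NeuralHash output as 96 bits; the database size below (~1 million entries) is my assumption, not a published figure:

```python
# Back-of-envelope estimate, not an official number.
hash_bits = 96          # NeuralHash output size per Apple's technical summary
entries = 1_000_000     # assumed database size, for illustration only

raw_mb = (hash_bits // 8) * entries / 1_000_000
print(f"~{raw_mb:.0f} MB raw")  # ~12 MB

# Even allowing generous per-entry overhead for the blinded/encrypted
# on-device representation, a few tens of megabytes seems plausible.
```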
 
The other companies have been doing this in the cloud so far. And you don't use the cloud if you care about the data.
Apple, on the other hand, scans on the device. This means that data that is not in the cloud can also be scanned. At least the technology is there.

The technology was always there.
iOS isn’t open source; you wouldn’t know what’s happening under the hood with all the indexing and neural engine picture analysis going on.
Nothing has changed.
Except one thing: this time the local analysis is not done in the direct interest of the user. Well, I mean, if you care about stopping child abuse, it’s somehow in your interest too.
But it’s not really a scan of STRICTLY local data. It’s data that sooner or later will END UP in the cloud. So Apple is asking you this: “Listen, we both know you’re about to upload this data to iCloud Photos. You already agreed to upload it; we’re already in the boarding area of the airport, yes, the plane has yet to take off, but c’mon. So: we could scan your pics once they’re ACTUALLY on our servers like other companies do, or you could pretty please let us PRE-label them locally on your device with a device-side PRE-scan while they’re still UNENCRYPTED, so we don’t have to invade your privacy even more (like everybody else does) by decrypting them (or scanning them anyway) once they’re already on our servers. In a way that only escalates to human review for people who own MULTIPLE offending pics, and makes it almost impossible to flag an account by accident. Makes sense?”

That’s how it is.
It is local, but in a way it’s not, until the security vouchers have MULTIPLE MATCHES and hence are TRANSMITTED to Apple in the clear. It’s Schrödinger’s cat until proven positive MULTIPLE times. It’s a tree falling in a forest with nobody listening until you get MULTIPLE matches. Not sure how to explain it any better.
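If a toy sketch helps, here's the threshold behavior I mean, with a plain counter standing in for the real threshold secret sharing; the figure of 30 comes from Apple's threat model review, but treat the whole thing as illustrative:

```python
# Illustrative only: Apple's real scheme uses private set intersection
# plus threshold secret sharing, not a visible counter. The point is
# that vouchers stay unreadable until enough of them match.
THRESHOLD = 30  # figure Apple has cited publicly; treated here as an assumption

def server_can_decrypt(matching_vouchers: int) -> bool:
    """Below the threshold, no individual voucher is readable by Apple."""
    return matching_vouchers >= THRESHOLD

for n in (1, 29, 30):
    state = "readable -> human review" if server_can_decrypt(n) else "still opaque"
    print(f"{n:>2} matching vouchers: {state}")
```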
 
This is good tech in theory - anything we can do to protect children should be done - but in this day and age, when people are used to having their data treated like **** by companies, it shouldn't be done.

The line in the FAQ which worries me, when it comes to governments, is "Apple will refuse any such demands". Will they? If China turns around and says "hey, if you want to sell iPhones here, you must use this tech to identify anyone uploading known anti-CPC images", are they really going to say no? Same as how they said no to storing Chinese customers' iCloud data in China? Same as how they say no to all of the government warrants requesting information from, or access to, Apple accounts?

No chance.
 
I understand why some people don't see a problem with this. On its face it sounds great. Who doesn't want to see people sharing CSAM get (ahem) ****ed?
The problem is, it isn't just "privacy" being mishandled by relatively transparent democratic governments that is at issue here, the real problem is that this technology can and will be weaponized by repressive governments around the world. Apple can deny it all they want but they have already demonstrated that they can and will bend to the CCP when their continued business in China is on the line. If you live within the reach of the PRC or even work in an Asia-adjacent field you should be concerned about this.
The CCP would use the technology to search for anything that is anti-CCP: documents, images, videos. China will just steal the technology and use it to spy even more on its citizens.
 
Whilst it’s a good idea to rid this world of the perverts, Apple hasn’t thought this through. What about somebody who has young children and has taken pics of their children in the paddling pool with not much on? There’s nothing seedy about that; they are your children. I have young children and have thousands of photos on my iCloud, 99.9% fully clothed and the rest, said paddling pool pics etc. I am not happy for anybody to be going through MY photos to check if they are child porn. They are my private collection of the kids’ childhood, and they are not for other people’s eyes if I so choose. This will backfire massively and achieves literally nothing, as anybody sick enough to view or create these pics will, I would imagine, keep their stash offline anyway and share the pics online manually. This is just an excuse to get a back door in, and I for one will not tolerate it. I will sell my Apple stuff, move back to Windows/Android, and keep my pics and vids of the family offline. I only moved from Windows/Android due to privacy.
 
Yes, it's hype, absolutely. But a justified one, in my opinion. Don't get me wrong: people who own child pornography are terrible and should be punished severely. But the problem I see is the invasion of the privacy of millions of people to convict a few (those people can also simply disable the upload of their pictures, and then it has no benefits, just the destruction of privacy).

Governments could in the future simply force Apple to track down people found to possess images with certain hashes. Looking ahead, aren't you concerned about what this technology could be abused for and how much damage it could do?
CP itself is sometimes hotly contested, considering the grey area of nudity and 2D art (refer to Japan). What is acceptable in Japan can be considered CP in the US.

Since morality is not simple black and white and depends on many things like culture, geography, and even religion, anything that can judge a person based on morality derived from a black box, with no option for public scrutiny or appeal, should be a concern.
 
What about somebody who has young children and has taken pics of their children in the paddling pool with not much on? There’s nothing seedy about that; they are your children. I have young children and have thousands of photos on my iCloud, 99.9% fully clothed and the rest, said paddling pool pics etc. I am not happy for anybody to be going through MY photos to check if they are child porn. They are my private collection of the kids’ childhood, and they are not for other people’s eyes if I so choose.
Where/when did Apple say that people will be sifting through your paddling pool photos?

Unless your photographs are a 99.9% match to those on a separate database of indecent images - checked using AI - I think it's safe to assume your privacy is intact...
 
Well, of course not - but what can you do? Glower at them? Bend them over a knee and spank them silly? Alas, the only way forward is to either keep on trucking or move to another platform - until that too suffers a similar fate.
Third option: let it be known to Apple that people don’t want to be the subjects of mass surveillance from an Apple product, until they back off this very odd (who at Apple came up with this idea?) and ill-conceived scheme.

“Keep on trucking.” Right. Keep letting our rights to privacy get chipped away until we are just like China/Russia.
 
Whilst it’s a good idea to rid this world of the perverts, Apple hasn’t thought this through. What about somebody who has young children and has taken pics of their children in the paddling pool with not much on? There’s nothing seedy about that; they are your children. I have young children and have thousands of photos on my iCloud, 99.9% fully clothed and the rest, said paddling pool pics etc. I am not happy for anybody to be going through MY photos to check if they are child porn. They are my private collection of the kids’ childhood, and they are not for other people’s eyes if I so choose. This will backfire massively and achieves literally nothing, as anybody sick enough to view or create these pics will, I would imagine, keep their stash offline anyway and share the pics online manually. This is just an excuse to get a back door in, and I for one will not tolerate it. I will sell my Apple stuff, move back to Windows/Android, and keep my pics and vids of the family offline. I only moved from Windows/Android due to privacy.

The way I understand it... they are comparing hashes against *known* CSAM images. Child porn images. Sick stuff.

Your personal iPhone photos are not those... and Apple isn't doing a visual scan of your photos anyway.

At least that's how it has been described. ¯\_(ツ)_/¯
 
Whilst it’s a good idea to rid this world of the perverts, Apple hasn’t thought this through. What about somebody who has young children and has taken pics of their children in the paddling pool with not much on? There’s nothing seedy about that; they are your children. I have young children and have thousands of photos on my iCloud, 99.9% fully clothed and the rest, said paddling pool pics etc. I am not happy for anybody to be going through MY photos to check if they are child porn. They are my private collection of the kids’ childhood, and they are not for other people’s eyes if I so choose. This will backfire massively and achieves literally nothing, as anybody sick enough to view or create these pics will, I would imagine, keep their stash offline anyway and share the pics online manually. This is just an excuse to get a back door in, and I for one will not tolerate it. I will sell my Apple stuff, move back to Windows/Android, and keep my pics and vids of the family offline. I only moved from Windows/Android due to privacy.

You really need to do more reading and less writing.

Your assumptions about your personal pics are completely wrong and show you couldn’t even get through the dumbed-down MacRumors article.
 