Plenty of married people take photos of this nature, too. I don't think it means they're plastering them all over the internet or OnlyFans or whatnot.
Given how people like to use their cameras these days, I think it's completely appropriate that Apple and Google have restricted and encrypted folders to keep things away from those who should not see them.
People can do whatever they want. Personally, I prefer to be with someone than to look at someone. But to each their own.

Either way, my comment was a bad attempt at humor specifically responding to:
What's the fun in that?!
 
I wonder if this is any more secure than the folder we already have for sensitive photos?
 
Legally they have to, at LEAST to the point where they can attest to any authorities that they are not hosting any illegal content. New regulations make them responsible for reporting content they host even if a user placed it there.
My understanding was that this is not the case, but that the companies had to comply with warrants to search if there is reasonable cause, which is as it should be IMO. Think about it: imagine having to go through all documents (e-mails, texts, and pictures) to try to certify everything is legal - it is just not feasible.
 
Legally they have to, at LEAST to the point where they can attest to any authorities that they are not hosting any illegal content. New regulations make them responsible for reporting content they host even if a user placed it there.

I don't think this is true. Source, please. Apple backed off from scanning pictures after an outcry.
 
Google and the idea of privacy?
I needed the good laugh in the early morning. Thanks! 🤣
Agreed.

Rules I follow:
1. No cloud storage of pics for me. None. Not Apple, and especially NOT Google, the evil company.
2. Not using a company phone. I won't use any phone controlled by someone else. I also will not connect to company features that require MDM installation (and therefore grant them the ability to erase my phone at any time).
3. No iCloud backup, either. Apple has decided to comply with law enforcement requests and in the past they have happily provided data that is stored on their servers. Not acceptable IMO.

That said, this is a useful feature when handing off your phone to someone else with the Photos app open.
 
My understanding was that this is not the case, but that the companies had to comply with warrants to search if there is reasonable cause, which is as it should be IMO. Think about it: imagine having to go through all documents (e-mails, texts, and pictures) to try to certify everything is legal - it is just not feasible.
I don't think this is true. Source, please. Apple backed off from scanning pictures after an outcry.
Open a modern browser and, in the URL bar, type “google csam”. It should be the first link. Search engines are fun!
 
Open a modern browser and, in the URL bar, type “google csam”. It should be the first link. Search engines are fun!
Sorry - what point are you trying to make? All I see is how Google is partnering with NGOs to try to prevent child abuse images from being accessed and stored via their systems. There is nothing about a legal requirement to certify that the files stored on their systems are all legal. My understanding is that if Google or some other US company discovers illegal images on their servers, they must report them, but that is not the same thing as being required to do an exhaustive search to detect them, which for the vast majority of users would be a search in the absence of a warrant that violates privacy without probable cause. I know legislation to require an exhaustive search has been raised in the US, EU and UK, but I hadn't heard anything about it being enacted.
 
Open a modern browser and, in the URL bar, type “google csam”. It should be the first link. Search engines are fun!
It is my understanding that CSAM scanning looks for known photos that have been digitally fingerprinted in some way and are being shared by creepers. Someone’s personal photos of consenting adults would not be fingerprinted like this.

That said, there is no way I would put anything like that on any cloud service.
 
It is my understanding that CSAM scanning looks for known photos that have been digitally fingerprinted in some way and are being shared by creepers. Someone’s personal photos of consenting adults would not be fingerprinted like this.

That said, there is no way I would put anything like that on any cloud service.
They are provided with hashes of known photos. They use the same algorithm to create a hash of every photo that’s uploaded to their servers and compare it to the stored hashes. If there is a hit, then further steps are taken, up to and including reporting the validated match.

Most of this is automated since, like you say, it’s highly unlikely that the hashes will produce a false match, but the possibility IS non-zero. If one does, then a human looks at the photo.
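
To make the mechanics concrete, here is a minimal sketch of that matching step. Big assumption: this uses a plain SHA-256 over the file bytes, which only catches byte-identical copies; real systems (e.g., Microsoft's PhotoDNA or Apple's proposed NeuralHash) use perceptual hashes that survive resizing and re-encoding. The hash value and function names are made up for illustration.

```python
# Minimal sketch of server-side hash matching (illustrative only).
# Assumption: plain SHA-256 over file bytes, so only byte-identical
# copies match; real systems use perceptual hashes instead.
import hashlib

# Hashes of known flagged images, supplied by a clearinghouse
# (this value is made up for the example).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_photo(photo_bytes: bytes) -> str:
    """Hash the raw bytes of an uploaded photo."""
    return hashlib.sha256(photo_bytes).hexdigest()

def check_upload(photo_bytes: bytes) -> bool:
    """Return True if the upload matches a known hash.

    A hit is queued for human review rather than reported
    automatically, since false matches are unlikely but non-zero.
    """
    return hash_photo(photo_bytes) in KNOWN_HASHES

if __name__ == "__main__":
    upload = b"...photo bytes..."
    if check_upload(upload):
        print("Hit: queue for human review")
    else:
        print("No match: store normally")
```

The design point worth noticing is the review queue: a hash hit is treated as a lead for a human reviewer, not an automatic report, precisely because the false-match probability is non-zero.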
 