If it’s as Apple states, I’m glad Apple gives us parents that feature to protect our kids.

As for the iCloud Photos thing, I’m not gonna cry if sex offenders aren’t allowed to store their illegal and immoral pictures on Apple’s servers.
It’s because of closed-minded people like you that we lose privacy. Don’t you get that at any moment someone, or even Apple, could add new hashes to gather info on any other activity besides saving the children....
 
We're all aware of that. What's your point?
Your entire point seems to be that we should trust Apple implicitly not to use the technology against us, even if that's against their best financial interest.

Since the Snowden revelations I've seen a lot of pathetic attempts by people to tell me to trust governments and corporations implicitly, but I don't think I've ever seen a more strident example than this.
 
In my opinion, this goes beyond, and against, the privacy that Apple talks so much about.

Obviously, going after child sexual predators is a good and positive thing, but I think indexing your photos in iCloud is excessive and contrary to user privacy. And this is dangerous: Apple says it's doing it in search of child pornographic content, but do we have to trust Apple when it says it's only doing it for that purpose? Could this open the door for Apple to index our personal photos for its own benefit? This makes me quite suspicious.

The iMessage feature, on the other hand, seems right and proper to me, since each parent can choose whether to activate it.
 
Google has been doing this with GMail since 2014


No one bats an eye at that

But when Apple does it, NOW everyone gets upset

Why is this?

People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.
The crucial difference here is that Apple's technology is built from the ground up to circumvent end-to-end encryption: it is injected, by design, before any encryption can happen.

This is what the critics are rightfully complaining about.

Neither Gmail, nor Facebook (by itself), nor Adobe Creative Cloud uses end-to-end encryption.

No one has anything against Apple doing something to fight CSAM, but the means they are using are capable of removing any privacy.
 
I think the difference is that everyone expects Google, Adobe, and Facebook to be completely morally and ethically bankrupt, while Apple seemed not to be.
I don’t find Google to be morally bankrupt. I’ve been a heavy Google user since way before the iPhone existed. They are a pretty transparent company, MOST of the time.
 
That MP3 file in your music folder appears in the IllegalFiles hash database. The authorities have been notified on your behalf. Please remain in position for the authorities to close in on you. This service has been automatically done for your own safety and that of your children, of course you would not want your children anywhere near STOLEN property.
 
As far as we know, all the poop countries are already spying on their "enemies" (opposition, gays, etc.). I don't think things will change for the worse in those countries when Apple implements this technology, but I can see how it could potentially protect children in less oppressive societies. If things get out of control, people can always switch to a different phone manufacturer.

I'd rather protect the most vulnerable members of our society than worry about what, e.g., the UAE is doing, since they are already infecting journalists' phones with spyware, and then cutting them up and making them disappear.

Governments that deal with such violent states should be held accountable, not Apple, which is attempting to protect children from the cruelest of crimes.
 
While I am sad to see a member of the Apple community leaving, I understand your decision to move towards companies with a much better track record for privacy and ethical conduct such as Microsoft and Google
You obviously didn’t read my post. When I was on Windows I stored my private things offline. I only put them online when I moved to Apple, as I trusted them. Now I don’t. In my household I can save a fortune every year by moving back to Windows/Android and just going back to offline photo and document storage. No big deal.
 
The difference is that Google is scanning images in the cloud where no one has any reasonable expectation of privacy. Apple is scanning a private device. Big difference.
Nope, Apple is pre-labeling pictures that have already bought a one-way ticket to the cloud.
Stop with this completely meaningless distinction.
 
So, please... tell me... where's the 'spying' you're all screaming about?
They implemented it in such a way that can easily be used to scan for copyright infringement as well.

Get the hash of that Ted Lasso web rip on TPB, find that rip on an iPhone somewhere, and boom: you've caught a "criminal". For a company moving more and more toward distributing its own online content, this is a very lucrative avenue.

Where is this blind trust coming from?
 
I don’t find Google to be morally bankrupt. I’ve been a heavy Google user since way before the iPhone existed. They are a pretty transparent company, MOST of the time.
I'm not saying they are morally bankrupt, just that some people expect them to be as a result of some of their morally bankrupt actions.
 
How can we honestly have a fair and open discussion about this subject and thread if it isn't classified under PRSI?

Nothing to say here of nuance except 'Apple good' or 'Apple bad'?
 
Whilst it's a good idea to rid this world of the perverts, Apple hasn't thought this through. What about somebody with young children who has taken pics of them in the paddling pool with not much on? There's nothing seedy about that; they are your children. I have young children and thousands of photos in my iCloud, 99.9% fully clothed and the rest, said paddling pool pics etc. I am not happy for anybody to go through MY photos to check whether they are child porn. They are my private collection of the kids' childhood, and they are not for other people's eyes unless I choose them to be.

This will backfire massively and achieves literally nothing, as anybody sick enough to watch or create these pics will, I would imagine, keep their stash offline anyway and share the pics manually. This is just an excuse to get a back door in, and I for one will not tolerate it. I will sell my Apple stuff, move back to Windows/Android, and keep my pics and vids of the family offline. I only moved from Windows/Android in the first place because of privacy.
You are still at the stage of not understanding what all this is, and are simply reacting to your initial perception.

Unless your photos are already known, by an authority called NCMEC, to be offensive images, it is nearly impossible for your personal photos of your own family ever to be flagged by this system. Two separate checks happen before any flagging.
I've not done the maths, but you're probably more likely to get struck by lightning every month for the next year than have even one false positive from this system.
Then for anyone other than you to see the photos, you'd have to store multiple such photos in iCloud.
This just isn't going to happen.

I was shocked and angered by this at first. As usual education and understanding set things right. It took me the weekend to work through it.
This comes back to the phrase "Any sufficiently advanced technology is indistinguishable from magic", and this cryptographic stuff is pretty damned advanced for most people.

Moving to Microsoft or Android will do nothing to keep your photos away from a system like this; both of them do similar things. And Google will also mine your photos, emails, messages, and anything else it can to sell your profile to advertisers.
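The two-check scheme described above (match against a database of known images, then a threshold before anything is reported) can be sketched roughly. This is a hypothetical illustration only: the real system uses a perceptual hash (NeuralHash) and cryptographic threshold secret sharing, while here plain SHA-256 and a simple counter stand in for both, and the threshold value is invented.

```python
# Hypothetical sketch of the two checks described above. All names and the
# threshold value are invented; SHA-256 stands in for a perceptual hash.
import hashlib

THRESHOLD = 30  # an account is flagged only after this many matches

# Stand-in for the NCMEC-provided database of hashes of *known* images.
known_hashes = {"hash-of-known-image-1", "hash-of-known-image-2"}

def photo_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash; a real one tolerates re-encoding."""
    return hashlib.sha256(data).hexdigest()

def matches_for_account(photos: list[bytes]) -> int:
    # Check 1: each photo only counts if its hash is in the known database.
    return sum(1 for p in photos if photo_hash(p) in known_hashes)

def account_flagged(photos: list[bytes]) -> bool:
    # Check 2: nothing is reported until the match count reaches the threshold.
    return matches_for_account(photos) >= THRESHOLD

# A personal family photo never matches, no matter how many you store:
family_photos = [b"paddling pool pic", b"birthday party pic"]
print(account_flagged(family_photos))  # False

# Only many copies of an image already in the database would trip the flag:
bad = b"known image bytes"
known_hashes.add(photo_hash(bad))
print(account_flagged([bad] * THRESHOLD))  # True
```

The point of the sketch is the same as the post: novel photos (your own family pictures) can never match a database of previously catalogued images, and even a freak single match stays below the reporting threshold.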
 
Your entire point seems to be that we should trust Apple implicitly not to use the technology against us, even if that's against their best financial interest.

Since the Snowden revelations I've seen a lot of pathetic attempts of people telling me to trust governments and corporations implicitly, but I don't think I've ever seen a more strident example than this.
His username means we have to agree, as he obviously knows better. Zzzzzzz.
 
Google has been doing this with GMail since 2014


No one bats an eye for that

But when Apple does it, NOW everyone gets upset

Why is this?
Because Apple promised never to look at our data. The other companies are in the data processing business.
In order to rule out false positives, a human review is in place. That person will have to see your photos, and it‘s gonna be an employee.
If it‘s a correct positive, I think everyone is on board with all the consequences; but if it‘s a false positive, that person has been the subject of surveillance. By a privacy-screaming brand.
 
as far as we know all the poop countries are already spying on their "enemies" (opposition, gays, etc.). I don't think it will change for the worse in those countries when Apple implements such technology, but I can see how it could potentially protect children in a society, that's less oppressive. If things get out of control someone can always change to a different phone manufacturer.

I'd rather protect our most vulnerable members of society than worry about what i.e. the UAE is doing, since they are already infecting journalists' phones with spy software and then cut them up and make them disappear.

Governments who deal with such violent states should be held accountable, not Apple who attempts to protect children from the most cruel crimes.
That’s a pathetic excuse and a short-sighted, mainstream take. Let’s hope no one comes after you for your political beliefs, or some other perfectly legal activity, just because you uploaded photos to your private, "we never look" Apple iCloud and a government or political agenda wanted them.
 
Because Apple promised never to look at our data. The other companies are in the data processing business.
In order to rule out false positives, a human review is in place. That person will have to see your photos, and it‘s gonna be an employee.
If it‘s a correct positive, I think everyone is on board with all the consequences; but if it‘s a false positive, that person has been the subject of surveillance. By a privacy-screaming brand.
Exactly. Nothing is fault-free. It’s disgusting that an employee could see a picture, if flagged up incorrectly: a PRIVATE picture of my daughter in the paddling pool, for example. It’s just unacceptable. You can’t tell me that with all this technology we can’t track down a pervert sharing images online without the need to root through EVERYBODY’S phone. I’ll never believe it.
 
For those of you asking, "What is stopping Apple from letting governments have access to my phone/pictures/data?"

What has ever stopped them? You've been sharing all of your data, including emails, pics, calendar appointments, 3rd party app info, social media posts, etc., for as long as they have existed, and they could easily share access to all of this info via your iCloud account.

The difference is that I trust Apple (more than others) to put the safeguards in place to prevent this access from being abused as well as NOT caving to government requests for access to data illegally (yes, illegally...all governments have laws allowing them SOME access to data either through law or with proper authorization).
 
For those of you asking, "What is stopping Apple from letting governments have access to my phone/pictures/data?"

What has ever stopped them? You've been sharing all of your data, including emails, pics, calendar appointments, 3rd party app info, social media posts, etc., for as long as they have existed, and they could easily share access to all of this info via your iCloud account.

The difference is that I trust Apple (more than others) to put the safeguards in place to prevent this access from being abused as well as NOT caving to government requests for access to data illegally (yes, illegally...all governments have laws allowing them SOME access to data either through law or with proper authorization).
The difference is that these companies don’t use privacy as an excuse, and as a human right, to sell their products.
 
The way I heard it described on a podcast was like this:

Apple is not doing a visual scan of your photos. They're not looking at the actual contents of your photos.

They are, instead, comparing hashes of *known* CSAM images. These are photos that have already been labeled as child porn.

So there's no danger of Apple flagging a photo of your child in the bathtub or whatever.

With all that said... no one knows what else Apple could do in the future. Perhaps they could start scanning the actual contents of your photos. So I can see why people are freaked out.

But as others have said... all of the big companies are doing similar things. So I dunno.
The problem is not that they are doing this for CSAM images. It's well intentioned. The problem is the backdoor that they've opened.

Imagine this scenario. The FBI is trying to find someone very specific. They contact Apple, but Apple holds its ground and gives a firm "No". Fine, right? Do you think the FBI will stop there? Of course not; they will go to the NCMEC and have them upload a set of images. Chances are they won't even bother with Apple; they'll go directly to the NCMEC and start finding people of interest.

Now replace the FBI with another country's government. Say, China, and the pictures are sets of pictures from the Hong Kong protestors.

Now replace China with a bunch of hackers who hack the NCMEC.

Lots of things can go wrong here.
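The worry in the post above can be made concrete: the matching layer is content-agnostic, so whoever controls the hash list controls what gets flagged. A minimal, hypothetical sketch (plain SHA-256 stands in for a perceptual hash; all names and byte strings are invented):

```python
# Hypothetical sketch: the same matcher flags whatever the hash list contains.
import hashlib

def sha(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def scan(photos: list[bytes], hash_list: set[str]) -> list[int]:
    """Return indices of photos whose hash appears in the supplied list."""
    return [i for i, p in enumerate(photos) if sha(p) in hash_list]

photos = [b"cat pic", b"protest photo", b"beach pic"]

# Intended use: a list of known CSAM hashes; nothing in this library matches.
csam_list = {sha(b"known abuse image")}
print(scan(photos, csam_list))        # []

# Nothing in the matcher changes if a different authority supplies the list:
political_list = {sha(b"protest photo")}
print(scan(photos, political_list))   # [1]
```

The code never inspects image content, only hash membership, which is exactly why the trust question shifts entirely to whoever curates the hash database.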
 