You trusted Apple with all of your data up until this point, so why do you think suddenly they just threw all privacy out the window and they're now going to use it to spy on you?
I think you are perhaps underestimating the significance of this decision. It is a proof of concept to the whole world, and we will now see Apple under intense pressure to scan for who-knows-what on people's phones. Governments may well order Apple, in secret, to load databases of all kinds onto phones.

Plus, you have gotten it wrong: Apple IS spying on me now (or will be if I download iOS 15).

Finally, why are they not doing this another way, when they could accomplish the same objective off people's phones? What are they up to? Are they being forced to do this? By whom?
 
I think you are perhaps underestimating the significance of this decision. It is a proof of concept to the whole world, and we will now see Apple under intense pressure to scan for who-knows-what on people's phones. Governments may well order Apple, in secret, to load databases of all kinds onto phones.

Plus, you have gotten it wrong: Apple IS spying on me now (or will be if I download iOS 15).

Finally, why are they not doing this another way, when they could accomplish the same objective off people's phones? What are they up to? Are they being forced to do this? By whom?
Sounds like a lot of speculation. Better ditch Apple because of something YOU think they MIGHT do.

I encourage everyone to vote with their wallet. Don't buy into Apple if you don't like their practices.
 
CSAM is evil. But it’s not the only evil thing in this world. Today they scan for CSAM. Tomorrow they scan for terrorist activities. The day after tomorrow they scan for murder plots. Who could possibly be against that? Where does it stop?

How exactly would they use this system to scan for terrorist activities or murder plots?

The system is incredibly bad at scanning for general things.
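Here's a toy sketch of why (this is not Apple's NeuralHash, just an illustrative average-hash in Python): hash matching can only flag near-duplicates of images that are already in a database of known material, so it can't answer general questions about brand-new photos.

```python
# Toy illustration (NOT Apple's NeuralHash): hash matching can only flag
# near-duplicates of images already in the database. It cannot answer a
# general question like "does this photo show a murder plot?".

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

# Database of hashes of *known* images (stand-in for the CSAM hash list).
known_image = [10, 200, 30, 180, 90, 220, 15, 160]  # fake 8-pixel image
database = {average_hash(known_image)}

# A slightly re-encoded copy of a known image still matches...
near_copy = [12, 198, 33, 179, 88, 221, 14, 158]
print(min(hamming(average_hash(near_copy), h) for h in database))  # 0 -> match

# ...but a brand-new image, whatever it depicts, has nothing to match against.
new_image = [200, 10, 180, 30, 220, 90, 160, 15]
print(min(hamming(average_hash(new_image), h) for h in database))  # 8 -> no match
```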
 
You trusted Apple with all of your data up until this point, so why do you think suddenly they just threw all privacy out the window and they're now going to use it to spy on you?

Case in point.

Apple just created a huge juicy target for governments around the world to covet and force(?) Apple to collude/cooperate with.
 
Case in point.

Apple just created a huge juicy target for governments around the world to covet and force(?) Apple to collude/cooperate with.
And yet here you are. Still with Apple. You should've left them a while ago. After all, they want to spy on you.
 
Where are you drawing this conclusion from? I'm not okay with warrantless searches of people who are not suspected of having committed a crime. Cloud servers are not my property; they are infrastructure that belongs to the service provider. I'm consenting to the provider's terms when I upload content there, including consenting to searches in the cloud.

I am NOT consenting for surveillance software to be installed on my personal device.
PLOT TWIST: They already do this today!! You’ve been consenting for practically as long as you’ve had an iPhone.
 
How exactly would they use this system to scan for terrorist activities or murder plots?

The system is incredibly bad at scanning for general things.

I would start by adapting the CSAM scanning to flag popular images that specific terrorists would share: memes about creating war in the West, executions, etc. Those can be hashed just as easily as any CSAM. Instead of being sent to Apple's CSAM team, matches would be vouchered the same way to an Apple anti-terror (iTerror™) squad, which would go to the FISA court for warrants to build a network of who received those messages and images once a specific account hits a set score.

Then all of those suspects' devices would receive hashes to search for more photos, and the neural hashing algorithm could be expanded to focus more on user-generated images instead of only the flagged hashes.

The Messages grooming detection could also be adapted, maybe secretly, to identify terrorist recruiting cells.

Specific networks that could be targeted include Al Qaeda or ISIS terrorist factions, QAnon violent extremists, Kurdish freedom fighters, Navalny supporters, Hong Kong protestors, Uyghur dissidents, Falun Gong, anti-abortionists, Palestinian Gaza protestors, and anti-vax networks.

The machine is ready to be fed.
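To make this concrete, here's a hypothetical sketch. Every name and number in it is made up for illustration; this is not Apple's code. The point is that the match-and-score pipeline is agnostic about what the hash list actually contains.

```python
# Hypothetical sketch of the scenario above. All names and numbers are
# illustrative assumptions, not Apple's implementation. The pipeline
# doesn't know or care WHAT the hash list represents.

FLAG_THRESHOLD = 30  # assumed account score at which vouchers get reviewed

csam_hashes = {"h_csam_1", "h_csam_2"}            # today's database
extremist_meme_hashes = {"h_meme_1", "h_exec_1"}  # a swapped-in database

def scan_account(photo_hashes, database, threshold=FLAG_THRESHOLD):
    """Count matches against whichever hash list happens to be loaded."""
    score = sum(1 for h in photo_hashes if h in database)
    return score, score >= threshold

# Nothing in the logic cares whether `database` holds CSAM or political memes:
user_photos = ["h_meme_1"] * 31 + ["h_cat_pic"]
score, flagged = scan_account(user_photos, extremist_meme_hashes)
print(score, flagged)  # 31 True -> vouchered to the (hypothetical) iTerror squad
```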
 
In order for anything to happen, you would have to do all of the following:

1. Update to iOS 15 when it comes out
2. Enable iCloud Photo Library
3. Have a whole collection of known CSAM photos being uploaded to iCloud

All of these things are optional. Nobody is forcing your phone to report everything it finds to Apple or authorities.
You are completely sidestepping the issue, so let’s just get to the point.

Should innocent citizens put up with warrantless searches on personal property? It all boils down to that. How the search is being done is irrelevant.

If warrantless searches and surveillance don't bother you, then we'll just have to agree to disagree.
 
I would start by adapting the CSAM scanning to flag popular images that specific terrorists would share: memes about creating war in the West, executions, etc. Those can be hashed just as easily as any CSAM. Instead of being sent to Apple's CSAM team, matches would be vouchered the same way to an Apple anti-terror (iTerror™) squad, which would go to the FISA court for warrants to build a network of who received those messages and images once a specific account hits a set score.

Then all of those suspects' devices would receive hashes to search for more photos, and the neural hashing algorithm could be expanded to focus more on user-generated images instead of only the flagged hashes.

The Messages grooming detection could also be adapted, maybe secretly, to identify terrorist recruiting cells.

Specific networks that could be targeted include Al Qaeda or ISIS terrorist factions, QAnon violent extremists, Kurdish freedom fighters, Navalny supporters, Hong Kong protestors, Uyghur dissidents, Falun Gong, anti-abortionists, Palestinian Gaza protestors, and anti-vax networks.

The machine is ready to be fed.
But they can do all of that now. What's stopping them?
 
You are completely sidestepping the issue, so let’s just get to the point.

Should innocent citizens put up with warrantless searches on personal property? It all boils down to that. How the search is being done is irrelevant.

If warrantless searches and surveillance don't bother you, then we'll just have to agree to disagree.
How is this any different from you giving Apple access to the photos to search?
 
I would start by adapting the CSAM scanning to flag popular images that specific terrorists would share: memes about creating war in the West, executions, etc. Those can be hashed just as easily as any CSAM. Instead of being sent to Apple's CSAM team, matches would be vouchered the same way to an Apple anti-terror (iTerror™) squad, which would go to the FISA court for warrants to build a network of who received those messages and images once a specific account hits a set score.

Then all of those suspects' devices would receive hashes to search for more photos, and the neural hashing algorithm could be expanded to focus more on user-generated images instead of only the flagged hashes.

The Messages grooming detection could also be adapted, maybe secretly, to identify terrorist recruiting cells.

Specific networks that could be targeted include Al Qaeda or ISIS terrorist factions, QAnon violent extremists, Kurdish freedom fighters, Navalny supporters, Hong Kong protestors, Uyghur dissidents, Falun Gong, anti-abortionists, Palestinian Gaza protestors, and anti-vax networks.

The machine is ready to be fed.
In addition, they could hash political figures, or gay celebrities, or pro- or anti-vax celebrities. They could almost certainly build very detailed and accurate profiles of people based on what kinds of images they have on their devices. And who is to say they can't simply download databases of keywords and scan for those too?

Once the principle is established (and Apple has now established the principle, under the huge cover of being the most "private" tech company on the planet), scanning could expand in as-yet-undreamed-of ways.
 
How is this any different from you giving Apple access to the photos to search?
Because of consent.

Like when a police officer asks for permission to search your property, you have the opportunity to provide or deny consent.

That consent is only side-stepped if there is a warrant or probable cause for a search. Innocent citizens who have done nothing wrong should not be subjected to surveillance.

I'm not even trying to stop my photos from being scanned. I literally have nothing to hide, so I'm happy to upload them to iCloud and have them scanned there. I consent to the scan when I upload them off my device to Apple. But Apple's implementation of on-device scanning is wrong in principle, so I must protest.
 
In addition, they could hash political figures, or gay celebrities, or pro- or anti-vax celebrities. They could almost certainly build very detailed and accurate profiles of people based on what kinds of images they have on their devices. And who is to say they can't simply download databases of keywords and scan for those too?

Once the principle is established (and Apple has now established the principle, under the huge cover of being the most "private" tech company on the planet), scanning could expand in as-yet-undreamed-of ways.
The thing is, Apple can't allow CSAM images on their servers, so either they have the encryption key for all of your photos, or they do it the new way, where they have access to NONE of your photos UNTIL there are MULTIPLE matches... only THEN do they actually have access to those specific photos to visually check, and then they decide whether to get anyone else involved.

Honestly, I'd rather have it the new way. One in a trillion per year are pretty good odds that I'll have absolutely nothing to worry about and that Apple will never have access to any of my content.
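Some rough math to illustrate the point. The per-photo false-match rate and photo count below are pure assumptions, since Apple hasn't published exact figures; the takeaway is how hard a multiple-match threshold crushes the account-level error rate.

```python
# Rough math with ASSUMED numbers (Apple did not publish its per-photo
# false-match rate): requiring multiple matches before anyone can look
# is what drives the account-level error rate down astronomically.
from math import exp, lgamma, log, log1p

p = 1e-6      # assumed probability one photo falsely matches a hash
n = 20_000    # assumed photos uploaded by one account in a year

# Chance of at least ONE stray match in a year is noticeable...
print(1 - (1 - p) ** n)  # ~0.02, i.e. about 2% of accounts

# ...but reaching a 30-match threshold is dominated by the k = 30 binomial
# term, computed in log space to avoid floating-point underflow:
k = 30
log_term = (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log1p(-p))
print(exp(log_term))  # on the order of 1e-84: effectively never
```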
 
Because of consent.

Like when a police officer asks for permission to search your property, you have the opportunity to provide or deny consent.

I'm not even trying to stop my photos from being scanned. I literally have nothing to hide, so I'm happy to upload them to iCloud and have them scanned there. I consent to the scan when I upload them off my device to Apple. But Apple's implementation of on-device scanning is wrong in principle, so I must protest.
You give them "consent" when you agree to their terms and conditions.
 
I’m not going to tell anyone WHAT to be mad about, but you all have basically lost any credibility about being mad at this tech.

The reason? It is already on your phone and has been there for quite some time.

The only thing different with this implementation is this: not only is Apple "scanning" pics on the phone like they always have, and not only CAN they "scan" pics in iCloud like they always have, but now, if multiple images (again, ones they have been looking at for years) are flagged as a match with the child pornography database, they will do a secondary review to confirm the match before reporting the flagged account to the authorities.

Why was there no major uproar when people found out Apple tracks every device's movements to provide better traffic info? Every device has this option preset, and it is not easy to find where to turn it off.

They added a separate option a couple of years ago to track detailed routing information when using Maps…uproar?? None.

If you don't want Apple contacting the authorities when there are flagged illegal photos on THEIR servers, take it up with them. Don't buy their products, don't use iCloud (this option doesn't even work without iCloud Photos turned on), but don't be mad about them having the tech on your phone NOW… it was already there and just used differently… nothing has changed on that front.
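A rough sketch of the flow just described (the names and the threshold are my own illustrations, not Apple's actual code): matching alone does nothing visible, and a human review sits between the threshold and any report.

```python
# Minimal sketch of the described flow. MATCH_THRESHOLD and all names are
# illustrative assumptions; this is not Apple's implementation.

MATCH_THRESHOLD = 30  # assumed; the exact value was not published

def process_account(match_count, reviewer_confirms):
    """Decide what happens to an account after on-device matching."""
    if match_count < MATCH_THRESHOLD:
        return "below threshold: Apple can see nothing"
    if not reviewer_confirms():
        return "human review says false positive: not reported"
    return "confirmed match: account reported to the authorities"

print(process_account(3, lambda: True))    # below threshold
print(process_account(31, lambda: False))  # threshold hit, review rejects
print(process_account(31, lambda: True))   # threshold hit, review confirms
```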
 
I wouldn't have any problem with Apple scanning images being uploaded to iCloud on the iCloud end, but having it on the device, even if it's just semantics, is a big difference to me. Using CSAM as the impetus for this new feature was the safest bet, because CSAM is just so objectionable, it's hard to argue that we shouldn't do whatever we can to prevent this type of criminality.

But as so many others have already pointed out, it's the mission creep that makes this technology so dangerous. Next it will be people who post photos of their gun collections, with ban states looking for people who have illegal features on their rifles, suspect magazines, or suppressors that the state just wants to check stamps on. And then it's people who post pictures of themselves in MAGA hats, or who post memes that contravene the Twitter and Facebook narratives. Censorious, control-hungry governments are already salivating at the potential of this new feature.
 
The thing is, Apple can't allow CSAM images on their servers, so either they have the encryption key for all of your photos, or they do it the new way, where they have access to NONE of your photos UNTIL there are MULTIPLE matches... only THEN do they actually have access to those specific photos to visually check, and then they decide whether to get anyone else involved.

Honestly, I'd rather have it the new way. One in a trillion per year are pretty good odds that I'll have absolutely nothing to worry about and that Apple will never have access to any of my content.
First, they don't have to do this on the phone; they are basically conducting warrantless searches.

They could do the scanning off the phone, on their own servers. Apparently they would rather we use our batteries to conduct their warrantless surveillance for them.

But most importantly: ask a thousand people, a mix of techies and "normals," whether they perceive a difference between the privacy of their data while on their device and the privacy of their data once it has left their device. The answer would be stark, I believe, with the overwhelming view that what is on the phone is sacrosanct; it is private and personal, and nobody should be allowed to scan it for anything.

Apple has convicted us all of child abuse, and we haven't even had a trial.

What else will come next?
 
you giving Apple access

It's right there...


You give them "consent" when you agree to their terms and conditions.


Arguments that then go on to cite user agreements and such just prove how much we need to rein those in.

Using a device shouldn't mean user agreements that amount to "you agree to everything the platform owner wants or thinks about doing in the future"
 
I wouldn't have any problem with Apple scanning images being uploaded to iCloud on the iCloud end, but having it on the device, even if it's just semantics, is a big difference to me. Using CSAM as the impetus for this new feature was the safest bet, because CSAM is just so objectionable, it's hard to argue that we shouldn't do whatever we can to prevent this type of criminality.

But as so many others have already pointed out, it's the mission creep that makes this technology so dangerous. Next it will be people who post photos of their gun collections, with ban states looking for people who have illegal features on their rifles, suspect magazines, or suppressors that the state just wants to check stamps on. And then it's people who post pictures of themselves in MAGA hats, or who post memes that contravene the Twitter and Facebook narratives. Censorious, control-hungry governments are already salivating at the potential of this new feature.
You might not wanna hear this, but I think it's time for you to move on from Apple if you can't trust them. End of story.
 
It's right there...

Arguments that then go on to cite user agreements and such just prove how much we need to rein those in.

Using a device shouldn't mean user agreements that amount to "you agree to everything the platform owner wants or thinks about doing in the future"
Before you update to iOS 15, there will be terms and conditions. If you disagree with any of those, you can't install the software. By clicking agree, you are giving consent. That's why it exists.
 
Before you update to iOS 15, there will be terms and conditions. If you disagree with any of those, you can't install the software. By clicking agree, you are giving consent. That's why it exists.

We aren't sure this is limited to iOS 15 and Monterey.
Apple won't even say (they've been asked) what was in Big Sur 11.5.2, for instance.
 
Why would Apple open this floodgate? Is the government requesting this? Is some new upcoming business related to it? Why do they offend all of their customers by searching everybody's phones for what only a few idiots collect? This is so out of proportion that I still can't believe they are doing it after having claimed to put customer privacy first for so long.
 
You give them "consent" when you agree to their terms and conditions.
Yes… and you’ve finally reached the point.

I am happy to consent to iCloud scanning. I do not want to be put into a position where I must consent to searching on my personal device, or stop using my personal device.

There is no path forward to continue using my personal property unless I consent to warrantless searches.

And don't be flippant and just tell me not to update to iOS 15. We all know our devices will stop operating if we don't update them.
 