These features have good intentions, and I'm fine with all of them except iCloud Photo Library self-reporting you to law enforcement. That feature has got to go. As long as parents have control over the other features, I've got no issue with them.

Yes, you can disable iCloud Photo Library (and I have), but you can't add storage capacity to a device you bought up front with the expectation that iCloud Photo Library would reduce your storage requirements.

The point is that for a company that touts privacy as one of its main selling points, and uses it to justify higher hardware prices, this is an epic failure. Thankfully I always buy more storage than I think I'll need, so shutting off iCloud Photo Library on my Mac, MacBook Air, iPhone, and iPad and downloading all of my photos to each device was feasible. But if I had a huge library and hadn't bought devices with the capacity to hold it all because I was relying on iCloud Photo Library, I'd be rightfully ticked off.

For the “Who cares if you have nothing to hide?” crowd, see the 4th Amendment to the US Constitution. The founding fathers knew from experience that without explicit protections preventing the Federal Government from conducting unreasonable searches and seizures, the power would be abused. This is no different. This technology will be abused by somebody at some point, be it Apple, hackers, a government entity, a ticked-off significant other who knows your passcode, etc. You mean to tell me US intelligence and the Russian and Chinese governments weren't licking their chops when they heard about these features? If you think they weren't, I've got a bridge in New York City to sell you.
 
“Apple is inventing a world in which every product you purchase owes its highest loyalty to someone other than its owner. To put it bluntly, this is not an innovation but a tragedy, a disaster-in-the-making.”

Software and hardware don't have loyalty. It's a human/animal trait.

Software is meant to do the bidding of the programmer, not the user. If you don't trust the programmer, don't use the software. That was true in 1950 and it is true today.
 
If I was leaving the house with said photos and then decided to hand them over to a public corporation to keep safe? Yes.

If I was leaving the house and keeping them for personal use, for no one else to see? No. That's my business.

iCloud is just like the first option. You're choosing to hand your photos over by having iCloud Photo Library enabled.
You have the choice to say no.

Sure, as if when I rent a storage unit somewhere, the landlord comes in randomly to browse through my stuff.
 
IDK what the policy is on linking to Reddit, but from the usual two mega-threads related to Apple spyPhone and CSAM:

[Attached image: spyPhone - Apple Surveillance.jpg]
 
No it won't. These are older images. This will nab idiotic viewers at best.
Now LEO may be able to use someone caught as the start of a breadcrumb trail in an attempt to catch a creator.
However, this feature does nothing to prevent new material, nor does it do anything to stop or nab the creators.

The transmission of such images is an abuse of the child itself.

The child is abused:
1. When the image is produced
2. When it is stored
3. When it is duplicated and transmitted or transported
4. When it is received

You might believe #1 is the worst, and it probably is, but #3 and #4 might happen hundreds of thousands of times in some cases, which makes the child suffer hundreds of thousands of abusive acts.
 
I think people are aware. And the biggest offenders in child abuse and sex trafficking rings are famous people: rich celebrities, actors, producers, politicians, etc. Just recently I read about this:

"Oprah Winfrey said on Friday that she was cutting ties with a documentary centered on women who have accused the music mogul Russell Simmons of sexual misconduct. The untitled film, scheduled to have its premiere this month at the Sundance Film Festival, focuses primarily on the executive Drew Dixon, who accused Mr. Simmons of raping her, an accusation Mr. Simmons has repeatedly denied."

What was Apple's reaction? "Apple declined to comment."

Of course this is not child abuse related, but it is disgusting and hypocritical of Apple to preach about protecting children and women while taking no action when there is a clear indication that prominent public figures are caught up in such stories.

Apple shouldn't, as a general rule, comment on specific allegations if they aren't party to them.
It's something a lot of US politicians, organisations, and even ordinary people should abstain from more often, too.

And they certainly shouldn't get involved in specific cases.

If it so pleases them, Apple should deal with social issues in a general sense, not specific cases at the individual level.
 
"It is a tale
Told by an idiot, full of sound and fury,
Signifying nothing."

You're just full of bravado, going off at Apple because you are angry.

Apple has always been a secretive company, especially since Jobs took over in '96. To expect openness and detailed explanations from Apple is to make Apple into a company it probably never has been.

Also, it's not a backdoor when you are told about it. A backdoor implies secrecy. There is no secrecy here.
Yes I am angry. Angry because I have been sounding the alarm since long before this event. Angry because in the last 5-6 years, with every new product, there are decisions made by a marketing committee, not by product people.
Yes, I am full of bravado, but only in the eyes of uneducated and blind Apple followers. Apple is no longer the computer maker with an aesthetic sensibility; secrecy may be acceptable in product development, but that sensibility is one of the reasons I and many others have used Apple products for more than 20 years.

In the past the core product was the Mac, and everything done with it was not only professional but focused on perfection and longevity. Since the commercial success of the iPod, Apple started this shift and gradually began a war against user control over devices.

Nowadays Apple is a legitimate monopoly, with a portfolio of consumer products built around the iPhone and iOS.
Not only closed source, but dark-pattern design and UX everywhere. The Mac is something Apple hates, and the idea of users running software like Little Snitch to circumvent telemetry makes them screech.

To call me an idiot and dismiss factual data is a choice which speaks volumes about your level of understanding and expertise. Thank you.
 
The situation is even more complex, well, at least in my case. I support Apple's solution because I value my privacy, whilst at the same time recognising that CSAM scans must be done in order to protect kids. Those against Apple's solution who suggest server-side scans instead also claim to be protecting privacy.
Privacy under surveillance is not privacy. That's rather the point.
 
He just doesn't know; it's hard to argue against someone so dead-set on being wrong and so uninterested in the truth. He said this would protect his children from people stealing their phones and uploading their nude photos, which it clearly doesn't. Honestly, statistically, it's parents and close relatives who cause harm to their children. He should be looking elsewhere if he's concerned about his children's safety.

Many seem to feel that CSAM means nude photos of children.
Far, far from it.
 
"It is a tale
Told by an idiot, full of sound and fury,
Signifying nothing."

You're just full of bravado, going off at Apple because you are angry.

Apple has always been a secretive company, especially since Jobs took over in '96. To expect openness and detailed explanations from Apple is to make Apple into a company it probably never has been.

Also, it's not a backdoor when you are told about it. A backdoor implies secrecy. There is no secrecy here.

I suspect many here have two definitions of “back door”.
1. Secret entrance
2. Alternative locked(?) entrance
 
The transmission of such images is an abuse of the child itself.

The child is abused:
1. When the image is produced
2. When it is stored
3. When it is duplicated and transmitted or transported
4. When it is received

You might believe #1 is the worst, and it probably is, but #3 and #4 might happen hundreds of thousands of times in some cases, which makes the child suffer hundreds of thousands of abusive acts.

#1 is a known.
#2-4 are potentials.

We could argue semantics and legalities. Creation, and knowing it's out there, is a given. All else is a maybe.
If there are 10,000 photos of me out there, or 9,998, it makes no difference.
Creators caught and no more made? Yeah, that would be a huge plus.
Apple's solution doesn't do that at all. It isn't designed for it.
 
I really hope Apple does drop this altogether. I have no problem with cloud providers scanning for this kind of stuff on their servers. I do have a problem with this kind of scanning being done on my device, because it will inevitably be expanded to look for “wrongthink” given the way all governments are headed these days.
 
You honestly believe private companies should start monitoring their users? You want Apple Police, Microsoft Police, Google Police, etc. gathering info for some other private organisation so they can use it for their own purposes? Shouldn't we leave hunting down criminals to governments and law enforcement agencies, and not to shady groups who are not governed by the laws that bind law enforcement?
Hear, hear... but isn't Apple already way beyond this point? Apps are removed from the App Store at the request of governments around the world. With that, it has already acted as the shady club we didn't want to know about. And yes (me too), we are paying for it...
 
They can't do it for anything.

The CSAM Detection System is very bad at detecting images of similar nature.

What's the hash of an illegal protest in Hong Kong?
How would you produce one?

The same way these protest photos have already been abused and hashed into the GIFCT (Global Internet Forum to Counter Terrorism) database. They just take the images right off of social media and hash them. Then the same social media companies that subscribe to that service so they can flag terrorist posts inadvertently end up taking down unrelated memes, legitimate government criticism, and independent journalists' posts on a wide range of topics.
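
To make that concrete, here's a rough sketch of how a shared hash list turns one platform's takedown into everyone's takedown. This is not GIFCT's actual code: SHA-256 stands in for the perceptual hashes such databases really use, and every name here is made up for illustration.

```swift
import Foundation
import CryptoKit

// Illustrative only: SHA-256 stands in for a perceptual hash.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// One platform hashes an image taken straight off social media...
var sharedHashDatabase: Set<String> = []
let protestPhoto = Data("bytes of a Hong Kong protest photo".utf8) // placeholder bytes
sharedHashDatabase.insert(fingerprint(of: protestPhoto))

// ...and every subscriber now flags any upload with the same fingerprint,
// whether it's propaganda, journalism, or an unrelated meme reusing the image.
let userUpload = protestPhoto
if sharedHashDatabase.contains(fingerprint(of: userUpload)) {
    print("Flagged for takedown, with no context about why it was posted")
}
```

The database only stores fingerprints, so nobody reviewing it can tell a protest photo from actual terrorist content, which is exactly the problem being described.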
 
Apple only actively scans iCloud mail. How many people use iCloud mail?
Me. At least while I still have Apple devices. Scanning email is okay by me; it's their servers. I have a Gmail account as well, and they probably scan more thoroughly, and that's still okay. Hotmail and Outlook too.
 
When?
If I go load 2,000 pics to iCloud from any of my devices (or even more photos), hashing/scanning during the upload is resource intensive.

From an execution standpoint, it makes more sense to scan first, then compare when actually uploaded.
It's really not that intensive, but yes, the hash and voucher can be created at any time. It's only after upload that the server can do the comparison to see if there's a match.
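
For what it's worth, a toy sketch of that split might look like the following. Purely illustrative, not Apple's implementation: SHA-256 stands in for NeuralHash, the voucher payload isn't actually encrypted here, and none of these names are Apple's.

```swift
import Foundation
import CryptoKit

// Toy model of the described flow: the device can compute the hash and build
// a "safety voucher" at any time, but only the server, which holds the
// database, can test for a match after upload.
struct SafetyVoucher {
    let imageHash: String   // computed on device; the device never learns of a match
    let payload: Data       // in the real design this is an encrypted derivative
}

func makeVoucher(for imageData: Data) -> SafetyVoucher {
    let hash = SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }.joined()
    return SafetyVoucher(imageHash: hash, payload: imageData)
}

// Server side: the comparison only happens once the voucher arrives with the upload.
let knownBadHashes: Set<String> = []   // the curated database lives server-side only
func serverCheck(_ voucher: SafetyVoucher) -> Bool {
    knownBadHashes.contains(voucher.imageHash)
}

let voucher = makeVoucher(for: Data("photo bytes".utf8)) // placeholder image data
print(serverCheck(voucher) ? "match" : "no match")       // prints "no match"
```

Computing one hash per photo is cheap; the contested part is only ever whether the matching happens on your hardware or theirs.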
 
They can't do it for anything.

The CSAM Detection System is very bad at detecting images of similar nature.

What's the hash of an illegal protest in Hong Kong?
How would you produce one?
Why should Apple limit it to a hash? Why not use another kind of recognition system... or prohibit users from using the apps they use in their protests? Ever heard of HKmap? Removed from the App Store, and not because of storing CSAM pictures.
 
Which is why the hash database was due to be auditable, so we can check it is only for CSAM. Any other hashes would be noticed immediately.

Photos stored from apps have this data in the metadata; look how WhatsApp makes its own album, etc. You can search Pokémon GO for all the photos it saved.
If an app was doing this, it should be easily traceable.
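
As a rough illustration of that metadata point, here's a sketch in Swift using Apple's ImageIO framework. The file path is hypothetical; on a real JPEG, the creating app often shows up in the TIFF/EXIF fields.

```swift
import Foundation
import ImageIO

// Sketch of the traceability point above: photos saved by apps usually carry
// identifying metadata. Point this at any real image file to try it.
let url = URL(fileURLWithPath: "/tmp/example.jpg") // hypothetical path
if let source = CGImageSourceCreateWithURL(url as CFURL, nil),
   let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
   let tiff = props[kCGImagePropertyTIFFDictionary] as? [CFString: Any] {
    // The TIFF "Software" tag frequently records the app that wrote the file.
    print("Created by:", tiff[kCGImagePropertyTIFFSoftware] ?? "unknown")
}
```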
Go volunteer to let your files be scanned, then. They won't be scanning mine.
 
Can't say yea or nay.
On one hand, Apple's response says scanned and hash-matched on device, then uploaded.
On the other, Apple's response was scanned and hash-matched during the upload process.

Apple has been less than clear on just how this process executes and when.

That is one of the “asked for” items; just how does this process really work? How about we get some independent peer review or an independent audit?

Apple has been very clear on how the overall process works, as well as describing how the individual technologies it uses work. Any more and they'd have to be showing you code in their description, and if you want that because you're a computer science security researcher, Apple promised to make the code available. I really don't know what else they'd have to do to make it clearer to you.
 