Is there another way to save pictures from the internet without them being accessible by the Photos app? I just tried with a photo and didn't even see an option to save it anywhere else. How do you do it?
You can forward them via Messages, for example. So you can still spread illegal porn despite the CSAM scanning. At that point, at the latest, it becomes obvious that "CSAM detection" is a political pretext.
But by now everyone has actually understood that. And the majority won't care anyway.
 
  • Like
Reactions: sog1927 and BurgDog
I don’t usually save pictures from the internet. But you can use Share and save them with the Files app, or copy them and save them elsewhere.
Some browsers also let you save files directly within the browser.
Seems convoluted if you are saving 20 or 30 pictures at a time. Besides, considering this program will be provided to third-party apps, I assume they will scan all iCloud data, not just the photos app.
 
You can forward them via Messages, for example. So you can still spread illegal porn despite the CSAM scanning. At that point, at the latest, it becomes obvious that "CSAM detection" is a political pretext.
But by now everyone has actually understood that. And the majority won't care anyway.
Not going to lie, you lost me someplace in the middle. But I felt the passion and that's 80% of the message.
 
They would need to write new code to make the iPhone back itself up without user consent. The code doing the scan, by contrast, likely only needs a minor tweak to one or two ‘if/else’ statements. There is much higher potential for a government to demand compliance using this new built-in scanning tool than to ask Apple to create secret backups of devices. None of these changes has to be made in secret.

They don't have to change the iCloud backup code. They can just change the value that records whether you have turned it on or not.

If the user is in China, the iCloud backups are on Chinese servers run by a Chinese hosting provider. Apple could just give them the encryption keys and they could monitor the data every day.

The CSAM detection system is very poor at doing what people think it's good at, such as finding pictures in a certain category, like "people at protests".
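To make that concrete, here is a rough sketch of the shape of that matching, heavily simplified from Apple's public technical summary. The names (averageHash, knownHashes, threshold) are mine, the toy 8x8 average hash stands in for the real NeuralHash, and none of the actual cryptography (safety vouchers, private set intersection, blinded databases) appears here:

import Foundation

// Toy 8x8 average hash standing in for a real perceptual hash. NeuralHash is a
// neural network; this just produces the same kind of output, a 64-bit fingerprint.
func averageHash(_ gray8x8: [UInt8]) -> UInt64 {
    precondition(gray8x8.count == 64)
    let avg = gray8x8.reduce(0) { $0 + Int($1) } / 64
    var bits: UInt64 = 0
    for (i, pixel) in gray8x8.enumerated() where Int(pixel) > avg {
        bits |= 1 << UInt64(i)
    }
    return bits
}

// The gate: count how many photos hash-match entries in a fixed list of *known*
// images, and only cross the reporting line past a threshold. Nothing here asks
// "what is in this picture?" -- what gets flagged is defined entirely by the
// contents of knownHashes, which is why swapping in a different hash list (the
// feared "minor tweak") changes the target without touching this logic at all.
func exceedsReportingThreshold(photoHashes: [UInt64],
                               knownHashes: Set<UInt64>,
                               threshold: Int) -> Bool {
    let matches = photoHashes.filter { knownHashes.contains($0) }.count
    return matches >= threshold
}

The thing to notice is that the code never asks what a picture depicts; whether it can be repurposed depends entirely on who controls the hash list.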
 
  • Like
Reactions: BurgDog
Seems convoluted if you are saving 20 or 30 pictures at a time. Besides, considering this program will be provided to third-party apps, I assume they will scan all iCloud data, not just the photos app.
Saving 20-30 pictures with an iPhone is convoluted anyway. You need to long-press each of them and choose “Save to Photos”. Using the Share button is not many more taps.

And let’s be frank: pedophiles are usually “regular people with a regular job and family”. How many of them save their stash in the Photos app, which holds the holiday pictures and so on that others will probably see?
Hey, even regular people with LEGAL porn don’t save it in the Photos app, because that app gets shown to friends and family.

It’s a badly implemented system and it infringes on our privacy.
 
  • Like
Reactions: sog1927 and BurgDog
Seems convoluted if you are saving 20 or 30 pictures at a time. Besides, considering this program will be provided to third-party apps, I assume they will scan all iCloud data, not just the photos app.
Of course, we know that iCloud already stores movement profiles, the IP gateways you've used, and so on. If this is news to you, you can request disclosure from Apple, but it will take several days before you receive your data ZIP. I assume the iCloud structure and its access paths are currently too complex for governments, and that direct access to specific iPhones is the only efficient way.

sorry, again:
 
Saving 20-30 pictures with an iPhone is convoluted anyway. You need to long-press each of them and choose “Save to Photos”. Using the Share button is not many more taps.
One more step would double the number needed.
 
Fundamentally people understand what is going on. They don't like the feel of it.

This isn't just dumb customers who don't understand technology.

Apple's own employees have been registering their own concerns on internal communication systems.

Reuters reports have been wrong many times.

Apple did a really awful job of communicating; just laying out technical documents to explain what they were doing and then allowing the media to write the script. This allowed externals to build the narrative around privacy invasion, and all kinds of privacy groups who Apple usually cites as being supportive of them and their policies have been piling on.

Sure, which Craig Federighi admitted.


Apple did relatively quickly recognise they were shredding their reputation as the privacy company, and are now trying to undo the damage. The problem is it all looks a bit desperate now.

Purely subjective. Agree to disagree.

Rene Ritchie did an outstanding job of explaining the whole mess in a lengthy YouTube video. He's always pretty supportive of everything Apple does (to say the least), but even he registered concerns and some unease about it.

Not sure why you brought up Rene Ritchie. He's wrong about many things, including Xbox Game Streaming with respect to Apple's App Store policies. I suspect he can't form an opinion for himself but rather forms opinions based on what generates clicks. He's hardly a figure to follow in the Apple world.
 
  • Disagree
Reactions: dgrey
I'm wondering about a possible DDoS of this disservice. For example, if this mysterious CSAM database were leaked, along with the algorithms it uses, people could craft images en masse that generate signature matches, using splotchy patterns and colors. That would be more than even Apple could handle.
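Purely as a thought experiment, against a toy average hash that flood would look something like the sketch below. The real NeuralHash is a neural network, so an actual collision search would need gradient-based tricks rather than this random hill climb, and every name here is made up:

import Foundation

// Toy 8x8 average hash as a stand-in for a real perceptual hash.
func averageHash(_ pixels: [UInt8]) -> UInt64 {
    let avg = pixels.reduce(0) { $0 + Int($1) } / pixels.count
    var bits: UInt64 = 0
    for (i, pixel) in pixels.enumerated() where Int(pixel) > avg {
        bits |= 1 << UInt64(i)
    }
    return bits
}

// Hill-climb random single-pixel tweaks until the image's hash equals a target
// hash from the (hypothetically) leaked database. The result is exactly the
// "splotchy patterns and colors" described above: meaningless noise that matches.
func forgeCollision(target: UInt64, maxAttempts: Int = 5_000_000) -> [UInt8]? {
    var pixels = (0..<64).map { _ in UInt8.random(in: 0...255) }
    var bestDistance = (averageHash(pixels) ^ target).nonzeroBitCount
    for _ in 0..<maxAttempts {
        if bestDistance == 0 { return pixels }          // exact hash match reached
        var candidate = pixels
        candidate[Int.random(in: 0..<64)] = UInt8.random(in: 0...255)
        let distance = (averageHash(candidate) ^ target).nonzeroBitCount
        if distance <= bestDistance {                   // accept sideways moves to escape plateaus
            pixels = candidate
            bestDistance = distance
        }
    }
    return nil                                          // gave up; retry from a new random start
}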
 
  • Like
Reactions: IG88
One more step would double the number needed.

Do you honestly think this one step is a huge deal to pedophiles?

Think about it yourself: you're browsing the net and find a legal porn pic. Do you take the extra tap to keep a clean Photos app, or do you keep porn in the middle of your family pictures because that extra step was such a pain?
 
  • Like
Reactions: BurgDog
I'm shocked that 99% of them don't actually bother getting to know how it works. They even talk about backdoors without knowing how that would even be possible, lol. But sure, hop on the trend and say you don't like this feature.
There is a piece of code on your phone that’s not on your side. Apple could have done the matching locally and simply refused to upload matching photos to Apple's servers, without disclosing to Apple or anyone else that a match had been found. That would have achieved the same goal of keeping the heat off Apple.
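Something like the following, with entirely hypothetical names (this is not Apple's API): the match still happens on device, but the only consequence is that the photo never leaves the phone. No voucher, no report, no human review.

import Foundation

// Hypothetical client-side gate: photos whose perceptual hash is on the known
// list are simply skipped during sync. Nothing is disclosed to any server.
struct PendingPhoto {
    let id: String
    let perceptualHash: UInt64
}

func syncToCloud(photos: [PendingPhoto],
                 knownHashes: Set<UInt64>,
                 upload: (PendingPhoto) -> Void) {
    for photo in photos {
        if knownHashes.contains(photo.perceptualHash) {
            continue    // silently refuse the upload; no safety voucher, no human review
        }
        upload(photo)
    }
}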
 
Or rather, you tunnel-vision so hard that you've given up foreseeing the trend of where things would go.
The CSAM system scans images, which can be repurposed to scan any other type of images in the future. The technology is ready at mass scale.

"which can be repurposed to scan any other type of images"

How would a government use this to catch certain kinds of people?

I believe the opposite of you. It's poor at discovering a category of pictures; it only finds exact matches of known pictures and their derivatives.
 
Fundamentally people understand what is going on. They don't like the feel of it.

This isn't just dumb customers who don't understand technology.

Reuters reports have been wrong many times.

Sure, which Craig Federighi admitted.

just laying out technical documents to explain what they were doing and then allowing the media to write the script. This allowed externals to build the narrative around privacy invasion, and all kinds of privacy groups who Apple usually cites as being supportive of them and their policies have been piling on.

Purely subjective. Agree to disagree.

Not sure why you brought up Rene Ritchie. He's wrong about many things, including Xbox Game Streaming with respect to Apple's App Store policies. I suspect he can't form an opinion for himself but rather forms opinions based on what generates clicks. He's hardly a figure to follow in the Apple world.
Don't split us. You know dummastudetto is rather precise in his analysis.
 
Do you honestly think this one step is a huge deal to pedophiles?

Think about it yourself: you're browsing the net and find a legal porn pic. Do you take the extra tap to keep a clean Photos app, or do you keep porn in the middle of your family pictures because that extra step was such a pain?
It’s not about hiding content I don’t want seen on my Apple TV. Your suggestion forces me to hide every photo and not use the photos app at all.
 
"which can be repurposed to scan any other type of images"

How would a government use this to catch certain kinds of people?

I believe the opposite of you. It's poor at discovering a category of pictures; it only finds exact matches of known pictures and their derivatives.
macOS currently does both: it generates a face hash very accurately, and at the same time it characterizes abstract structures (e.g. grain fields). Apple is very good at this.
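Some of that on-device analysis is even exposed to developers through the Vision framework. A minimal sketch below; to be clear, this is not how Photos is wired up internally, and the face-recognition grouping in Photos uses private machinery, but it shows that face detection and scene/object classification both run locally:

import Foundation
import Vision

// Run face detection and scene classification on one image, entirely on device.
func analyze(imageURL: URL) throws {
    let faceRequest = VNDetectFaceRectanglesRequest()   // where are the faces?
    let sceneRequest = VNClassifyImageRequest()         // broad labels like "field", "dog", "document"

    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([faceRequest, sceneRequest])

    let faceCount = faceRequest.results?.count ?? 0
    let labels = (sceneRequest.results as? [VNClassificationObservation] ?? [])
        .filter { $0.confidence > 0.5 }
        .prefix(5)
        .map { "\($0.identifier) (\($0.confidence))" }

    print("faces found: \(faceCount)")
    print("top labels: \(labels)")
}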
 
It’s not about hiding content I don’t want seen on my Apple TV. Your suggestion forces me to hide every photo and not use the photos app at all.

Maybe I’m not writing clearly. You should of course save pictures you don’t want to hide in the Photos app, using the default behavior.

But if you want to hide a photo, then the Photos app will probably not be the wisest choice. I’m just saying that most CSAM material is not stored in the Photos app but rather somewhere else on the phone or in iCloud. The Photos app is reserved for friends and family, not for your legal or illegal porn collection.
Hey, you can save CSAM in the Notes app and be perfectly safe under the current implementation.

All I’m saying is that this is a bad way of trying to catch pedophiles, and it infringes on our privacy. The number of children it will help will be marginal.
Why not scan every photo that you save on your device or upload via Share? That would protect children far better.

Just to be clear, I’m pro-privacy and think this whole thing is a mess. And the implementation is laughable.
 
  • Like
Reactions: BurgDog
"which can be repurposed to scan any other type of images"

How would a government use this to catch certain kinds of people?

I believe the opposite of you. It's poor at discovering a category of pictures; it only finds exact matches of known pictures and their derivatives.
This CSAM scan can’t rely on pixel-perfect analysis, otherwise it’s going to be weak. As for “categories” of pictures: with enough samples it is still possible to recognise object categories in a given photo. It does require more local storage for the category learning data, but that’s mostly it, imo.
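That's the distinction a lot of people are missing. Many perceptual-hash schemes match with a small Hamming tolerance so crops and recompressions still hit, while Apple's NeuralHash is instead trained so derivatives map to the same hash, but the effect is similar: the question asked is "is this near a hash already on the list", not "what is in this picture". A toy sketch, with my own names and a placeholder tolerance rather than Apple's parameters:

// Fuzzy match: a photo counts as a "derivative" of a known image if its 64-bit
// perceptual hash is within a few flipped bits of a hash already on the list.
// Recognising a *category* of image instead would need a trained classifier,
// i.e. a different piece of machinery shipped alongside this.
func matchesKnownImage(_ photoHash: UInt64,
                       knownHashes: [UInt64],
                       maxHammingDistance: Int = 10) -> Bool {
    return knownHashes.contains { (photoHash ^ $0).nonzeroBitCount <= maxHammingDistance }
}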
 
What is your theory on why Apple, after years of positioning themselves as leaders in privacy, would choose to implement this policy now?

My speculation: they are feeling pressure from numerous places that they aren't doing enough to fight child pornography and other crimes, and aren't helping law enforcement enough with evidence gathering.

They would probably be pressured to add scanning to iCloud, and they don't want to break encryption to do it.

It's also possible that in meetings with NCMEC they really do feel they should do something.
 
  • Like
Reactions: sog1927 and bobob
Maybe I’m not writing clearly. You should of course save pictures you don’t want to hide in the Photos app, using the default behavior.

But if you want to hide a photo, then the Photos app will probably not be the wisest choice. I’m just saying that most CSAM material is not stored in the Photos app but rather somewhere else on the phone or in iCloud.
Hey, you can save them in the Notes app and be perfectly safe.

All I’m saying is that this is a bad way of trying to catch pedophiles, and it infringes on our privacy. The number of children it will help will be marginal.
Why not scan every photo that you save on your device or upload via Share? That would protect children far better.

Just to be clear, I’m pro-privacy and think this whole thing is a mess.
I’m saying I want the same level of privacy for a photo I took of my TV’s HDMI ports as I do for surgery, laboratory, and romantic events. The content of the image in no way diminishes my right to privacy.

This in no way protects children, and as far as I can tell it drives abuse up, not down. No one wins by scanning photos.

Why not hide the images we are concerned about?
 

Attachments

  • 20BF5972-3B9D-415D-A778-670938329547.png
  • Like
Reactions: Ethosik and BurgDog
Looking through this thread page by page, not many Apple customers are enthused about this new CSAM idea, whether it’s over privacy concerns, backdoor concerns, encryption concerns, etc.
 
Today I spoke to a family friend who happens to be an IT lawyer and deals with big companies all the time. He said that, to him, it looks highly unlikely that this kind of general scanning of user devices would be permissible under European law. However, he said some European divisions of US companies are permitted to apply US standards to their European businesses as well, so it's not yet clear to him whether Apple could practically roll this out in the EU.
He also said law enforcement already has every tool it needs to search for criminal material, and Apple would not be required to look for child porn on its own.
 
  • Like
Reactions: BurgDog