So if I understand this correctly, when forced by the government (EU) to start recording specific users' IP addresses, they complied.

Note: I use this service.
Basically. In the first case, France had Interpol ask the Swiss authorities to obtain details from Proton. It appears France designated the actions of the protestors as terrorism in order to fall under a category in Swiss law that would enable the Swiss to force Proton to comply. While I don't condone the protestors entering and squatting in private commercial properties, it certainly is not true terrorism. There is quite an outcry on Reddit and Twitter regarding ProtonMail, what it did, and how its marketing material may have conveniently hidden the fact that in some cases it does log IP information.
 
Well, this debate has certainly opened my eyes to the attitudes of many members of this forum: they do not want filth, they dislike filth, but only so long as it does not affect them personally.

In my opinion it's an appalling attitude. Yes, I know I will get many downvotes, but gauging from members' attitudes on the issue, that's to be expected.

Protection of children is paramount, and if CSAM detection goes some way towards stopping child filth images being distributed, or even to the point of catching those involved in distributing them, then I am all for it, however Apple wants to implement it.

I accept that people are allowed their views and opinions on this issue, but some of the views and opinions expressed by some members in here are appalling in my opinion, and therefore I no longer wish to debate with people in here because they disgust me. Enjoy your debating.

You do realize that this isn't about CSAM?
It is about force-installing nannyware or spyware on our personal devices.
CSAM is the current “excuse” for why we should allow this.
 
Basically. In the first case, France had Interpol ask the Swiss authorities to obtain details from Proton. It appears France designated the actions of the protestors as terrorism in order to fall under a category in Swiss law that would enable the Swiss to force Proton to comply. While I don't condone the protestors entering and squatting in private commercial properties, it certainly is not true terrorism. There is quite an outcry on Reddit and Twitter regarding ProtonMail, what it did, and how its marketing material may have conveniently hidden the fact that in some cases it does log IP information.

Thanks!
I’ll have to check this out as I use their services.
 
Well, this debate has certainly opened my eyes to the attitudes of many members of this forum: they do not want filth, they dislike filth, but only so long as it does not affect them personally.

In my opinion it's an appalling attitude. Yes, I know I will get many downvotes, but gauging from members' attitudes on the issue, that's to be expected.

Protection of children is paramount, and if CSAM detection goes some way towards stopping child filth images being distributed, or even to the point of catching those involved in distributing them, then I am all for it, however Apple wants to implement it.

I accept that people are allowed their views and opinions on this issue, but some of the views and opinions expressed by some members in here are appalling in my opinion, and therefore I no longer wish to debate with people in here because they disgust me. Enjoy your debating.

You are just subscribing to a cause out of convenience. Prior to Apple's announcement last month, nobody (in this forum at least) was advocating that Apple should be doing more to stop child sexual abuse. Apple could have been doing more the entire time (at least since the introduction of iCloud). They chose not to. Their only effort to date is one that breaks established privacy norms, opens my personal device to future privacy abuses, and destroys all the goodwill they worked so hard over the years to establish regarding user privacy. I am not OK with Apple stopping CSAM "however Apple wants to implement it." That is a lazy and reckless position that makes you unworthy of any serious debate, so it's probably for the better that you want to sit on the sidelines now.
 
Well, these 90 organisations probably do not trust Apple, or perhaps any other tech company, much. I am not naive either, but Apple's proposed solution, according to them, would only trigger an alarm when there are over 30 of these hashed pictures on any given device, and then those pictures would be reviewed by Apple staff first. I also hope these pictures can then be traced back to their source? There are so many ways to frame somebody that this does not seem like a particularly easy one to me. 🤷🏻‍♂️
Then you missed the hacker who, within days, created a tool that could imprint data onto innocent images so that they generate the same hashes as entries in the database.
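
To see why such collisions are plausible, here is a toy sketch. This is not Apple's NeuralHash, just a simple 8×8 "average hash" standing in for it, and the file names are hypothetical: because the hash only captures a coarse brightness pattern, an attacker can nudge the pixels of an innocent image until its hash exactly matches a database entry.

```python
# Toy perceptual hash (aHash), standing in for NeuralHash for illustration.
# The hash only encodes an 8x8 above/below-mean brightness pattern, so
# visually innocent images can be tweaked to reproduce a target's hash.
from PIL import Image

def average_hash(path: str) -> int:
    """64-bit hash: downscale to 8x8 grayscale, one bit per above-mean pixel."""
    img = Image.open(path).convert("L").resize((8, 8), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits; 0 means an exact collision."""
    return bin(a ^ b).count("1")

# Hypothetical file names, for illustration only.
target = average_hash("image_matching_database_entry.png")
crafted = average_hash("innocent_photo_adversarially_tweaked.png")
print(hamming(target, crafted))  # 0 => false-positive match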
 
The fact that no one important enough at Apple sat back and went "This is dangerous and an invasion of privacy for all users" makes it abundantly clear that all of the other privacy initiatives Apple has put out there are nothing more than performative marketing.

Google is worse, and I'll stick with my iPhone (because I prefer the experience), but people really need to stop drinking the Flavor-Aid.
 
Your point being, if we (the royal we) can't do the above, we shouldn't do any of it?
No. If someone is going to be a nanny, then at least they should be consistent instead of cherry-picking issues in order to attack people who believe in freedom and privacy.

If someone really cares about the children, then I expect them to be consistent and care about all children, and therefore logically to spend the most time, effort, and intensity on the issues that pose the biggest dangers to children.

Otherwise they are just grandstanding for appearances, not trying to effect real change.
 
Then you missed the hacker who, within days, created a tool that could imprint data onto innocent images so that they generate the same hashes as entries in the database.

I am aware of that, yet I still think these are quite extreme scenarios for most people. I am not stuck in my opinion and will be following any further developments.
 
Well, this debate has certainly opened my eyes to the attitudes of many members of this forum: they do not want filth, they dislike filth, but only so long as it does not affect them personally.

In my opinion it's an appalling attitude. Yes, I know I will get many downvotes, but gauging from members' attitudes on the issue, that's to be expected.

Protection of children is paramount, and if CSAM detection goes some way towards stopping child filth images being distributed, or even to the point of catching those involved in distributing them, then I am all for it, however Apple wants to implement it.

I accept that people are allowed their views and opinions on this issue, but some of the views and opinions expressed by some members in here are appalling in my opinion, and therefore I no longer wish to debate with people in here because they disgust me. Enjoy your debating.

Nice try. If that were truly all you cared about, you would've left Apple a long time ago, as they've done practically nothing to catch CSAM to date compared to every other big tech company out there. It's convenient that you take this attitude now, after years of ignoring the issue in the name of privacy.
 
No. If someone is going to be a nanny, then at least they should be consistent instead of cherry-picking issues in order to attack people who believe in freedom and privacy.

If someone really cares about the children, then I expect them to be consistent and care about all children, and therefore logically to spend the most time, effort, and intensity on the issues that pose the biggest dangers to children.

Otherwise they are just grandstanding for appearances, not trying to effect real change.
Based on the first paragraph, it does seem like you're proposing all or nothing.
 
If you don't understand this, just ask yourself whether you think someplace inside Apple there will be a server full of child porn waiting to be compared to images on your phone… or whether those images and services will be provided by governments, who would then need direct, unencrypted, instant access to your device. Scary.

This means you don't understand the CSAM Detection System. Apple doesn't need access to CSAM, and NCMEC doesn't need access to users' images, whether they're in iCloud or on the phone.
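
For what it's worth, here is a minimal sketch of the matching model Apple described, assuming a heavily simplified design: a SHA-256 digest stands in for the real perceptual hash, the digests and paths are placeholders, and the blinded database, private set intersection, and threshold secret sharing of the actual system are all omitted. The point it illustrates: the clearinghouse distributes digests, never images, and the device would surface only match signals, never photos.

```python
# Minimal, simplified sketch of hash-list matching (not Apple's actual design).
import hashlib
from pathlib import Path

# What a clearinghouse like NCMEC distributes: digests of known images only.
known_digests = {
    "placeholder_digest_1",  # hypothetical entries for illustration
    "placeholder_digest_2",
}

def digest(photo: Path) -> str:
    """Hash of the photo's bytes; the image itself never leaves the device."""
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def count_matches(library: Path) -> int:
    """Count local matches; only a signal like this count would ever leave."""
    return sum(1 for p in library.glob("*.jpg") if digest(p) in known_digests)

THRESHOLD = 30  # per Apple's public description, no alert below ~30 matches
if count_matches(Path("Photos")) > THRESHOLD:  # hypothetical library path
    print("threshold exceeded; flagged vouchers would go to human review")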
 
Nice try. If that were truly all you cared about, you would've left Apple a long time ago, as they've done practically nothing to catch CSAM to date compared to every other big tech company out there. It's convenient that you take this attitude now, after years of ignoring the issue in the name of privacy.
He just doesn't know; it's hard to argue against someone so dead-set on being wrong and so uninterested in the truth. He said this would protect his children from people stealing their phones and uploading their nude photos, which it clearly doesn't. Honestly, statistically, it's parents and close relatives who cause harm to children. He should be looking elsewhere if he's concerned about his children's safety.
 

"It is a tale
Told by an idiot, full of sound and fury,
Signifying nothing."

You're just full of bravado, going off at Apple because you are angry.

Apple has always been a secretive company, especially since Jobs took over again in '96. To expect openness and detailed explanations from Apple is to expect Apple to be a company it probably never has been.

Also, it's not a backdoor when you are told about it. A backdoor implies secrecy. There is no secrecy here.
 
Protection of children is paramount, and if CSAM detection goes some way towards stopping child filth images being distributed, or even to the point of catching those involved in distributing them, then I am all for it, however Apple wants to implement it.
Well, what is your stance on killing babies with a heartbeat, or otherwise viable, in the womb? After all, a baby cannot be more helpless than in the last few months before birth.

If you're like most techies, you'll probably make an exception for that!

If you do, then your argument is all about your self-image and emotional state rather than the stated reason of protecting the children, because you do not demonstrate that protecting children is your highest priority.
 
Once they can compare file hashes for this, they can do it for anything and compare against all kinds of files. That will allow ultimate tracking across all your activities.

Not to mention the possibility of a malicious app casually writing a bunch of these types of photos to a person’s library, instant sync to iCloud, boom life ruined.

This system is idiotic. The fact that Apple even thought it was a good idea is idiotic and people who think it’s a good idea need to use their brains a little harder to see beyond this ridiculous “save the children” narrative. It’s not Apple’s job.
They can't do it for anything.

The CSAM Detection System is very bad at detecting images that are merely similar in nature; it can only match specific known images.

What's the hash of an illegal protest in Hong Kong?
How would you produce one?
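
A toy sketch of the point, using a cryptographic hash as a stand-in for the perceptual one (the real hash tolerates small edits, which this one doesn't, but the argument is the same), with made-up bytes and a made-up database entry: matching can only flag copies of specific, previously hashed images, so a brand-new photo of anything, protest included, matches nothing.

```python
# Hash matching flags known files, not categories of content. A photo that
# was never hashed into the database cannot match, no matter what it depicts.
import hashlib

# Hypothetical database built from one known image's bytes.
database = {hashlib.sha256(b"bytes of a known, catalogued image").hexdigest()}

new_photo = b"bytes of a brand-new photo nobody has catalogued"
print(hashlib.sha256(new_photo).hexdigest() in database)  # False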
 
I don’t know how we got from Tim and Apple vehemently defending against things like this to doing it without a legislative mandate. But I fear that this is appeasement of some kind of covert coercion from one or more government officials, or politicians, in the US or China.

A Republican senator told Apple directly in Congress that if Apple didn't do anything, he would make sure to create laws that would force Apple to do something.
 
So it's okay if they search you every time you leave the house for illicit photos? It's not okay to me!

If I were leaving the house with said photos and then decided to hand them over to a public corporation for safekeeping? Yes.

If leaving the house and keeping for personal use, and for no one else to see. No. That’s my business.

iCloud is just like the first option. You’re choosing to hand it over by having iCloud photo library enabled.
You have the choice to say no.
 
The US Constitution prohibits all governments under its authority from conducting extrajudicial searches, for very good and non-academic reasons.

Apple acting as an arm's-length extrajudicial searcher for any government is a terrible precedent.

It's not a precedent; it has been this way since 1791.

In the beginning, the Fourth Amendment applied only to the federal government and not the states. It was only in the 20th century, particularly from the thirties onward, that the US Supreme Court started applying the Bill of Rights to the states, and only slowly.
 
And there’s the problem. One thing that the “think of the children” brigade are determined to ignore is that folk don’t have a problem with the actual scanning. The problem is doing it on the phone.

Apple, Google, MS and Facebook have been running server-side scans for ages, but Apple doesn't seem to catch anywhere near as many images as the others. So either pedophiles don't use iCloud (unlikely) or Apple's server-side scan doesn't work as well as the competition's.

The difference, I guess, is that the other companies run their services on their own hardware; iCloud sits on top of AWS and Google services. Now imagine the increased billing they’d get from Google and Amazon if they have to process every single file, as opposed to whatever they’re doing now that gives them such a poor hit rate.

I can see why they’d want their customers to shoulder the burden.

Apple only actively scans iCloud mail. How many people use iCloud mail?
 
If I were leaving the house with said photos and then decided to hand them over to a public corporation for safekeeping? Yes.

If leaving the house and keeping for personal use, and for no one else to see. No. That’s my business.

iCloud is just like the first option. You’re choosing to hand it over by having iCloud photo library enabled.
You have the choice to say no.

The fallacy in that analogy is that you have not handed them over when they are searched; in fact, they are searched before being handed over, using your own resources. In this case, Apple is treating a local process on your device as falling under the same legal jurisdiction, and bound by the same EULA, as a server in their datacenter. That's a pretty dangerous precedent regardless of the nature of the search.
 
"It is a tale
Told by an idiot, full of sound and fury,
Signifying nothing."

You're just full of bravado, going off at Apple because you are angry.

Apple has always been a secretive company, especially since Jobs took over again in '96. To expect openness and detailed explanations from Apple is to expect Apple to be a company it probably never has been.

Also, it's not a backdoor when you are told about it. A backdoor implies secrecy. There is no secrecy here.

Oh, a secondary front door then.
 