I think you need to re-think this as well.

If your children grew up in the era of the internet and camera phones, what if someone had used their phone to take indecent pictures of your children when they were younger and uploaded them to some cloud service? Wouldn't you have wanted some type of system in place to prevent that from happening and to stop the pictures from being distributed?

Or are you still of the view that even if a system was in place to prevent that from happening, you would not want it because it intrudes on your privacy?

Just because someone is against this doesn't mean we're against finding and getting rid of this filth.

There are many who would be 100% OK if Apple was doing this all server-side, like every other company does. It's the client-side part that's the problem. And yes, I fully understand how this works, how it's implemented, and that it still requires iCloud being enabled to work.
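To make the distinction concrete, here is a rough Python sketch of where the scan runs in each model. Everything in it is made up for illustration; the hash function, the database, and the function names are not Apple's actual code:

```python
import hashlib

# Hypothetical database of known-bad hashes (illustration only).
KNOWN_CSAM_HASHES: set[str] = set()

def perceptual_hash(photo: bytes) -> str:
    """Toy stand-in for a perceptual hash such as PhotoDNA or NeuralHash."""
    return hashlib.sha256(photo).hexdigest()

def server_side_scan(uploaded_photo: bytes) -> bool:
    """The usual industry model: match AFTER the photo reaches the server."""
    return perceptual_hash(uploaded_photo) in KNOWN_CSAM_HASHES

def client_side_scan(photo: bytes, icloud_photos_enabled: bool) -> bool:
    """Apple's proposal: match ON the device, only for photos queued for iCloud."""
    if not icloud_photos_enabled:
        return False  # iCloud Photos off: no scan is performed at all
    return perceptual_hash(photo) in KNOWN_CSAM_HASHES
```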
 
When you receive this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know you’re doing something right.

I think many people are truly unaware of the staggering prevalence of child abuse in society; if people knew how common and widely distributed the material is, they might throw some support behind this.

Meanwhile, your government is actively tracking your location everywhere you go; QR code check-ins show what places you visit and how long you stay. CCTV exists on every corner and every traffic light, monitoring your movement patterns through facial recognition and number plates. Every time you tap and buy something, you reveal more of yourself. Surprisingly, none of this makes people revolt in protest, when it should, and yet the idea of Apple implementing a child-protection feature has everyone crying “encryption!”
I think people are aware. And the biggest offenders in child abuse and sex-trafficking rings are famous people: rich celebrities, actors, producers, politicians, etc. Just recently I read about this.

"Oprah Winfrey said on Friday that she was cutting ties with a documentary centered on women who have accused the music mogul Russell Simmons of sexual misconduct. The untitled film, scheduled to have its premiere this month at the Sundance Film Festival, focuses primarily on the executive Drew Dixon, who accused Mr. Simmons of raping her, an accusation Mr. Simmons has repeatedly denied."

What was Apple's reaction? "Apple declined to comment."

Of course this is not child-abuse related, but it is disgusting and hypocritical of Apple to preach about protecting children and women while at the same time taking no action when there is clear indication that highly prominent public figures are caught up in such stories.
 
Does Apple own MY phone? Last time I checked, it's MY phone and it's in MY house. I didn't rent it.

A condition of using iCloud, at least for photos, would be consenting to on-device CSAM scans. If you don't consent, then don't use iCloud for photos, and no scans will be performed.
 
For the millionth time: it's not Apple's job to do this!

Want to protect our children? Either donate funds to the FBI teams dealing with this issue, or quietly lobby Congress to pass a law requiring ALL who store our photos to scan for CSAM. Apart from that, GTFO of my devices!

I'm a paying customer. I don't like having a finger pointed at me: 'Hey! Let me check you. You COULD be a criminal.' This is no way to treat your loyal customers.
Exactly. This is a nice way of saying it's the same as the Gestapo demanding, "Show me your papers," to prove you are a good person. Can't wait to get trained in that special salute to let everyone know that I am a fully indoctrinated government supporter.

We (in the US) used to have concepts like innocent until proven guilty, due process, no unreasonable searches, etc. These seem to have fallen by the wayside due to fear and relentless pro-government marketing.

Y'all who want this kind of government overreach and oversight are not going to like the consequences. Of course, it will be your children, and their children, that suffer, so no worries.
 
Just because someone is against this doesn't mean we're against finding and getting rid of this filth.

There are many who would be 100% OK if Apple was doing this all server-side, like every other company does. It's the client-side part that's the problem. And yes, I fully understand how this works, how it's implemented, and that it still requires iCloud being enabled to work.
Please explain this line to me: 'Just because someone is against this doesn't mean we're against finding and getting rid of this filth.' If a person is against having CSAM scanning on their device, then they are against finding and getting rid of this filth. There is no middle ground here.

You cannot say 'we are against it but we are not going to allow you to check our device'. It does not work like that, no matter how you try to word it or spin it.
 
Please explain this line to me: 'Just because someone is against this doesn't mean we're against finding and getting rid of this filth.' If a person is against having CSAM scanning on their device, then they are against finding and getting rid of this filth. There is no middle ground here.

You cannot say 'we are against it but we are not going to allow you to check our device'. It does not work like that, no matter how you try to word it or spin it.

Yes, you absolutely can. It's not that black and white.

If the police came knocking on your door and said, "Hey, we're here to search your house for CSAM. We won't do anything to you unless you leave the house with it. But we need to at least know it's here and get that documented."

Would you simply let them in without a warrant or anything else? Just at their word? If you would, then we'll end up disagreeing on this altogether and there's really no point in continuing this discussion with you.

But the other point is this: they can't do that without a warrant, and they can't get a warrant without due process and probable cause.
 
I have also been a programmer for more than 28 years. You can be a developer and still lack system-design thinking and skills; that is one of the reasons I pay salaries to devs and not the other way around. The design flaw in Apple's approach is that they are introducing on-device processing with third-party, non-publicly-auditable hashes. Period. The industry has used server-side processing with PhotoDNA for years, and that is enough.
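Since PhotoDNA itself is proprietary, here is an illustrative Python stand-in for the general shape of that server-side screening: a simple average hash compared against a database by Hamming distance. This is a toy, not PhotoDNA:

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """64-bit average hash: one bit per pixel, set where brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_database(photo_hash: int, known_hashes: set[int], threshold: int = 5) -> bool:
    """Near-duplicate match: small Hamming distance to any known hash."""
    return any(hamming(photo_hash, h) <= threshold for h in known_hashes)
```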
I've just watched the video you attached. Was it meant as evidence supporting your claim? First of all, the guy looked at just one aspect of the solution, and more importantly, he didn't use Apple's code, just other open-source code. Hardly proof of anything, and he even acknowledges that in the video by saying the issues he experienced could potentially also occur with Apple's code. Talk about jumping the gun with conclusions, goodness me.
 
This debate will get nowhere because it is clear there are those who will vehemently defend their right to privacy over anything else, in this case the protection of children.

The thing is, don't children deserve our protection? Is it not up to us, as adults, to find ways to protect children from harm? It's a damning question, but given the way people are defending their right to privacy, they need to ask themselves: is your right to privacy more important than the protection of a child? I think members here are too scared of being judged if they reply 'Yes' to that question.

The situation is even more complex, well, at least in my case. I support Apple's solution because I value my privacy whilst at the same time recognising that CSAM scans must be done in order to protect kids. Those against Apple's solution who suggest server-side scans instead also claim to be protecting privacy.
 
This debate will get nowhere because it is clear there are those who will vehemently defend their right to privacy over anything else, in this case the protection of children.

The thing is, don't children deserve our protection? Is it not up to us, as adults, to find ways to protect children from harm? It's a damning question, but given the way people are defending their right to privacy, they need to ask themselves: is your right to privacy more important than the protection of a child? I think members here are too scared of being judged if they reply 'Yes' to that question.
What is your position on protecting children when they are in the womb and have a heartbeat?

Thousands of children die every year in car accidents; what is your activism for protecting them? In all honesty, it seems the convenience of automobiles is more important than protecting the children.

Thousands of children die every year from drowning; what is your activism for protecting them? For ages 1 to 4 it is the leading cause of death. In all honesty, is allowing children around water more important than protecting them? Why doesn't Apple scan for images around water to protect the children?

Death from child abuse is really far down the list of child-protection problems. If you were being honest with yourself, I think you would prefer a live abused child to a dead non-abused child.

The point being, the feature will not protect children; it exists only to make people feel better about protecting children.

The other point is that life is not easy. It is not safe. It is not possible to protect children from everything in the world. 90% of child abuse comes from family or friends of the family. Look it up. Want to protect children? Be really concerned about who your friends are. Do your part and let others do theirs. It is not Apple's job to be the police. It is not your job to protect anyone outside of your family.
 
That's exactly the point; for this reason, Apple should remove this "feature" from iOS and move image scanning to the cloud. I don't accept spyware on a device I paid for.
As my other posts state, this just degrades the security of everyone else's non-abuse photos.
To maintain 100% server-side scanning, Apple cannot maintain end-to-end encryption for normal photos, meaning law enforcement, or whoever can get the keys, can decrypt those images and see anything we've uploaded.

If you're all for privacy and whatnot, you want this feature to go live, to help keep other actors out of our normal photos.
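Here is a toy Python sketch of that tradeoff, nothing like Apple's actual protocol: if the server is to do the scanning itself, it has to hold a key that can decrypt our photos, whereas an on-device match can ride along with a ciphertext the server cannot open (keys here would come from Fernet.generate_key()):

```python
from cryptography.fernet import Fernet  # pip install cryptography

def upload_for_server_side_scan(photo: bytes, server_held_key: bytes) -> bytes:
    # The server keeps a copy of the key so it can decrypt and scan the photo
    # later; by definition this is not end-to-end encryption.
    return Fernet(server_held_key).encrypt(photo)

def upload_with_on_device_match(photo: bytes, device_only_key: bytes,
                                matched: bool) -> tuple[bytes, dict]:
    # The key never leaves the device. The server stores only the ciphertext
    # plus a small "voucher" recording the on-device match result. (Apple's
    # real vouchers use private set intersection and threshold secret sharing;
    # this dict is a vast simplification.)
    ciphertext = Fernet(device_only_key).encrypt(photo)
    voucher = {"csam_match": matched}
    return ciphertext, voucher
```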
 
Too many people here have no actual idea of what Apple is proposing.

Too many people here are concerned about the rights of child abusers.
Nobody here is concerned about the rights of child abusers. We are very concerned about the rights of innocent people to have privacy and not be automatically treated as criminals just because some nanny wants to be the overlord.
 
What a headline: "EFF Pressures Apple ..."
Regardless of how one feels about Apple's plan, how can the EFF "pressure" Apple? Do this or what?
Poor journalism and clickbait, MR.
 
It appears that even the Chinese government is less invasive than Apple in secretly searching people's computers. Sorry, not secretly searching, openly searching people's files. Reminds me of those computer searches that think a lemon is a breast.
 
I don't believe iCloud files are encrypted.
Yes, they are, but only some are end-to-end encrypted. Photos are not end-to-end encrypted.

 
Yes, you absolutely can. It's not that black and white.

If the police came knocking on your door and said, "Hey, we're here to search your house for CSAM. We won't do anything to you unless you leave the house with it. But we need to at least know it's here and get that documented."

Would you simply let them in without a warrant or anything else? Just at their word? If you would, then we'll end up disagreeing on this altogether and there's really no point in continuing this discussion with you.

But the other point is this: they can't do that without a warrant, and they can't get a warrant without due process and probable cause.
That's not what is happening.
To use your analogy:
Apple's proposal is to scan as the person is "actually leaving the house with said image".
No scans take place if the photo is destined to remain on the device and is not being uploaded to iCloud Photos.

How hard is this to understand?

Apple is not documenting the CSAM match unless the photo is leaving the house (the phone) to go outside (upload to the cloud). Only then do the scan and documenting occur.
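In rough Python pseudocode, the gating described above looks something like this (hypothetical names, heavily simplified):

```python
def upload_to_icloud_photos(photo: bytes, icloud_photos_enabled: bool) -> None:
    if not icloud_photos_enabled:
        return  # photo stays on the device: no scan, no voucher, no upload

    # The hash comparison runs only as part of the upload path.
    matched = on_device_hash_match(photo)
    send_to_icloud(photo, voucher={"csam_match": matched})

def on_device_hash_match(photo: bytes) -> bool:
    # Stub: real code would compare a perceptual hash against the
    # on-device database of known CSAM hashes.
    return False

def send_to_icloud(photo: bytes, voucher: dict) -> None:
    # Stub for the network upload.
    pass
```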
 
What is your position on protecting children when they are in the womb and have a heartbeat?

Thousands of children die every year in car accidents; what is your activism for protecting them? In all honesty, it seems the convenience of automobiles is more important than protecting the children.

Thousands of children die every year from drowning; what is your activism for protecting them? For ages 1 to 4 it is the leading cause of death. In all honesty, is allowing children around water more important than protecting them? Why doesn't Apple scan for images around water to protect the children?

Death from child abuse is really far down the list of child-protection problems. If you were being honest with yourself, I think you would prefer a live abused child to a dead non-abused child.

The point being, the feature will not protect children; it exists only to make people feel better about protecting children.

The other point is that life is not easy. It is not safe. It is not possible to protect children from everything in the world. 90% of child abuse comes from family or friends of the family. Look it up. Want to protect children? Be really concerned about who your friends are. Do your part and let others do theirs. It is not Apple's job to be the police. It is not your job to protect anyone outside of your family.
Your point being that if we (the royal we) can't do all of the above, we shouldn't do any of it?
 
That's not what is happening.
To use your analogy:
Apple's proposal is to scan as the person is "actually leaving the house with said image".
No scans take place if the photo is destined to remain on the device and is not being uploaded to iCloud Photos.

How hard is this to understand?

Apple is not documenting the CSAM match unless the photo is leaving the house (the phone) to go outside (upload to the cloud). Only then do the scan and documenting occur.

It's not hard, and I do understand that. My point was a counter to his point, showing that it's not as straightforward as he made it out to be. The analogy wasn't perfect, but that wasn't the point of my post at all.

Would it have been better if I had said, "Police come knocking and ask to place a CSAM-scanning device in your house"?

My point remains the same. Most wouldn't allow that. And they couldn't do that without a warrant.
 
Please explain this line to me: 'Just because someone is against this doesn't mean we're against finding and getting rid of this filth.' If a person is against having CSAM scanning on their device, then they are against finding and getting rid of this filth. There is no middle ground here.

You cannot say 'we are against it but we are not going to allow you to check our device'. It does not work like that, no matter how you try to word it or spin it.

You are correct IF you live in an authoritarian world.

We don’t. There is a point where human rights count. I seriously doubt anyone posting here is for CSAM. What we are against is the method Apple has chosen to search for it.

Watch the video in post #111.
There are some members of NCMEC (current and retired) who discuss this.
 
Thousands of children die every year in car accidents; what is your activism for protecting them? In all honesty, it seems the convenience of automobiles is more important than protecting the children.


Self-driving cars. This will help protect children and everyone else. It'll eliminate drink and drug driving too, and we can still have the convenience of having a car.
 