It’s important to me because I don’t like all the misinformation that is going around about this.
Misinformation certainly abounds. What makes this topic stand out for you? Is there some part of the misinformation that seems especially problematic? Corporate rights, individual rights, child protection?
 
You can't leave your Mac unattended anymore. One minute and your co-worker or "friend" can put some images on your Mac, and the police will come visit you a few weeks later;

Every macOS user becomes an easy target;

And even if you find and delete the pics, it's too late. You are flagged and the police are coming anyway;
This is hilarious.
You're not that important! Plus your buddies would have already been flagged if it's on their computer!!!
 
Or you could just tell them not to do that kind of thing. If they won’t listen to that advice, then no amount of punishment or shaming is going to help… maybe Apple will send a therapist to your house to assist. Point being, if your kids are doing this, you have already lost.

Come on, when I was that age, when I was told not to do something, I’d just do it on the sly.


But there is a difference between playing, say, GTA 3 when your parents say you shouldn’t, and sending sexting pics of yourself.
At 10, you may not realize the true ramifications of doing that.


To me, the feature is simply a way for parents to keep an eye on their kids. It starts to build trust between parent and child, but gives the kid a gentle reminder that, if they start sharing images that may be deemed “too far”, their parent will get insight into what they are doing.

Basically, it’s a “do you think your parents would be okay with what you’re doing right now?”

I’m 100% against the CSAM scanning on iCloud Photos, though. What I do online should not be accessible to anyone but me.
 
You can't leave your Mac unattended anymore. One minute and your co-worker or "friend" can put some images on your Mac, and the police will come visit you a few weeks later;

Every macOS user becomes an easy target;

And even if you find and delete the pics, it's too late. You are flagged and the police are coming anyway;
This is the problem with any system where you are assumed guilty until proven innocent: it opens the door to abuse. Apple putting such a scanning mechanism into all iPhones worldwide on the basis of child safety means that Apple assumes all iPhone and Mac users are potential predators. And so far Apple has said nothing about due process, an appeal process, or the consequences of scenarios like the one you mentioned. They even went as far as using hashes, which creates plausible deniability for themselves: if something happens, Apple can simply claim ignorance, since "we are only matching hashes" (a rough sketch of what that amounts to follows below). A private company turns into moral police and judge, without checks or balances.

It sends a truly chilling message, especially to people living in countries with extreme moral laws.
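For readers wondering what "we are only matching hashes" means in practice, here is a minimal, purely illustrative Swift sketch of exact-hash matching against a set of known digests. This is not Apple's actual system (Apple's proposal used a perceptual NeuralHash combined with private set intersection and threshold secret sharing); the digest set and function names below are invented for illustration only.

```swift
import Foundation
import CryptoKit

// Hypothetical set of "known" digests. The single entry is the SHA-256 of
// empty input, chosen only so the example is self-contained and runnable.
let knownDigests: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Hash arbitrary data and return a lowercase hex string.
func sha256Hex(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// "Only matching hashes": the matcher never interprets the image content;
// it only checks whether the digest appears in the known set.
func matchesKnownDigest(_ imageData: Data) -> Bool {
    knownDigests.contains(sha256Hex(imageData))
}

print(matchesKnownDigest(Data()))        // true  (digest is in the set)
print(matchesKnownDigest(Data([0x42])))  // false (digest is not in the set)
```

Note that a matcher like this says nothing about how the data got onto the device, which is exactly the concern raised in the quoted post.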
 
You can't leave your Mac unattended anymore. One minute and your co-worker or "friend" can put some images on your Mac, and the police will come visit you a few weeks later;

Every macOS user becomes an easy target;

And even if you find and delete the pics, it's too late. You are flagged and the police are coming anyway;

Do you also test your food for poison with a silver needle before eating?

All these “what if someone I know deliberately uploads child porn to my device?” scenarios sound more like undue concern trolling, as if child porn is so easy to come by and everyone around you is waiting to backstab you, like in an episode of Game of Thrones.
 
That has been an ongoing question: “What is driving this, and why this solution?”
You keep bringing this up, but the truth is only Apple has the answer to that question. Asking us isn’t going to help you find it.
Do you also test your food for poison with a silver needle before eating?

All these “what if someone I know deliberately uploads child porn to my device?” scenarios sound more like undue concern trolling, as if child porn is so easy to come by and everyone around you is waiting to backstab you, like in an episode of Game of Thrones.
Not only that, but the same thing can be done if the scanning happens in the cloud. If someone puts those images on your phone (30 of them), then you’re caught. Also, I think it would be even easier to frame someone if the scanning happens in the cloud, because then all you’d need to do is log into their iCloud account and upload them via the web browser.
 
Not only that, but the same thing can be done if the scanning happens in the cloud. If someone puts those images on your phone (30 of them), then you’re caught. Also, I think it would be even easier to frame someone if the scanning happens in the cloud, because then all you’d need to do is log into their iCloud account and upload them via the web browser.

And Apple isn’t the only company scanning for CSAM; Google does it even more aggressively. If this were a genuine issue, we would have heard of more people being “busted” this way after uploading such images to their Google Drive accounts, and it doesn’t even need to involve Apple devices.

Like I said - just a whole load of concern trolling.
 
And Apple isn’t the only company scanning for CSAM; Google does it even more aggressively. If this were a genuine issue, we would have heard of more people being “busted” this way after uploading such images to their Google Drive accounts, and it doesn’t even need to involve Apple devices.

Like I said - just a whole load of concern trolling.
One company never marketed user privacy; the other did.
 
Come on, when I was that age, when I was told not to do something, I’d just do it on the sly.


But there is a difference between playing, say, GTA 3 when your parents say you shouldn’t, and sending sexting pics of yourself.
At 10, you may not realize the true ramifications of doing that.


To me, the feature is simply a way for parents to keep an eye on their kids. It starts to build trust between parent and child, but gives the kid a gentle reminder that, if they start sharing images that may be deemed “too far”, their parent will get insight into what they are doing.

Basically, it’s a “do you think your parents would be okay with what you’re doing right now?”

I’m 100% against the CSAM scanning on iCloud Photos, though. What I do online should not be accessible to anyone but me.
IMO, vigilant, loving parents don't even give their prepubescent children a smartphone. If someone's giving their kids a poison pill, it's just that: a poison pill. There is no way to mitigate the poison. There is no way to protect the kid from ingesting it. It doesn't matter what you do, they're going to get around any frivolous safety features you try to implement. When it comes to this, you can't have your cake and eat it too. Either you protect your kids by not handing them the poison, or you hand it to them and deal with the eventual consequences... whatever those may be.
 
Wow, so much just in the first 2 pages and it's almost up to 30 pages already!
Oh God! Don’t just delay it. CANCEL THIS, Apple. Can’t you see… people won’t be ordering the new iPhone 13 if you launch this child safety crap.
Say it sister, say it loud and proud!
I understand why this is a slippery slope but I don’t like the idea of child predators breathing a sigh of relief.
Do you like the idea of people being falsely accused of being child predators? Maybe it could be you. Or your wife or husband. Or one of your children. Because that's the possibility we're talking about here if this were to go through.
Just going to wait until everyone forgets and do a quite release a .1 update with some security enhancement etc..
I don't know what a "quite" release is. Now if you're talking about a "quiet" release, that's possible. But woe be to the Apple person who approves such a move.
Didn’t think Tim Apple would respond so quickly to my email!
"Tim Apple"? I get it if you're trolling him, but most of the time your attempt at being cutsie distracts from what you're trying to communicate. In fact, it's so distracting in this case that it makes it impossible to tell if you're just mocking him or if you actually did send him an email.
I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!
There's a new term that's out now, and I don't even think it's in Wikipedia or the Urban Dictionary yet. But I'll fill you in.

It's called "The Tyranny of Only Two Ideas".

Here's an example:

"Well, I have to come up with $30,000 by Friday or we lose the farm!" (basically the plot behind every "Dukes of Hazzard" episode ever broadcast)

Here's another example:

"If you're not with me, then you're against me."

Here's another example:

"I like waffles!"
"Why do you hate pancakes so much?"


You've just given a great 4th example:

"I don't want anybody to have their right to Due Process taken away."
"Then you support pedos and rapists."


Please don't succumb to the Tyranny of (only) two choices. You're smarter than that, right? You do realize that there are ALWAYS more choices than just the two that you've given, ESPECIALLY in your example.
They will end up CANCELLING IT. 🤫

It’s been a hot mess. Even good ol’ Craig (executive) admitted it. He knew Apple’s approach was wrong.

I love your passion, but this fight isn't over yet!
So Apple cares about the screeching minority after all?
"Screeching minority?" How do you even know it's a minority? Answer: You don't.
Delayed doesn’t mean cancelled just yet. Still not upgrading to iOS 15 until the latter state is reached.
This would be the smart thing to do.
Great news and a small victory for the "screeching voices of the minority"!

My fear is they just want the iPhone 13 launch to happen without any bad press or having to launch it with 14.X.
Again with the "screeching voices of the minority", really?

And as to making the launch happen without bad press? I'm actually fine with that. :) I don't care if the reason for stopping it is that Apple is afraid of missing sales numbers; if that's what it takes for the right thing to happen, then I'm good with it!
You're either trolling or being rather hyperbolic. How many children were victimized over the past 15 years when Apple didn't have this in place? Was Apple's inaction, say 5 years ago, promoting the abuse of children?
Well played, and a great point to make!
I support the CSAM measures Apple proposed, but there were legitimate concerns with their implementation. I think many of the concerns were overblown, and the potential issues, in my opinion, did not outweigh the benefits; others have different opinions. The discussion is worth having. What Apple is doing is spending more time to assess the implementation and implications of this measure. Doing it correctly is important.
Thank you for being willing to discuss it even if you have reservations.
I’ve spent a few weeks trying to figure out how to cut ties completely with iCloud… maybe more. I love Apple for privacy, not because I want them to be police. I already have tax dollars paying for police.
I'm with you on that, and I too would like to cut ties with iCloud, especially for backups and for getting music into my iPhone.

Oh and I'd like to get that U2 album cover out of my iPhone. Wouldn't it be tragic if that album cover got flagged as kiddie porn all because of a hash?
Maybe they saw how upsetting it was for the people. Our voices have spoken!
Keep speaking. It's not over until the fat lady sings, and she's not even out of bed yet!
A Pyrrhic victory.

Don't worry, in short order this will come back and be presented in a different light, but with the same underlying concerns. Ostensibly a drop-in architecture whose search target can eventually be replaced with any signs of dissent, thought crime, the list goes on.
Thought crime: that's really where this goes, isn't it?
Plus, the toothpaste of broken trust can't simply be put back into the tube: they insisted we didn't understand it, that we were looking at it wrong, etc., etc.

How about an apology for the poor judgment and for insulting the intelligence of your user base?
Yep, this is actually more insulting than Microsoft's talking paperclip from a few years back.
The gaslighting failed miserably. And it is reported that Congress forced Apple to implement this.
🙄 <---- this is my surprised face at Congress being at the root of this.
They ought to roll out beta enablement of this exclusively to Congress, those sickos.
LOL, for sure! Oh and maybe for governors of the state of New York, and certain Hollywood people for good measure.
And Apple ought to examine how bipolar they are to go on a big PR privacy push, only to reverse it and then some in the same year. Their brand is a mess.
You're not wrong. Some repairs are definitely needed. I mean really, within the span of 2 weeks, Apple pushed me from "ready to spend about $6,000 on a new MacBook Pro AND new iPad" to "ummm.... it would cost me only about 30% of that if I just build another PC, and will a Samsung do everything that iPad will?"

And it turns out that yes, a Samsung WILL fit right into my music studio just as well as an iPad.

Apple has about 4 weeks to kill this iSpy thing before I start ordering new PC parts.
They’re still going to release it at some point so this is all moot.
That's why it's up to us to be on guard. Protecting your own rights is a full-time job. Well, if you live in a place that recognizes that you have rights. ;)
 
Wow.

So you’re good with the poster I replied to stating “child safety crap”? Hope you don’t have kids.
No. I hope you don't have kids. If you require Apple to spy on your children in order for you to parent, then you have already failed. But this isn't about our parenting skills, so I'll leave it at that.

There is also no point in discussing anything with you, because you are dead set on thinking that anyone who opposes this is in favor of child porn, which is completely false and insulting. Having a discussion with someone who plugs their ears and goes la la la la is a waste of everyone's time. If you are truly interested in understanding the big picture and why people are opposed to this, maybe go back and reread the comment section with an open mind, instead of responding with childish personal attacks (i.e. "what's wrong with you people," "hope you don't have kids," "I hope my children never come into contact with you"), which offer absolutely nothing to the discussion. Why not respond to our points and prove us wrong instead? But I'm sure you already knew that's how civil people debate, since you are the only great parent on this forum.
 
Ya know…I’ll take the risk to save even one child….
But it doesn't work that way.

Nobody can guarantee that this would save "even one child".

But it definitely would raise the risk of false accusation and false imprisonment, especially if certain people figure out how to use this against you. You know, people such as your political enemies, business rivals, or hell, even some guy who is just jealous of you because you have a pretty wife.

Our Founding Fathers opposed that and made it very difficult to imprison somebody, and for good reason. If false accusation and false imprisonment were acceptable, then pretty much anybody could have their enemies jailed just by making a phone call. Or maybe in this case, sending you a photo from a burner email account.

Can you guarantee that you've never pissed anybody off in all your years of delaying gratification? All it takes is one, and then poor old Gene has some 'splaining to do.

Even if he hasn't done what he's being accused of. Good luck defending yourself against your enemies, and good luck paying those attorney fees! :oops:
 
You're not wrong. Some repairs are definitely needed. I mean really, within the span of 2 weeks, Apple pushed me from "ready to spend about $6,000 on a new MacBook Pro AND new iPad" to "ummm.... it would cost me only about 30% of that if I just build another PC, and will a Samsung do everything that iPad will?"

And it turns out that yes, a Samsung WILL fit right into my music studio just as well as an iPad.

Apple has about 4 weeks to kill this iSpy thing before I start ordering new PC parts.

I've certainly been eyeing the Galaxy Tab S7 (or S7+) myself, and I use my iPad Pro 12.9 quite a bit. And the Pixel 6 is just around the corner too...
 


Apple has delayed the rollout of the Child Safety Features that it announced last month following negative feedback, the company has today announced.
Obviously child abusers aren't gonna put their filth on Apple devices when it's known Apple will catch them. So what's this really all about? Slippery slope.



The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Apple confirmed that feedback from customers, non-profit and advocacy groups, researchers, and others about the plans has prompted the delay to give the company time to make improvements. Apple issued the following statement about its decision:

Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees. Apple has since endeavored to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and various other documents, giving interviews with company executives, and more.

The suite of Child Safety Features was originally set to debut in the United States with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It is now unclear when Apple plans to roll out the "critically important" features, but the company still appears to be intent on releasing them.

Article Link: Apple Delays Rollout of Controversial Child Safety Features to Make Improvements
 
Exactly this, and as I mentioned in a previous post, catching these morons often helps law enforcement actually nail the scumbags who are creating this stuff.
No. It may help law enforcement ARREST them, but without due process, including the service of a warrant signed by a judge, it might actually fail at NAILING them, i.e., putting them in jail.

Or, as I've said above, it will ensnare people who didn't do the crime, such as the guy in one of the earlier posts who said he received pictures of male "junk" because somebody transposed a couple of numbers on a text message.

The bottom line is that the vast majority of the disturbed people who collect CSAM honestly don't think they're doing anything wrong. After all, "they're just pictures," so they don't make much of an effort to hide them.
I'm all for catching them and putting them away. But we have to do it right and we need to not be imprisoning innocent people, either accidentally or on purpose.
 
Obviously child abusers aren't gonna put their filth on Apple devices when it's known Apple will catch them. So what's this really all about? Slippery slope.
It’s not even a slippery slope anymore; it’s Apple jumping in head first. A slippery slope might suggest innocence and unintended consequences. When experts have already shown how bad this is and Apple remains firm about it, it can be deduced that Apple was fully intent on and committed to this.
 
No. It may help law enforcement ARREST them, but without due process, including the service of a warrant signed by a judge, it might actually fail at NAILING them, i.e., putting them in jail.

Or, as I've said above, it will ensnare people who didn't do the crime, such as the guy in one of the earlier posts who said he received pictures of male "junk" because somebody transposed a couple of numbers on a text message.


I'm all for catching them and putting them away. But we have to do it right and we need to not be imprisoning innocent people, either accidentally or on purpose.
Experts have already said that this won’t really do much for its advertised purpose, and yet a mass device-scanning system will be in place. Then it’s just a matter of using it for other purposes, because “why not?”
 
Apple thinks we don’t understand. WRONG! We fully understand. You are whoring for law enforcement by unconstitutionally searching our devices WITHOUT A SEARCH WARRANT FROM THE COURT. That’s the bottom line, regardless of your so-called good intentions “for the children”.
 
No. It may help law enforcement ARREST them, but without due process, including the service of a warrant signed by a judge, it might actually fail at NAILING them, i.e., putting them in jail.
Well, of course there's due process... We're talking about good old-fashioned detective and forensic work here. Obviously it's not as simple as finding a few photos on somebody's iPhone.... It's everything that happens after this that's important.

The discovery and reporting of photos is generally enough to get a warrant signed by a judge to then carry on the rest of the due process. In the case that I was most directly familiar with (which I explained in another post earlier in this thread), the person working for me who was found guilty of possession of child pornography was arrested and charged only after the usual due process. A warrant was obtained as a result of a statement from a co-worker in his full-time job who had observed CSAM on his PDA. That was enough to get a judge to sign a warrant, at which point they searched his home and found a lot more, and the case proceeded from there.

I'm not privy to the details of the other investigations that resulted from this, but the detectives involved told me that the information they found as a result of this possession case was instrumental in getting an arrest and conviction for another local person who was actually creating CSAM — making videos of kids in his basement.

Or, as I've said above, it will ensnare people who didn't do the crime, such as the guy in one of the earlier posts who said he received pictures of male "junk" because somebody transposed a couple of numbers on a text message.
While that's a fair statement, I don't think Apple's proposed implementation of CSAM Detection is very likely to ensnare people who aren't actually collecting CSAM.

Firstly, it only runs against photos that are uploaded to iCloud Photo Library. It doesn't scan Messages at all, so nobody is going to get caught "accidentally" receiving CSAM. It also has a threshold (supposedly about 30 photos) before anybody even knows about the flagged photos.

I don't think anybody who has more than 30 photos of known CSAM in their Photos app can honestly claim that they got there "accidentally." Nothing gets added to your photo library unless you specifically add it — either by taking it with your iPhone camera or manually saving it in there from another app.

It's also fair to say that law enforcement is still going to need to follow due process — a CSAM report from Apple may be enough to get a search warrant (like it did in the case I mentioned above), but they're not going to go and arrest anybody on the basis of that alone. If the cops don't turn anything else up as a result of a warrant, it's going to be pretty hard to build a case. Chances are pretty good, however, that anybody who has 30+ images of known CSAM in their photo library probably has a lot more elsewhere...
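As a rough illustration of the threshold behaviour described above (a single match does nothing on its own; only after roughly 30 matches would an account even be surfaced for review), here is a hedged Swift sketch. Apple's design enforces the threshold cryptographically with safety vouchers and threshold secret sharing; this plain counter is only meant to show the control flow, and the type name and numbers are invented for illustration.

```swift
// Plain-counting stand-in for the cryptographic threshold scheme described
// in Apple's technical summary. Nothing here is Apple API; it is a sketch.
struct AccountScanState {
    let reviewThreshold: Int           // the thread cites roughly 30
    private(set) var matchCount = 0

    // Record the result of scanning one photo at iCloud upload time.
    // Returns true only once the threshold is crossed, i.e. the point at
    // which the account would be surfaced for human review.
    mutating func recordScan(matchedKnownHash: Bool) -> Bool {
        if matchedKnownHash { matchCount += 1 }
        return matchCount >= reviewThreshold
    }
}

var state = AccountScanState(reviewThreshold: 30)
let scanResults = [false, true, false, true]   // hypothetical per-photo results
for matched in scanResults {
    let flaggedForReview = state.recordScan(matchedKnownHash: matched)
    print("matches so far: \(state.matchCount), flagged: \(flaggedForReview)")
}
```

The point of the threshold, as the post explains, is that a handful of matches never reaches a human reviewer on its own.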
 
Do you like the idea of people being falsely accused of being child predators? Maybe it could be you. Or your wife or husband. Or one of your children. Because that's the possibility we're talking about here if this were to go through.
The chances of this happening are astronomically tiny: less than 1 in 1 trillion per year per account, according to Apple. At this stage we have to accept their numbers. People are getting freaked out (fair enough) and, because of this, are fearing the worst and creating a little bit of mass hysteria. (A back-of-the-envelope reading of that figure is sketched below, after this post.)

Please don't succumb to the Tyranny of (only) two choices. You're smarter than that, right? You do realize that there are ALWAYS more choices than just the two that you've given
Good point. Unfortunately this is pretty much the entire USA in 2021.

I'm with you on that, and I too would like to cut ties with iCloud, especially for backups and for getting music into my iPhone.
It's not that hard to do.

Oh and I'd like to get that U2 album cover out of my iPhone. Wouldn't it be tragic if that album cover got flagged as kiddie porn all because of a hash?
Nice joke.

Keep speaking. It's not over until the fat lady sings, and she's not even out of bed yet!
Moaning on a forum does jack all. Gotta contact Apple or a politician etc.

LOL, for sure! Oh and maybe for governors of the state of New York, and certain Hollywood people for good measure.
Absolutely.

Apple has about 4 weeks to kill this iSpy thing before I start ordering new PC parts.
Why wait? Just do it. This isn't going to change. Apple aren't doing this for tin foil hat reasons; it's about optics. So I recommend you do it if you feel so strongly about it.
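On the "1 in 1 trillion per year per account" figure cited above: taking Apple's number at face value, here is a back-of-the-envelope sketch of what it implies at scale. The account count is an assumed round number, not a published Apple statistic.

```swift
// Reading Apple's claimed false-flag rate at face value.
let perAccountFalseFlagRate = 1.0 / 1_000_000_000_000   // 1e-12 per account per year
let assumedAccounts = 1_000_000_000.0                    // ~1e9 accounts (assumption)
let expectedFalseFlagsPerYear = perAccountFalseFlagRate * assumedAccounts
print(expectedFalseFlagsPerYear)   // ≈ 0.001, i.e. roughly one falsely flagged account per 1,000 years
```

Whether one trusts that input number is, of course, the real argument in this thread.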
 
And Apple isn’t the only company scanning for CSAM; Google does it even more aggressively. If this were a genuine issue, we would have heard of more people being “busted” this way after uploading such images to their Google Drive accounts, and it doesn’t even need to involve Apple devices.

Like I said - just a whole load of concern trolling.
I spent $2000 on a new Linux laptop because of this. Bye bye Apple!
 