I think you need to re-think this as well.

If your children grew up in the era of the internet and camera phones, imagine someone had used a phone to take indecent pictures of them when they were younger and uploaded those pictures to some cloud service. Wouldn't you have wanted some type of system in place to prevent that from happening, and to stop the pictures from being distributed around?

Or are you still of the view that even if a system was in place to prevent that from happening, you would not want it because it intrudes on your privacy?
What on earth are you going on about? IT DOESN'T DO THAT!
 
What a headline: "EFF Pressures Apple ..."
Regardless of how one feels about Apple's plan, how can the EFF "pressure" Apple? Do this, or what?
Poor journalism, clickbait MR.
It's easy (at least disproportionately easy) to pressure Apple by keeping a subject in the headlines and people talking about it when Apple would prefer it just go away.
 
George Orwell’s “1984” enough said 😡!!! It’s incredible Apple ever considered this let alone attempts to execute it. Seriously! No one at Apple ever read “1984”? Apple even made a commercial related to this book! 😠 REMEMBER APPLE! The Super Bowl ad. Stop this already, stop hurting the trust your users had in you about being a CHAMPION of privacy. Stop acting like BiG BROTHER!!!
 

This is the only remaining mystery, since we by now understand the technology and have seen feedback from interest groups and academia.

We've discussed this point in many of the other threads but are, obviously, hampered by our lack of inside data.

Below is a summary of the prevailing hypothesis, with a dash of Occam's razor applied as we go along:

1 - The assumption that Apple has suddenly turned into a company willing to throw years of consumer trust overboard for a noble social cause sits very poorly with the evidence, which shows Apple to be predominantly extraordinarily cautious, commercially cold-blooded, and analytical. Therefore we accept that Apple has taken this initiative after careful commercial and legal evaluation.
2 - Once we accept the premise from 1, we must consider possible reasons for the action:
2.1 Apple projects increased revenues and sales from this initiative. Given the almost universal negative public feedback and Apple's lack of interest in the area to date, it's unlikely that's the reason.
2.2 A more likely explanation is that Apple is facing a potential loss from its inaction - as one of the few IT companies doing little toward removing CSAM from its services. We can't gauge the size, probability, or timetable of the loss from inaction, but given that Apple has taken action, we hypothesize it's substantial and likely. Let's define that weighted loss as WLoss(inaction).
2.3 @Nuvi is entirely correct: there is certainly a loss associated with the action taken. Let's define that weighted loss as WLoss(action).
2.4 Apple's analysis must have shown WLoss(inaction) > WLoss(action). CSAM scanning then gets implemented, to minimize the projected loss.
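The reasoning in 2.1-2.4 boils down to a simple expected-loss comparison. Here's a minimal sketch in Python, with entirely made-up probabilities and magnitudes - as noted, we have no inside data, so these numbers are illustrative placeholders only:

```python
def weighted_loss(probability: float, magnitude: float) -> float:
    """Expected (weighted) loss: probability of the outcome times its cost."""
    return probability * magnitude

# Assumed illustrative inputs (arbitrary units, not Apple's figures):
w_loss_inaction = weighted_loss(probability=0.6, magnitude=10_000)  # e.g. legal/regulatory exposure
w_loss_action = weighted_loss(probability=0.9, magnitude=3_000)     # e.g. reputational damage

# Per 2.4: scanning gets implemented iff WLoss(inaction) > WLoss(action)
implement_scanning = w_loss_inaction > w_loss_action
```

With these placeholder inputs, 0.6 × 10,000 = 6,000 exceeds 0.9 × 3,000 = 2,700, so the hypothesized analysis comes out in favor of implementing.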

Without more information, we can't refine our analysis further. So, here we are 🥳.

I wonder how much Apple has actually gamed this out, and if we're on one of their playbooks or into unknown territory.

Please, can someone sue already - so we can get to discovery and get at those emails 😂

Apple said in their initial announcement that it was because, in the name of privacy, they had contributed so little to catching CSAM compared to the other "FAANG" companies. They thought this clever approach would maintain privacy while jumping on the bandwagon of catching CSAM being uploaded to iCloud.

They were so proud of their methodology they made it this big public announcement which ultimately had unintended consequences (making this issue more public and undermining their privacy narrative).

Their methodology is indeed clever, but their attitude (pride?) about it likely blinded them to the potential pushback against shifting some of the responsibility onto the physical devices and creating a workaround that maintains encryption while still allowing scanning. Either that, or they thought this might be an issue, which is why they announced it prior to the iOS 15 release, to gauge how much pushback really existed.
 
Apple has made this too public. They should've enabled iCloud scanning like all of the other companies do, made a subtle quiet announcement about it, and left it at that. THEN they would've caught a lot more.
They would have caught a lot more crap, and they might have even gotten sued in numerous countries. No no, Apple wouldn't want THAT, because that's even worse than how this drama is playing out right now.
 
The problem is that Apple may act in good faith, but when they are compelled by a court order, they will no longer be acting in good faith, but under orders.

Do you (not you specifically, you in general) honestly think Apple will pull out of China if China wants them to scan for something? Ignoring of course that China might already be doing it since Apple already caved and is housing Chinese iCloud servers inside China?

If the fact that Apple already caved to China isn't evidence enough, just wait until Russia does it. Or the US. Or Australia or Canada or the EU.

Apple won't withstand the pressure and it will have to cave.

Under US Law, the Government cannot hijack a company to build something, like a back door.
However, it is already built, it can be repurposed. Though the law there is a bit grey too.
 
You honestly believe private companies should start monitoring their users? You want an Apple Police, Microsoft Police, Google Police, etc., gathering info for some other private organisation so they can use it for their own purposes? Shouldn't we leave hunting down criminals to governments and law enforcement agencies, not to shady groups that are not governed by the same laws as law enforcement?
You act like this isn't already being done; so many are misinformed on this subject it is scary. These companies have to report any known images stored on their cloud servers to the FBI.
 
I think people are aware. And the biggest offenders of child abuse and sex trafficking rings are some famous people such as rich celebrities, actors, producers, politicians etc. Just recently I read about this.

"Oprah Winfrey said on Friday that she was cutting ties with a documentary centered on women who have accused the music mogul Russell Simmons of sexual misconduct. The untitled film, scheduled to have its premiere this month at the Sundance Film Festival, focuses primarily on the executive Drew Dixon, who accused Mr. Simmons of raping her, an accusation Mr. Simmons has repeatedly denied."

What was Apple's reaction? "Apple declined to comment."

Of course this is not child abuse related, but it is disgusting and hypocritical of Apple to preach about protecting children and women while taking no action when there is clear indication that these highly prominent public figures are caught up in such stories.
"The Oprah" is NOT a court of law. This could be defamatory. That's okay, she can afford it. Probably.

So let's allow this to play out in court. It sounds as though Mr/Ms Drew should press charges, and let's see if a DA is willing to let a Grand Jury have a look at it.

Until that happens, nobody should make any assumptions one way or another. And Apple SHOULD decline to comment, especially if they might be called upon to provide testimony.
 
Well, if it’s the law, then they’ll have no choice.




So if I understand this correctly, when forced by the Government (EU) to start recording specific user IP addresses, something they had the capability of doing, they complied.

Note: I use this service.
 
Tanks in Tiananmen Square for one thing.

Hong Kong protests.

The Taiwanese flag.

Rubber duck sculptures. (used to relate to tanks)

Candle icons. (Used to mourn dead protesters)


Those are some of the things that are censored.
So you have no idea what this tech is or does... It can only compare images to known images that have already been hashed... It isn't some AI looking for this or that inside a photo...
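For illustration, here's a minimal sketch of the known-image matching that post describes, simplified to an exact SHA-256 digest. (Apple's actual system uses NeuralHash, a perceptual hash robust to resizing and recompression, plus a private set intersection protocol - none of which is modeled here; the database bytes are placeholders.)

```python
import hashlib

# Database of hashes of already-identified images (placeholder bytes).
known_hashes = {
    hashlib.sha256(b"previously-identified-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """True only if the image's hash is already in the database.
    A newly taken photo, whatever it depicts, cannot match,
    because nothing in this check looks at the photo's content."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes
```

The point of the sketch: matching is a lookup against a fixed list, not a classifier inspecting what's in the picture.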
 
Really? EVERYTHING? So are you advocating eliminating the 4th Amendment in the US? Cameras in your house on 24/7 so we can do "EVERYTHING possible to stop this kind of abuse"? Microphones active there 24/7? Recordings 24/7 to be able to verify?

No curtains in your home, after all, if you don't have something to hide, why would you need curtains? Or doors on the bathroom?
Oooh, well done, well done.

But there's a problem with no curtains. If children were walking by when you were inside getting dressed, they'd just send you straight to jail anyway.

And that's what this is all about anyhow. Locking more people up without first having to do the hard work required to actually convict them of a crime.
 
So you have no idea what this tech is or does... It can only compare images to known images that have already been hashed... It isn't some AI looking for this or that inside a photo...

No, the problem is you trust that no other country with a billion potential customers will force Apple to include other hashes in their system and turn it on in their country in order to sell their devices there.
 
So… too many customers turned off automatic iOS updates? Customers stopped renewing iCloud subscriptions? Are they afraid of the sales rates of new iPhones with a scanning option that violates privacy?

And now they are planning to ”delay” this until people update iOS, and then they turn the scanning on as ”fixed”?
 
Photos you receive don't get scanned, photos that merely look like a bad photo don't get you in trouble, and one photo isn't enough to raise an alarm. Your worries are not based in reality.

Can’t say yeah or nay.
On one hand Apple’s response says scanned and hash matched on device then uploaded.
On another Apple’s response was scanned and hash matched during the upload process.

Apple has been less than clear on just how this process executes and when.

That is one of the “asked for” items; just how does this process really work? How about we get some independent peer review or an independent audit?
 
Well, this debate has certainly opened my eyes to the attitudes of many members in this forum: they do not want filth, they dislike filth, but only as long as it does not affect them personally.

In my opinion it's an appalling attitude. Yes, I know I will get many down votes, but gauging from members' attitudes on the issue, that's to be expected.

Protection of children is paramount, and if CSAM scanning goes some way toward stopping child filth images being distributed, or even to the point of catching those involved in distributing them, then I am all for it, however Apple wants to implement it.

I accept that people are allowed their views and opinions on this issue, but some of the views and opinions expressed by some members in here are appalling in my opinion, and therefore I no longer wish to debate with people in here because they disgust me. Enjoy your debating.
 
Well, this debate has certainly opened my eyes to the attitudes of many members in this forum: they do not want filth, they dislike filth, but only as long as it does not affect them personally.

In my opinion it's an appalling attitude. Yes, I know I will get many down votes, but gauging from members' attitudes on the issue, that's to be expected.

Protection of children is paramount, and if CSAM scanning goes some way toward stopping child filth images being distributed, or even to the point of catching those involved in distributing them, then I am all for it, however Apple wants to implement it.

I accept that people are allowed their views and opinions on this issue, but some of the views and opinions expressed by some members in here are appalling in my opinion, and therefore I no longer wish to debate with people in here because they disgust me. Enjoy your debating.

Hard to argue .... with someone who has no idea what's going on. Are you done yet?
 
Tim Cook is acting like the bad guy from Titanic who used the little girl to get into the lifeboat 😬 That's what I feel Apple is trying to do: use children as an excuse when in reality they want to spy on us for other reasons.
Yep. Look no further than QAnon to see how concern for the welfare of children can be weaponized for nefarious purposes. “Think about the children” has long been a Trojan horse to either repress people (the argument of choice for homophobes in the late ‘90s) or to increase surveillance.

It’s interesting to note how many of the people who thought tech companies were becoming too powerful are the ones all gung-ho for this incredibly creepy “feature.”
 
What, over 90 organizations (not just 90 people) come out against this, and you say you don't get why? Why not do a little research into why?

But here, I'll show you how this could have disastrous consequences for many children:

1. False accusation against a minor who receives or takes a photo that "looks like" a bad photo enough to get a hash match. Now we get into a court case where the government is going after a minor (or the minor's parents) over a photo?

2. A parent gets falsely accused of taking or receiving a photo that matches a hash. Now what? We go to court, or, based on what some people are saying here, maybe we just skip the court phase, put that parent directly in jail, and send all his or her kids off to state care.

There are probably a thousand other ways where this hurts children, but that's just the first two that I could think of.

Well, these 90 organisations probably do not trust Apple or perhaps any other tech company much. I am not naive either, but Apple’s proposed solution, according to them, would only trigger an alarm when there are over 30 of these hashed pictures on any given device. And then these pictures will be reviewed by Apple staff first. I also hope these pictures can then be traced back to their source? There are so many ways to frame somebody that this does not seem like a particularly easy one to me. 🤷🏻‍♂️
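A sketch of the threshold behavior described in that reply. The "over 30" figure comes from Apple's public statements as relayed above; the function name and the rest are illustrative only, and the real system enforces this cryptographically (threshold secret sharing), not with a simple counter:

```python
MATCH_THRESHOLD = 30  # reported alert threshold, per Apple's statements

def should_trigger_review(match_count: int) -> bool:
    """Human review fires only once the number of matched images
    on an account exceeds the threshold; below it, no alarm is raised
    and (per Apple) individual matches are not revealed."""
    return match_count > MATCH_THRESHOLD
```

So under this description, a single false hash match - or even 30 of them - would not, by itself, flag an account.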
 
"The Oprah" is NOT a court of law. This could be defamatory. That's okay, she can afford it. Probably.

So let's allow this to play out in court. It sounds as though Mr/Ms Drew should press charges, and let's see if a DA is willing to let a Grand Jury have a look at it.

Until that happens, nobody should make any assumptions one way or another. And Apple SHOULD decline to comment, especially if they might be called upon to provide testimony.
That's not the point here. I'm not talking about the court of law. I'm talking about Tim's Apple being hypocritical when they glorify Oprah as if she's the best thing since sliced bread and they refuse to cut ties with her even after her very well documented connections to Weinstein. Why does Apple insist on glorifying her and doing business with her on Apple TV when her "activism" and morals are highly questionable as her own actions show?
 
absolutely!
It's mind-blowing how people are not prepared to do EVERYTHING possible to stop this kind of abuse. People need to get it into their heads that if they are online in any way, there is no such thing as complete privacy. Someone will be able to get into whatever you think you are protecting. Your location, everything... it's all tracked. You're fighting for an illusion.
So your point is that, because we’re already being surveilled 24/7, we should fight LESS against increased surveillance, because we’ve already lost? Nice.

Surely if we already have no privacy, child abuse has already dropped to zero, right?
 
Can’t say yeah or nay.
On one hand Apple’s response says scanned and hash matched on device then uploaded.
On another Apple’s response was scanned and hash matched during the upload process.

Apple has been less than clear on just how this process executes and when.

That is one of the “asked for” items; just how does this process really work? How about we get some independent peer review or an independent audit?
The photos are hashed and uploaded, which constitutes the upload process.
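As a rough sketch of that ordering - the match result is computed on-device and travels with the photo, so hashing happens as part of the upload rather than as a separate server-side scan. Function names are hypothetical, and a plain SHA-256 stands in for NeuralHash plus the private set intersection step, which this does not model:

```python
import hashlib

def compute_safety_voucher(image_bytes: bytes) -> str:
    # On-device step: stand-in for NeuralHash + blinded-match voucher creation.
    return hashlib.sha256(image_bytes).hexdigest()

def upload_photo(image_bytes: bytes) -> dict:
    # The voucher is attached before transfer, so "scanned during the
    # upload process" and "matched on device then uploaded" describe
    # the same pipeline from different ends.
    voucher = compute_safety_voucher(image_bytes)
    return {"payload": image_bytes, "voucher": voucher}
```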
 
So… too many customers turned off automatic iOS updates? Customers stopped renewing iCloud subscriptions? Are they afraid of the sales rates of new iPhones with a scanning option that violates privacy?

And now they are planning to ”delay” this until people update iOS, and then they turn the scanning on as ”fixed”?

You do realize this was intended to roll out to iOS, iPadOS, and macOS.
 