For the last time, what is on my phone is private and requires a warrant to search. That is the law today. Apple searching my phone and reporting possible criminal findings to a group that reports to law enforcement is problematic, likely a warrant work-around, and likely unconstitutional. If this happens, it will certainly be challenged in court. Several legal groups, including the ACLU, have expressed this view.

My opinion aside, if you feel they are exaggerating and basically wrong, you are entitled to your opinion. I happen to agree with the opinion that it is very likely illegal.
Just turn off iCloud sync and you’re all set.
 
Your data is not (barring outright nefarious phishing and scams) transacted on an individual user basis. Even with a device ID or an email address, these are hashed and transacted in MASSIVE swathes of aggregate data based on very broad attributes. Individuals' data is worthless; PEOPLE'S data is valuable. This girl's data on its own has no value. Okay, maybe fractions of a penny. (Also, email behavior is generally not transacted and is only used for first-party analysis of how those actual emails themselves are performing.)
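To make the "hashed and aggregated" point concrete, here is a toy sketch of the kind of pipeline being described (the field names, cohort buckets, and hashing scheme are all invented for illustration; no real ad platform's internals are implied):

```python
import hashlib
from collections import Counter

def hashed_id(device_id: str) -> str:
    """One-way hash so the raw device ID never travels with the data."""
    return hashlib.sha256(device_id.encode()).hexdigest()[:16]

# Individual rows: (hashed ID, very broad attribute bucket).
events = [
    (hashed_id("device-001"), "podcasts/TX"),
    (hashed_id("device-002"), "podcasts/TX"),
    (hashed_id("device-003"), "cooking/CA"),
]

# What actually gets transacted: aggregate counts per broad cohort,
# not any single user's history.
cohorts = Counter(bucket for _, bucket in events)
print(cohorts)  # e.g. Counter({'podcasts/TX': 2, 'cooking/CA': 1})
```

The individual row (one hashed ID in one bucket) is near-worthless on its own; only the cohort counts carry value, which is the point being made above.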

Yes, your data is being transacted (including by Apple!) but it's nothing like how Apple portrays it. Google doesn't target you on an individual basis. Meta doesn't target you on an individual basis. Nor does Amazon or any other ad platform you've heard of.

Furthermore, ads are a reality that aren't going away in many cases where we receive things for "free." I live in Texas and like podcasts. Since I have opted out from Apple getting access to my data (Apple requires an opt-out for itself, but requires developers to get an opt-in) I receive ads in Spanish. A lot. This is because Apple can't transact my info with audio ad networks so they just guess based on the fact that I'm in Texas. It's... kind of annoying.
 
It doesn’t mean that proper Authorities with proper reasons shouldn’t have some level of knowledge.

That's the rub, isn't it? Who gets to decide that? When people want to protest a government action, is that a 'proper reason'? What about when government says it is? When people want to stand up against communism, or locking up political dissidents, is that a 'proper reason'? When it's parents working to actually PROTECT their children, but some agency decides that the parents' actions are not in the 'best interest' of the child, does that now become a 'proper reason'?

Only fools think that using such technology and allowing such access won't eventually be abused by the very people who are supposed to be 'protecting' the individual.
 
Maybe I'm misunderstanding how Apple's tracking feature works, but it seems to me a more accurate ad would be Ellie going up to the microphone and asking the bidders to leave. Then, half the bidders exit out altogether, while the other half exit into another room. Ellie has a false sense of privacy because she thinks she's cleared out all the bidders, but half of them are in the next room continuing to bid on her data.

Well done ad, though, even if disingenuous.
 
So I wouldn't count on it being canceled just yet. Just "improved".

In fact, as of 21 April 2022 one sees:
“CSAM detection features are also being added to Apple apps such as Siri, Spotlight and Safari Search. These apps will intervene if the user searches for queries related to child exploitation.”

see https://www.siliconrepublic.com/enterprise/apple-csam-child-safety-sexual-abuse-material-uk-privacy

Not to mention the recent stories about the EU and requiring it.

I’m shocked this isn’t being talked about. Safari checking everything you search for is a major privacy invasion IMHO, it’s bypassing any https encryption between you and the search engine and also any organisational security (like a company VPN).

While I’m sure people will try to defend it saying it only looks for certain keywords, how can we know what keywords are safe? What happens if someone mis-types something, or someone searches for something unsafe as a ”joke” on someone else’s device? What if it’s a child looking for help, is a message telling them searching for those terms is “problematic“ (Apple’s own words) helpful? While it’s only focused on abuse terms now, what happens when governments start wanting other terms to be monitored? We know from China that Apple are more than happy to oblige governments.

This is exactly the slippery slope everyone was concerned about when the CSAM scanning was first introduced. Having our own equipment monitoring us is absolutely not the right way to do this.
 
That's the rub, isn't it? Who gets to decide that? When people want to protest a government action, is that a 'proper reason'? What about when government says it is? When people want to stand up against communism, or locking up political dissidents, is that a 'proper reason'? When it's parents working to actually PROTECT their children, but some agency decides that the parents' actions are not in the 'best interest' of the child, does that now become a 'proper reason'?

Only fools think that using such technology and allowing such access won't eventually be abused by the very people who are supposed to be 'protecting' the individual.
Yep. That’s the Rub for sure. And most of the facts are hidden from us too. Thus the reason we need whistleblowers such as Snowden & Assange.
 
You cannot use the wider internet without giving up some data. Macrumors will have ad profiles, however loose, on all of us so that we see relevant topics. I doubt very much any US readers can see the 'RHS Chelsea Flower Show' adverts that pop up in the UK for me.

It's impossible to avoid, even with VPNs and the like. Sites will always have some sort of data on you. You could log in here using a VPN and a privacy browser, but Macrumors knows it's you because you logged in using the same user account, and they will keep track of what articles you engage with. I am not critical of this methodology; Macrumors need to pay the bills and they do not charge users.
You seem to be unaware that you can go into your account settings on MacRumors and select "Paid Membership" - pay them upfront, and get zero ads. That's why some users have "Contributor" or "MacRumors Demi-{god,goddess}" under their names. Highly recommended.

What I do not like is Apple's monetisation of people's alleged privacy. Apple still have access to your customer profile and will send you emails suggesting apps you might like.
In my experience, Apple only bothers sending such email if you leave the boxes checked saying you welcome marketing email from Apple.
 
Look up how CSAM works. It compares your pictures (in a hashed form) to known images in a database. The match has to be identical. So your personal pics would never be flagged. Plenty of people have pics of their kids in the bath, etc. They wouldn't be flagged because they aren't in the hashed database.
Regardless, I'd think the photos would be looked at before the police knocked down their door (and they'd see they're not relevant).
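For anyone curious what "compares your pictures in a hashed form to known images" means, here is a deliberately simplified sketch using an ordinary cryptographic hash (Apple's actual proposal used a perceptual hash, NeuralHash, plus a threshold/secret-sharing layer, none of which this toy reproduces; the placeholder byte strings stand in for image files):

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
known_hashes = {
    hashlib.sha256(b"known-image-A").hexdigest(),
    hashlib.sha256(b"known-image-B").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Flag only on an exact hash match against the database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(is_flagged(b"known-image-A"))   # True: identical to a known image
print(is_flagged(b"kid-in-bathtub"))  # False: not in the database, never flagged
```

One caveat: a cryptographic hash like SHA-256 changes completely if a single byte changes, which is exactly why the real system used a perceptual hash tolerant to resizing and recompression; that tolerance is also where the false-match concerns in this thread come from.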
 
Regardless, I'd think the photos would be looked at before the police knocked down their door (and they'd see they're not relevant).

True, however, that was one of the concerns: why is Apple inserting itself in the middle instead of letting some system like NCMEC or ICMEC do it?

The only realistic answer was to keep confidential how good or bad their matching system really is.
Most likely bad.
 
Yes it is insidious, but let's not lose sight of the fact that it is not evil to prevent child pornography.
I don't buy the "ends justify the means"-type arguments. I don't believe that private corporations setting themselves up as law enforcers through surveillance of their customers is a worthy or desirable situation. By the same reasoning, one could argue that ExxonMobil should be allowed to set up roadblocks and checkpoints to check for drunk drivers because "stopping drunk driving is a worthwhile goal", or maybe Coca Cola could bring a team of drug dogs to your office to make sure there isn't any illegal drug activity because "illicit drugs are bad". The problem is not in the stated goal (i.e. help curtail child porn), the problem is in a private corporation getting into law enforcement. Since they are not government, they are not subject to the Constitution, hence all of our normal protections against unlawful search and seizure are moot. If we allow corporations to become a shadow police state, we've got much larger problems than child porn.
 
I don't buy the "ends justify the means"-type arguments. I don't believe that private corporations setting themselves up as law enforcers through surveillance of their customers is a worthy or desirable situation. By the same reasoning, one could argue that ExxonMobil should be allowed to set up roadblocks and checkpoints to check for drunk drivers because "stopping drunk driving is a worthwhile goal", or maybe Coca Cola could bring a team of drug dogs to your office to make sure there isn't any illegal drug activity because "illicit drugs are bad". The problem is not in the stated goal (i.e. help curtail child porn), the problem is in a private corporation getting into law enforcement. Since they are not government, they are not subject to the Constitution, hence all of our normal protections against unlawful search and seizure are moot. If we allow corporations to become a shadow police state, we've got much larger problems than child porn.
In the same vein as color printers not copying money? Private corporations manufacturing printers are already involved in law enforcement.
 
In principle?
The currency non-copying was a global cooperative effort between governments and tech, covering both currency design and scanning technology.
What Apple proposed was strictly a single corporate initiative.
It seems this type of thing will be commonplace. Either Apple scans your phone or it scans iCloud. Either way, private companies are doing the work of law enforcement.
 
We don’t know where the line for choice will be, or whether Android will have the same regulations.

Right now we know most are doing the scanning, if they scan at all, in email and cloud. Apple so far is the only one trying on-device.
So far not a single professional group has come out in favor of Apple's design.

So far we have nothing to show this is being required.
 
Right now we know most are doing the scanning, if they scan at all, in email and cloud. Apple so far is the only one trying on-device.
So far not a single professional group has come out in favor of Apple's design.

So far we have nothing to show this is being required.
The EU has some pending legislation (according to a post by MR editors). But it seems the handwriting is on the wall.
 
I don't buy the "ends justify the means"-type arguments. I don't believe that private corporations setting themselves up as law enforcers through surveillance of their customers is a worthy or desirable situation. By the same reasoning, one could argue that ExxonMobil should be allowed to set up roadblocks and checkpoints to check for drunk drivers because "stopping drunk driving is a worthwhile goal", or maybe Coca Cola could bring a team of drug dogs to your office to make sure there isn't any illegal drug activity because "illicit drugs are bad". The problem is not in the stated goal (i.e. help curtail child porn), the problem is in a private corporation getting into law enforcement. Since they are not government, they are not subject to the Constitution, hence all of our normal protections against unlawful search and seizure are moot. If we allow corporations to become a shadow police state, we've got much larger problems than child porn.

First, your examples are out there. ExxonMobil doesn't control the roads, and it does not have any responsibility for what goes on those roads; they have no duty to mitigate potential road crimes. Coca Cola doesn't control my office, and certainly doesn't have any responsibility for what happens in my office; they have no duty to stop consumption of drugs in my office. But Apple does control its iCloud servers, and to some degree Apple also controls the software running on all iPhones. Moreover, Apple can be responsible (and can be held partially legally liable) for crimes that occur on the iCloud servers or crimes that are facilitated by iPhones. Apple might have a duty to mitigate those harms.

Second, it's not binary. I agree that "the ends justify the means" is not by itself a valid reason for anything. I did not suggest it was. But it is one factor among many. So in this case, I am not saying Apple is free to do anything at all to help curtail child porn. I was responding to a comment that said Apple's actions are evil. But in this case Apple is not evil, they are in fact trying to prevent evil. That is one factor to consider.
 
Currently, photos are encrypted on iCloud, but Apple has access to the key. They can respond to warrants to access your iCloud data. I think the CSAM scanning will be a requirement once Apple moves to end to end encryption. If they don't have a key, they need a way to prevent child abuse materials from being stored on their servers.
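The key-custody distinction being described can be sketched with a deliberately toy cipher (XOR, which is not real cryptography; the point is only who holds the key, and all names here are invented for illustration):

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric 'encryption': XOR with a key of equal length."""
    return bytes(d ^ k for d, k in zip(data, key))

photo = b"holiday.jpg bytes"
key = secrets.token_bytes(len(photo))
ciphertext = xor_cipher(photo, key)

# Today (per the post): the server stores the ciphertext AND the key,
# so it can decrypt in response to a warrant.
server_can_read = xor_cipher(ciphertext, key) == photo
print(server_can_read)  # True while the server holds the key

# Under end-to-end encryption the key stays on the device; the server
# holds only the ciphertext and cannot decrypt it, warrant or not -
# which is why any CSAM check would have to move before upload.
```

That last comment is the trade-off the post speculates about: once the server can no longer read anything, the only place left to scan is the client.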
 
Currently, photos are encrypted on iCloud, but Apple has access to the key. They can respond to warrants to access your iCloud data. I think the CSAM scanning will be a requirement once Apple moves to end to end encryption. If they don't have a key, they need a way to prevent child abuse materials from being stored on their servers.

Has there been any change in Apple's stance on this?
Last I saw (during the CSAM mess) was that Apple was not pursuing E2EE.

If Apple does, this could get really interesting.
 
Has there been any change in Apple's stance on this?
Last I saw (during the CSAM mess) was that Apple was not pursuing E2EE.

If Apple does, this could get really interesting.
No. Just speculation on my part. I don't see the point of Apple risking the backlash without an obvious consumer benefit as a tradeoff.
 