Turning off iCloud to disable it is just a policy decision. The problem is not solved. We are merely given an "illusion" of remaining in control, while in fact they are taking away our actual privacy keys.

The core OS now contains a backdoor (a new door, in any case); policy "for now" allows you to flip the scan button on or off. Such policy can change overnight, and the door is already built for it, not to mention the precedent it sets for further expansion.

It always starts with an inch.

Privacy is not secrecy, privacy is not hiding; privacy is a pillar of your individual freedom. There can be no good compromise when someone else (not you) holds the keys to your privacy. It doesn't matter at all what they say it is being used for. Apple, by building this backdoor (call it new doors if you prefer), is holding the keys, not you. All you will be able to do from iOS 15 onwards is open or close those new doors. You will never be able to lock this new door, in case you were wondering.

Apple made a new door you didn't ask for, unlocked those doors for your convenience and told you: "Don't worry, as long as you keep this new door closed (not locked, as you can't be trusted with actual privacy), your privacy remains the same as always. All we did was take your privacy keys (the ability to remain private from us), but we remain committed to allowing you (for now) to close/open 'our new' doors (for scanning on your property, with you paying for it) so you can continue to feel as safe and private as always."

In other words: we, Apple, can bring some new furniture into your living space anytime our policy changes, and we may also invite some guests you don't know, all to keep you safe and more private than ever. For the time being nothing should change for you; those iCloud doors are something you are already used to having either closed or open. All we did was take away your ability to remain private from us (to shut us out), but we will use this to add new features we otherwise would not be able to add.

As obvious as all of this is to the "screeching voices" some of us have, it sounds like unreasonable paranoia to the majority. Two decades of being committed to Apple, a family of six trusting them with our wallets and our privacy… lost forever. Even if they reverse their announced policy, never again can they be seen as champions of privacy. This is not something to be happy about. I feel that we, the users, all of us, even those who don't see the threat at this time (or ever), have lost much with it.

I have very little hope for this being reversed. If it is reversed it will be a big win for all users… unfortunately, trust lost cannot be restored; the damage is already done. But it would still be very helpful and would be seen as a step back towards user privacy.
 
I think you are overthinking this.

The Apple representative will look at the images from the safety vouchers, and if at least one of them looks to them like CP as defined by federal law, it will be reported.

If Apple does this in good faith, they gain limited immunity from civil and criminal charges in any federal or state court.
I would not want to be the Apple rep that has to review.
 
Thanks for the links @Mac4Mat. TechCrunch is an interesting link I hadn't seen yet, and the piece you listed from PatentlyApple is an absolute first in my own observations. I'm highly amused reading the "screeching voice" of those wanting an authoritarian government in the article below (apparently the writer from PatentlyApple supports warrantless searches):


I'm Gen X, who generally has little in common with Gen Z or the boomers, but it seems like Apple has finally drop-kicked an issue out there which absolutely unites and unifies people in their common disdain and dislike of big tech acting like Big Brother. Getting rid of any opinion that differs from your own, deplatforming people from social networks, or editing reality itself are apparently all fine so long as it has no effect upon your own privacy, personal beliefs and desires.

But, oh hey, dear Apple? "Get the **** out of my property" appears to be a rallying cry everybody can agree upon. Who would've thunk? The backlash to #spyphone seems to span all generations. I'm amazed.

Apple on the wrong side of the ACLU is a first. Does that revoke their wokeness?

Genuinely curious, since you appear to have personal knowledge regarding Tim Cook: how would he feel about what Apple is trying to do to an entire generation of kids if it had been applied to him growing up in a Baptist family in Alabama? He didn't come out about his own sexual orientation until 2014.
 
Contradictions: for a start, "mass scanning of all images in the cloud" is actually a stretch of the situation, because MORE hardware scans will take place than cloud scans; logically there is far more material on devices, since some people do not use the cloud at all. A billion-plus devices, all carrying software to scan, irrespective of whether it is a hash check only, represent more surveillance than the assumption that every picture on every device ends up in the cloud.

"Apple identifies fewer of this harmful and illegal content than other cloud suppliers"

Evidence, please? Or could it be that Apple identifies fewer because it intends to use our hardware rather than the cloud? It figures that if they have not been scanning via the cloud, they would hardly be up with Google etc. in identifying such material via the cloud.

In any event the focus point isn't the nature of the scanning; it is the infestation of 1,000,000,000+ users' private hardware with software tools that can easily be modified, giving a backdoor into the devices of over 1,000,000,000 individual users and even allowing targeted modification. Because the software is on your system it bypasses System Integrity Protection, so modification is comparatively easy. And unlike a lot of Apple functions, which are designed with users in mind and often give the user the choice to switch them on or off, this software is not designed to assist the user at all, the very user who pays for the hardware.

Evidence: https://9to5mac.com/2021/08/20/apples-anti-fraud-chief-child-porn-statement/
 
A bit hard to read the message above, as you mixed your message in with your quotes, but agreed. Other than all that, Apple's SIP (System Integrity Protection), cryptographically sealed volumes, etc., blah blah, don't appear to actually work. I mean, Pegasus seems to punch through all that and turn it into Swiss cheese anyway, so along with Apple's "walled garden" it's just another narrative that has little to do with objective truth.

Build a better mousetrap, and shortly thereafter, a better mouse will emerge.
 
With that logic, there should be no parental control features on any software or service, since they can be abused by anyone with sufficient access to another adult's device or accounts.

I agree, there really shouldn't be. If such controls have to exist they need to be operated and verified by a third party.
 
Expect the class-action lawsuits to start flying if the allegations prove correct that Apple implemented this without notification, and that it was only noticed after the fact, hence the public announcement on it.

Oh Apple, you have really screwed up over this.


Worth learning to use Console or similar, depending on what device(s) you own.
Thanks for the link!
I watched the Craig Federighi interview in the other very useful link, where he tells the Wall Street Journal journalist that checking the images happens in the “pipe” taking photos from your iPhone to iCloud. In this article they say it happens on your iPhone, while everywhere else it has been said it happens in iCloud.
Three different versions.
So confusing! And if it happens on our iPhones then, really, Apple can go get 😡
 
But if Apple cannot legally have the raw images of CSAM, WHAT are they comparing matched images to? If they have nothing to compare against, then they need to make judgement calls. If it's any form of nude image with a subject that might LOOK young, will they say it's a positive match?

So you basically said if it LOOKS like CP. So if someone is 25 but looks 15, Apple will make a judgement call on that, when in most areas 17 years and 11 months is still considered a minor for CP purposes.

Yes.

Apple won't compare the image to anything. They will look at the image and make a judgement call on whether they believe it's CP. If they believe so, they are required to report it.

Nudity is not pornography. It has to be a visual depiction showing sexually explicit conduct or lascivious exhibition of sexual areas of the body involving a minor, or be virtually indistinguishable from a real minor.

If both the CSAM Detection system and Apple's human review make a mistake, innocent people will be reported.

For those people, an affirmative defense will be to show that no actual minor was used in producing the image, or that they have fewer than three such images.
 
My contribution for the day:



 

Attachment: Screenshot 2021-08-21 at 08.58.13.png
Nobody made any noise about Microsoft’s PhotoDNA being used by Gmail, Twitter, Facebook, Adobe, Reddit and Discord.



Apple’s version is better because it stops abusive images from being shared in the first place and gives the user the choice to disable it and change their behavior (i.e. please don't use our servers for your illegal behavior).
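
For anyone wondering what that difference looks like in practice: PhotoDNA-style scanning is essentially a hash-set membership test the provider runs on its own servers after upload, while the system described in this thread runs an equivalent check on the device, and only when iCloud Photos is on. A very rough sketch of the two shapes; all the names are mine, and a plain SHA-256 stands in for the perceptual hashes (PhotoDNA, NeuralHash) the real systems actually use:

```swift
import Foundation
import CryptoKit

// Rough illustrative sketch only. A plain SHA-256 stands in for the perceptual
// hashes the real systems use, and the hash list here is empty. None of this is
// Apple's or Microsoft's actual code.
let blockedHashes: Set<Data> = []

func matchesKnownImage(_ imageBytes: Data) -> Bool {
    blockedHashes.contains(Data(SHA256.hash(data: imageBytes)))
}

// PhotoDNA-style: the provider checks the photo on its own servers, after upload.
func serverSideScan(uploadedPhoto: Data) -> Bool {
    matchesKnownImage(uploadedPhoto)
}

// The approach described in this thread: the same kind of check runs on the
// device, and only as part of an iCloud Photos upload; turning iCloud Photos
// off means no check at all.
func clientSideCheck(photo: Data, iCloudPhotosEnabled: Bool) -> Bool {
    guard iCloudPhotosEnabled else { return false }
    return matchesKnownImage(photo)
}
```

The underlying comparison is the same in both shapes; the whole argument in this thread is about where it runs and who controls the switch.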
 
Yes.

Apple won't compare the image to anything. They will look at the image and make a judgement call on whether they believe it's CP. If they believe so, they are required to report it.

Nudity is not pornography. It has to be a visual depiction showing sexually explicit conduct or lascivious exhibition of sexual areas of the body involving a minor, or be virtually indistinguishable from a real minor.

If both the CSAM Detection system and Apple's human review make a mistake, innocent people will be reported.

For those people, an affirmative defense will be to show that no actual minor was used in producing the image, or that they have fewer than three such images.
Yes, but 17 years and 11 months is a minor in most places. So how can Apple make a judgement call in those cases?
 
If what you say is true, then why fight crime?
• Let's just get rid of Law Enforcement, because it's not going to make any difference.
• Don't put locks on your doors, because the burglars will break them anyway.


People downloading these images, either for a price or for free, are encouraging those who abuse children to provide more material.
By your logic, it shouldn't be illegal to carry a controlled substance with you as long as you don't give it to someone else. But it is!
I can carry a bag of weed and show it to an officer and be fine. It's a controlled substance.
 
Apple has explained how the system works very clearly, and made very vocal promises to not let the system get compromised or be abused by governments. All of the uproar is about what _could_ happen (aka. speculation), not what Apple has created.

1) The scanning / comparison of photos on your phone is 100% private. Nobody sees your photos.
2) Scanning only happens IF you have enabled iCloud Photos.
3) Only photos that "match" have a voucher attached. Nobody has seen your photos yet.
4) If you upload more than 30 violating photos, only then can the server-side system decrypt the safety vouchers, and only then is a *derivative* (low-resolution) version of the photo revealed, not the original. The original remains fully encrypted.
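
A rough sketch of what those four points amount to, as I read the public description. This is purely illustrative: the names (SafetyVoucher, matchThreshold, blockedHashDatabase), the SHA-256 stand-in for NeuralHash, and the plain Set lookup in place of the blinded database / private set intersection are all my own assumptions, not Apple's actual implementation.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only; see the caveats above.
let matchThreshold = 30   // per point 4: nothing is revealed below roughly 30 matches

func blockedHashDatabase() -> Set<Data> {
    // Placeholder for the on-device database of known-CSAM hashes.
    return []
}

struct SafetyVoucher {
    let photoID: UUID
    let encryptedDerivative: Data   // low-resolution derivative; stays unreadable below the threshold
}

// Points 1-3: the check runs on the device, only as part of an iCloud Photos
// upload, and only matching photos get a voucher attached.
func makeVoucher(photo: Data, id: UUID, iCloudPhotosEnabled: Bool) -> SafetyVoucher? {
    guard iCloudPhotosEnabled else { return nil }                   // point 2: no iCloud Photos, no scan
    let hash = Data(SHA256.hash(data: photo))                       // stand-in for a perceptual hash
    guard blockedHashDatabase().contains(hash) else { return nil }  // point 3: non-matches get nothing
    return SafetyVoucher(photoID: id, encryptedDerivative: photo)   // real encryption elided in this sketch
}

// Point 4: the server side can only act once the voucher count crosses the threshold.
func thresholdReached(_ vouchers: [SafetyVoucher]) -> Bool {
    return vouchers.count >= matchThreshold
}
```

The line that matters for the argument here is the first guard: in this sketch, turning iCloud Photos off means the hashing never runs at all. Whether you trust that to stay true is the policy question the rest of the thread is arguing about.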

We live in a world of "innocent until proven guilty", so Apple is innocent of any wrongdoing until there's proof that their system has been used outside of its primary purpose.

I choose to believe that Apple is not doing anything sinister here. It's a more advanced and more private means of detecting "bad" images. That is good for us, not bad.
So basically what you're saying is that this whole fiasco won't do anything to prevent child porn, but it will hurt Apple's reputation, and trust is hard to earn but easy to lose.
 
I don't think that is what is being said. What is being said is that this will cause people to think twice about using the Apple platform to store CSAM, which may have a positive impact on this issue. The whataboutisms make for great conspiracy theories but remain just that until and unless they're proven. And additionally, Apple is taking it on the chin for this stance.
 
I think you're wrong. This will do nothing but possibly push the creation of new CP. This is all about Apple trying to look good to the public, and the attempt is failing miserably.

They are taking it on the chin because they got caught lying about their privacy policy. Plain and simple.
 
Please explain how they got caught lying.
 
Yet it remains true. You need to upload it yourself to iCloud before anything leaves your phone.
They scan before it leaves your phone. So it’s a lie. Nothing else matters at that point.

Apple can scan to their little hearts' content once it's on their server. Not a moment before. Not until it is physically and entirely uploaded.
 
Why defend a company so hard? You're nothing but a wallet to Apple. Apple makes great phones.

They are feeling the punches every day now. Hope one really stings soon. Time for some change.
 