Meanwhile, a member of another forum reports that his wife, who uses Apple products (he does not), told him when informed of what Apple is doing that she won't replace what she has, but will buy no more Apple kit. My wife is taking the same approach.
Good. Freedom of choice. Stand up for what you believe in.
 

"Does ’s new on device CSAM policy affect your decision to upgrade?"​


Not in the slightest. Not because I agree with the program, but because I don't see why this can't be added to iOS 14, or earlier iOS versions. I'm expecting this to be the policy from many tech companies going forward, so I think the best approach will be to mitigate and limit exposure, instead of pretending I can protect myself from these new privacy invasions by holding on to older hardware.
 
I keep having to remind myself that I don't know what people are really thinking beyond what they are posting. But, at first blush, almost all the posts are of the "what about me" variety. Admittedly, privacy is an important social issue, but almost all posts are stated as if the only concern of the posters is how it affects themselves.

Even if you don't believe in the viability of conscious capitalism (or whether Apple practices it), there is the chance that Apple is testing a slippery slope with some imagined harm for the sake of addressing some real harm. Only a few posts mention this real harm as part of what influences the poster's stance.

We put our lives in the hands of tech companies all the time, with limited or no awareness of the risks to ourselves. Most of those companies don't even pretend to care about us (in any believable way). We do this largely to benefit ourselves.

Apple intentionally pushed this change to front and center in the news. They're introducing a risk to their consumers that doesn't directly benefit their consumers. I hope they didn't overestimate the nobility of their consumers, but they might have. Certainly their PR skills weren't up to the task.

The technical points raised in this thread are good ones. I hope Apple pays attention and provides more clarity over time with regard to how it will work and the safeguards that will be put into place. People should be able to make informed choices. At this point, some people have arrived at their positions without understanding what is really happening.
 
This came out a month after the DNC, a political organization, said they were going to try to work with carriers to scan SMS messages to combat vaccine "misinformation."
I believe that is a misunderstanding: the plan was to work with SMS providers to combat misinformation, and I do not believe individual texts are to be scanned.
 
https://www.apple.com/child-safety/...del_Review_of_Apple_Child_Safety_Features.pdf This is an excellent read. I hope you read the whole thing. It explains everything, down to how they plan to keep the system safe from abuse, and it also provides the root hash of the encrypted database so that anyone can audit it for changes. The full root hash of the database will be shown in the Settings app, and you can check it against a knowledge base article.
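The "root hash anyone can audit" idea works like a Merkle tree: one hash commits to the entire database, so changing any entry changes the published root. Here is a minimal sketch of that property, illustrative only; Apple's actual construction is not described in this thread and may well differ:

```python
import hashlib

def merkle_root(leaves):
    """Compute a single root hash over a list of byte-string entries.

    Any change to any entry changes the root, so publishing the root
    lets anyone verify that the database has not been altered.
    """
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    if not level:
        return hashlib.sha256(b"").digest()
    while len(level) > 1:
        if len(level) % 2:            # duplicate last node on odd-sized levels
            level.append(level[-1])
        level = [
            hashlib.sha256(level[i] + level[i + 1]).digest()
            for i in range(0, len(level), 2)
        ]
    return level[0]
```

Auditing is then just recomputing the root over your copy of the database and comparing it byte-for-byte against the published value.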
 
No it was to be individual texts. That’s why I was thankful for iMessage until the announcement that it now has a crack in it in iOS 15. Looks like people are going to have to get all of their contacts to download Signal.
 
In respect of Signal, I am actually more comfortable with the Russians scanning and reading all my messages than I am with the US government doing so, or Apple doing so as an agent of the US government. The Russians owe me no Fourth Amendment rights against search and seizure; the US government, on the other hand, does — at least it used to. The Russians cannot suddenly decide to prosecute me for ideological reasons; the US government is doing so to others right now.
 
Still will buy an iPhone 13 mini since I'm on an ancient phone, but I really, really dislike the direction this is going.
 
Seeing that they've proven over and over again that they're incapable of releasing bug-free new software, there's no reason to believe that when this surveillance software goes live there won't be some major screw-ups.
If their track record is any guide, it's guaranteed not to work as intended.
 
Yup.
Apple either cares about Privacy or they don’t.
If they don’t backpedal on this I won’t be buying another Apple product.
They won’t unlock iPhones even with a warrant or court order, but they’ll run searches of content on my device in a black box without warrant or court order?
No.
 
I actually have no device to upgrade either way. I entered the Apple ecosystem rather recently and my devices are still new enough not to need an upgrade, so I will observe the situation for the next 2-3 years.

I am against CSAM, and I agree this is one thing (of many) that we as a society need to tackle. As a former software developer, I also think technology can help law enforcement catch criminals, and maybe in the not-so-near future (currently just in the sci-fi realm) even help with crime prevention. That being said, I am not sure I want big tech companies (like Apple) deciding on their own to deploy their own solutions to such issues. I would much prefer tech companies be contracted by government/law enforcement when such solutions are needed. That way there is more transparency and we know what to expect; the obligation comes with the law, not with the brand of device I choose. Law enforcement has the law behind it to do things like this; tech companies sit more in the gray area of whether they are allowed to or not.

I read the white paper, and I do want to point out a few technical details I picked up, as I do not see all of them mentioned in the comments.

  1. Apple does not have access to our photos in plain view; they can only decrypt the data once a certain match threshold is met.
  2. Scanning could be done in two ways:
    1. By checking the actual content (implies that whoever is doing the scanning has full access to our data during the scanning process).
    2. By generating a hash based on the content and using it to draw conclusions (implies that whoever is doing the scanning does not see the actual data, only specific derived attributes).
    3. The second approach is less intrusive when it comes to privacy.
  3. The detection works based on a machine-learning model.
    1. This model cannot be repurposed right away to detect something else; that is not how it works. It takes a lot of test data, and thus time, to train a model to detect something new. In this case Apple has used its consumers (people like you and me) and their photos to generate hashes and train this model.
      1. As a note, this is also how Scribble and the new text recognition in images work. In other words, be aware that we are helping Apple improve their algorithms as much as they help us with their solutions.
I am concerned that we might not have a choice in the future, and that this kind of scanning might get mandated and established through government regulation. I personally find that more concerning than pointing the finger at Apple. The more technologies and devices develop and improve, the more law enforcement will think about incorporating technology into its work. On one side, as a developer I feel good that we software specialists can help make the world a better place; on the other, I am concerned about what happens when we really screw things up, because let's face it: we are people and we make mistakes.

The white papers are, by the way, an interesting read, but I am not sure they are for everyone, and I do not feel Apple managed to address people's concerns in a constructive and efficient way. I doubt everyone will want to read the papers or go into the details.
 

My post was kind of a lay of the land impression; people mostly just care about themselves and Apple implemented something that tries to address an issue that its customers mostly do not care about. These customers won't fully relax until they're given rock-solid proof that they've given up nothing.

Apple will never be able to give this kind of proof. The paper you linked to is excellent, but not good enough for everyone. Some people will need more detail and some will need less. Most people will put very little energy into learning what's been implemented; outrage is just too easy and enjoyable.

My personal stance is that Apple has earned my trust. I'm willing to compromise and accept some risk in order to help others. I believe Apple will do a good job in minimizing the risk to me.

Many tech companies violate your trust silently. Apple clearly advertised what they're doing. No good deed goes unpunished.
 
People give them hell for being open and honest about what they're planning to do. They could've just snuck it in with a tiny blurb in the terms and conditions, and most people wouldn't bat an eye; but as soon as someone found out, there would be even more backlash for being sneaky about it. It's a lose-lose, but I think Apple did the right thing by talking openly about it.
 
In respect of Signal, I am actually more comfortable with the Russians scanning and reading all my messages than I am with the US government doing so, or Apple doing so as an agent of the US government. The Russians owe me no Fourth Amendment rights against search and seizure; the US government, on the other hand, does — at least it used to. The Russians cannot suddenly decide to prosecute me for ideological reasons; the US government is doing so to others right now.

Precisely. This is counterintuitive to so many people but it actually makes a ton of sense.
 

You can't accuse the critics of this technology of "mostly caring about themselves" before making the statement "I'm willing to compromise and accept some risk" because the gaping caveat in that position is that by making the compromise you are also forcing others to make the same compromise too. This is not some opt-in technology, this is for everyone. It's not just you compromising and accepting some risk, it's others too. Forcing others to adopt a surveillance system they oppose because you personally don't have an issue with it puts you in the category of people that mostly care about themselves.

If you're willing to sacrifice everyone's privacy (not just yours remember) because it might "help others" then why don't you oppose the technology considering it can be used to harm others? What about all the whistleblowers, activists, journalists, and technologists that are being targeted by existing surveillance technology initially introduced to "only target terrorists" (which turned out to be a blatant lie)? That's one of the major reasons I'm opposed to it, not just because of my personal privacy, but because the privacy of other people more important than me is at risk.

Your personal stance has implications for others.

Many tech companies violate your trust silently. Apple clearly advertised what they're doing.

So violating your trust is okay so long as the violator announces what they're doing first? I don't buy that. Apple have earned your trust? When? Was it when they backtracked on encrypting iCloud backups after the FBI asked them not to (not even a judicial request, just a casual ask)? Was it when they handed control of iCloud encryption keys to the Chinese government after being told to?

Apple's history, and corporate history in general, tells me they'll do whatever the authorities want them to do so long as it doesn't hurt their profits. I don't trust them at all and never will; to do so would be naive and illogical.
 
There is not one picture or video on my phone or laptop that I would be afraid of anyone seeing.

Good for you. Can you say the same for activists on US soil? Can you say the same for gay people in Saudi or Russia? Can you say the same for journalists in China? You're not afraid today because the current powers that be in whatever country you live in haven't come for you yet. Once they do they'll have all the infrastructure in place to make their job easy. I wonder how afraid you'll be then.

This issue doesn't just affect you directly; it affects everyone with a smartphone, whether people like to recognise that or prefer to pretend authoritarian mission creep doesn't exist. This sets a precedent that scanning on your local device is okay.

Oh and I'll take you up on your challenge. Create an unencrypted backup of both your phone and laptop, put them on a hard drive (I can provide it for you if you want) and send it to me. I'll take a browse around your files just for fun. Make sure to include all the login information for all online services you use (including the ones your wife doesn't know about) because clearly you don't have anything to hide.
 
I'm talking about children being harmed.
 
With respect - people focusing on the specific case of scanning for CSAM are missing what the story is.

What's really happening here is Apple are putting a tool onto users phones whose purpose is to compare users private data against black box third party databases.

What it's used for is simply subject to Apple policy and what local jurisdictions might force them to use it for.

Installing a tool like this on user devices is a huge mistake.
I was going to say I don’t care because my photos are all of mundane family life and pets and I’m frankly more concerned about the truly invasive information that the law deems “public information” and makes readily searchable.

And every service I’ve ever used apparently scans photos. It’s understandable none of these companies want criminal content on their servers, especially criminal content that harms and exploits children.

But you’ve stated in very clear direct terms why this is something to care about, even if all I have are mundane photos, many of them of my finger, ever since camera bumps got big enough to leave no place to grip the phone. :rolleyes:

Ideally our stuff wouldn’t ever get scanned on our devices unless a warrant is issued due to probable cause. On their servers, well that’s their property and their rules. I think Apple feels our phones are their property because they own the OS. At least that’s how they’re acting.

I don’t understand why Apple’s not just doing it the way they had been and the way Google and FB still do it.
 
I don't think the whole CSAM thing will impact any decision making I make on my next phone (currently using an 11 Pro Max which is still doing everything I need it to do). I mean, putting aside that I don't use iCloud for photo storage, and the fact that I'm not a raging paedo, the one thing that stands out for me is that it's one less thing that differentiates iPhones from other smartphones. I've always maintained that some of the key benefits of iOS/iPhones over their Android counterparts are hot-ticket items such as the Play Store (absolutely riddled with garbage), Android Auto (I've used both recently and CarPlay is by far the more polished of the two) .. and privacy. And yes, while Apple are not *currently* going all out selling our data, or opening it up to any government that asks, this has highlighted that what was once a key benefit of the ecosystem - privacy - is not inviolate.
 
Joplin looks exactly what I want in a notes app. Thanks again!
You are welcome. Since Joplin doesn't offer an app passcode lock once you are logged in, I took the following steps. You can do this for any third-party app. I currently use it for Joplin, Agenda, and Apple Mail.

1) Open Clock app and select Timer

2) Select "When Timer Ends," scroll down to the bottom, and select "Stop Playing"

3) Select Shortcuts app and click on +

4) Create Personal Automation

5) Scroll down and select app

6) Select the application you want to lock, select done and then click next

7) Under Action - start timer and set to 1 second and click next

8) Make sure "ask before running" is turned off

With this action, if someone has your phone and decides to see what Joplin is and what's in it, if your phone doesn't recognize their face, it will prompt them for your phone passcode before giving that person access to the app. A person can't see much in 1 second. When you close Joplin, it opens to the last note viewed. If you wanted to be uber private, you could always make sure the last page viewed is a note with something stupid for any potential 1 second viewers.
 
All the people claiming they're leaving the ecosystem make me laugh. Where are you going to go? Google? Hah. Good luck. It's even worse over there. The only difference now is that Apple are disclosing this whereas other companies don't.

Google, Microsoft, Facebook, Twitter, Banks, etc... they've all been doing this for years and most people don't even know it or have their heads so far in the sand that they don't want to believe it.

If those who are that bent out of shape about this really want their privacy back it goes way beyond your cell phone. We've gotten to a point in society where there really is no turning back. All you have left is the ability to bitch and moan about it because there is no stopping it.

The problem, I think, most people have is that Apple is so deeply trusted by users and owners of Apple solutions. The fact that Apple planned for this in their TOS back in 2018/2019 and avoided mentioning it at the last WWDC or any iOS 15 event seems shady, not unlike the battery-health metering they got pinched for. It's worse because Apple is not ASKING end users; Jobs' critique on privacy at D8 (All Things Digital with Walt) is super famous and ingrained.

Apple isn't following their own holier-than-thou privacy page, which Cook tends to highlight at every milestone chance he can get; somehow this has been forgotten. Add to this that our iCloud info on their servers is still not fully encrypted.

This is different from Apple trying to refashion the old "Microsoft has to lose for Apple to win" behaviour/mindset, which lasted even into the last year in Apple commercials poking fun at Microsoft or Windows.

Samsung and Microsoft have been making big strides toward a synergistic daily life between Android and Windows. They're learning and getting better.

Heck, even Dell's XPS lineup has the sex appeal and the best of Intel's mobile chips. They don't put AMD chips in their top line of laptops, which is odd to me. But the build quality and design have never looked this good, and the option to run Ubuntu fully supported on the Developer Edition has been a Dell thing for years now in the XPS 13" lineup.
 