you are incorrect

having the database on device gives more security to the end user, not less. anything done "in the cloud" or on apple's servers is completely blind to us the users; images could be added willy-nilly, the database could be changed, and we the users would have no idea what is taking place

apple is specifically placing the database on-device because it gives us the users more transparency:

1) it is put on device only with a signed operating system
2) no remote updates are possible, which ensures that the same database is on every phone worldwide
3) this prevents apple from replacing the database with another database
4) apple will provide a root hash with every database, which will be published as a knowledge base article
5) the user can then inspect the hash on their own device to see that it hasn't changed
6) all of this can be audited by third parties
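to illustrate points 4 and 5: the check amounts to hashing the on-device database and comparing against the published value. here is a toy sketch in python — the file path and published hash are made-up placeholders, and apple's real scheme uses a merkle-style root hash over an encrypted, blinded database rather than a plain file digest, so treat this only as the general shape of the idea:

```python
import hashlib

def database_root_hash(path: str) -> str:
    """Hash the local database file in chunks and return a hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # read in 1 MiB chunks so large files don't load into memory at once
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_database(path: str, published_hash: str) -> bool:
    """Compare the locally computed digest to the published value."""
    return database_root_hash(path) == published_hash
```

any user (or third-party auditor) running the same computation over the same bytes must get the same digest, which is what makes a silent substitution detectable.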

can you see that if this is done in the cloud we have no transparency; we have no access to know what database is being used, how it is used, or what is in it

basically apple is placing the database on a table (the phone) and allowing the whole world to gather round the table and affirm that the database hasn't been changed or substituted, since all eyes are on it

all of this makes the process far more secure than doing the scanning in the cloud, where we have no idea how it is taking place
...but we do not trust you, so we store that database on your device and check every picture you are uploading to our servers. You might be a liar, so you will get checked on every upload, slomojoe. If there is a false alarm, we will review your images and approve them. You will not notice that if you are not cheating on us. We will not tell you about your private pictures our clerks were glaring at (...nice butt btw). So just sit back and relax.

Regards Apple Control
 
Apparently, if you do not upgrade to iOS 15, you may continue to upload CSAM as you wish. I assume APPLE will "update" earlier iOS versions to accommodate this CSAM software. So, for now, continue on iOS 14 or elect not to upload to iCloud in iOS 15. Problem solved for you, at this moment. It still leaves that code on your iPhone for future exploitation, doesn't it? Either we all stop updating now, or we are stuck with the "backdoor".

Who knows what's in all that code that runs our phones, but APPLE has sure drawn our attention to it, haven't they?

It seems to me the solution is for APPLE to backtrack a bit. Admit they didn't anticipate the level of backlash, and simply scan ON THE SERVERS like everybody else does. Even if they truly believe everything they say about present and future protections, etc, it should be obvious to them that a large portion of their customers don't. Many of us will never take at face value their insistence that they truly value our privacy and will never breach it. Probably true today, but who knows how the next CEO feels about potentially losing the China market or the EU market or the Indian/Pakistani market?

Code resident on the phone that can scan for any content deemed "security of state" worthy, that we don't want on the phone, and that is not necessary to the operation of the phone, is crossing the Rubicon. I hope they recognize this and rethink this whole approach to CSAM.
 
The pushback is about wanting the back door not to be installed in the first place; once it’s in place you will not know if it’s being used nefariously…. People like Snowden who blow the whistle on that type of stuff are pretty rare. Once people grant access, that’s pretty much it… you’re monitored for as long as you participate in the ecosystem
so then you would rather have it on icloud servers and completely in the dark ? having no knowledge of what is being scanned or compared against, having no knowledge of what kinds of images are being scanned ?

also, calling this a backdoor is really an inaccurate description of what is happening

apple is placing a database on our phones in full view of the users and the authorities; it is unable to be modified or changed since the hash is provided for everyone to see. we know that apple has obtained the images from 2 different databases, so we know it is authentic csam and not any other kind of material

it is figuratively sitting in the glare of a large spotlight ... to call this a backdoor is really entirely inaccurate
 
yeah, totally, does facebook or google lay out in detail how they are scanning for csam ?

i don't think so. their system of server-side scanning is infinitely more opaque, dangerous and open to considerably more abuse. they may only be using 1 database of csam, which could be corrupted, and they are not detailing how they are getting images or using and comparing them

apple is using only those hashes which appear in 2 separate databases, which ensures that the material being compared against is actual verified csam
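the "must appear in 2 separate databases" rule is just a set intersection. a toy sketch (the hash values here are entirely made up, and the real databases use perceptual image hashes, not short strings):

```python
# toy sketch: only hashes present in BOTH source databases make it
# into the on-device blocklist (values below are invented)
ncmec_hashes = {"a1f3", "b2e4", "c3d5", "d4c6"}
other_db_hashes = {"b2e4", "d4c6", "e5b7"}

# set intersection keeps only the entries both sources agree on
on_device_set = ncmec_hashes & other_db_hashes

# a hash appearing in only one source is excluded, so a single
# corrupted or coerced database cannot inject entries on its own
```

the point of the intersection is that no single jurisdiction's database can unilaterally add material to the list.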

apple is putting the database in plain sight, providing the hash for all to see, and allowing 3rd-party audit; we the users can check the hash on our own device

if the nsa really wants to use a backdoor via fisa they can damn sure do it anytime they want. this attempt by apple to deal with csam neither stops them nor makes it any easier for the nsa to do what it wants, they'll just do it

i am afraid we are seeing a river of paranoia here along the lines, as you say, of "anything is possible", yeah anything is possible, tim cook may be the manchurian candidate, an nsa stooge

i don't think people are taking into account just what it would take for governments to pass legislation permanently breaking encryption and security on phones

we need to find a new way forward that balances the needs of government and the rights of users and from what i see apple is trying to do just that

in the end it all comes down to trust, if you really don't trust apple then go somewhere else, how about google :)

i think we have no choice but to trust and take the tradeoff of transparent, verifiable on-device scanning for the e2ee that i believe apple is working toward
We do have one temporary choice: don’t upgrade and don’t use iCloud. It might open the door for smaller privacy-based companies to emerge in the market; obviously Google and Apple are about all there is right now, but that could change. I imagine my iPhone 12 will run several years on iOS 14, so it’s not like I have to find alternatives immediately
 
true, i am, but it seems logical: apple continues to add as much privacy to their products as they can, and e2ee seems a logical goal
I would have thought they would announce it at the same time as the on-device scanning. Now, I don't think it's possible, with their biggest market, China, probably not allowing it (and I wouldn't bet on the U.S. government allowing it either). In any case, I don't trust Apple anymore, but that's certainly not unique!

I don't trust Google or Samsung either. No expectation of trust kind of makes it easier to buy things, so I pre-ordered a Samsung flip when I could. :)

I'll still have an iPhone, but probably not use it as much as before.
 
...but we do not trust you, so we store that database on your device and check every picture you are uploading to our servers. You might be a liar, so you will get checked on every upload, slomojoe. If there is a false alarm, we will review your images and approve them. You will not notice that if you are not cheating on us. We will not tell you about your private pictures our clerks were glaring at (...nice butt btw). So just sit back and relax.

Regards Apple Control
no one likes to be assumed a criminal and child pornographer, which i grant you is what it feels like

but we need to balance a lot of competing needs here and apple is trying to do that, yes, we need to trust apple, but we trust all kinds of people every day of our lives and this is one more level of trust

i do think apple will do its best to maintain trust with users and i also think that all operating systems, phones, tablets etc are going to have to develop trust with users if they want to maintain their user base and market share

trust is currency

ps. i see your arsenal of apple gear, wow, are you really going to just up and leave apple ?
 
Apparently, if you do not upgrade to iOS 15, you may continue to upload CSAM as you wish. I assume APPLE will "update" earlier iOS versions to accommodate this CSAM software. So, for now, continue on iOS 14 or elect not to upload to iCloud in iOS 15. Problem solved for you, at this moment. It still leaves that code on your iPhone for future exploitation, doesn't it? Either we all stop updating now, or we are stuck with the "backdoor".

Who knows what's in all that code that runs our phones, but APPLE has sure drawn our attention to it, haven't they?

It seems to me the solution is for APPLE to backtrack a bit. Admit they didn't anticipate the level of backlash, and simply scan ON THE SERVERS like everybody else does. Even if they truly believe everything they say about present and future protections, etc, it should be obvious to them that a large portion of their customers don't. Many of us will never take at face value their insistence that they truly value our privacy and will never breach it. Probably true today, but who knows how the next CEO feels about potentially losing the China market or the EU market or the Indian/Pakistani market?

Code resident on the phone that can scan for any content deemed "security of state" worthy, that we don't want on the phone, and that is not necessary to the operation of the phone, is crossing the Rubicon. I hope they recognize this and rethink this whole approach to CSAM.
Yeah, was just saying that’s my plan, staying on iOS 14 and disabling iCloud… I think iOS 14 is going to become a feature for used iPhones; they will cost more than the upgraded ones
 
We do have one temporary choice: don’t upgrade and don’t use iCloud. It might open the door for smaller privacy-based companies to emerge in the market; obviously Google and Apple are about all there is right now, but that could change. I imagine my iPhone 12 will run several years on iOS 14, so it’s not like I have to find alternatives immediately
sure there are alternative places to upload photos securely and we can all do that or build a home cloud or ??

i just don't think you are going to find absolute trust anywhere else in technology, risks are present everywhere

to me, transparency, openness and communication between the company and its users is the way to go. i think apple wants the loyalty of its users, and up to now they have earned it; no other company has taken privacy as seriously as apple, so we need to give them the benefit of the doubt
 
I don't trust Google or Samsung either. No expectation of trust kind of makes it easier to buy things, so I pre-ordered a Samsung flip when I could. :)
Indeed. I don't trust Google and I trust Samsung even less. So my smart watch days, at least, are probably over for good. I'll probably move to Android phone and tablet. I'll pick devices that have LineageOS distros in case I feel I need to go that way. (I'd put CyanogenMod on my Samsung tablet before I switched to an iPad.)

Loading something like LineageOS and shunning all possible Google services, I think it'd be possible to approach Apple-privacy-that-was on an Android device.
 
Indeed. I don't trust Google and I trust Samsung even less. So my smart watch days, at least, are probably over for good. I'll probably move to Android phone and tablet. I'll pick devices that have LineageOS distros in case I feel I need to go that way. (I'd put CyanogenMod on my Samsung tablet before I switched to an iPad.)

Loading something like LineageOS and shunning all possible Google services, I think it'd be possible to approach Apple-privacy-that-was on an Android device.
My Apple watch is something I'm not ready to get rid of, it's too important! So I'll have an iPhone too, but probably not as my primary.
 
no one likes to be assumed a criminal and child pornographer, which i grant you is what it feels like

but we need to balance a lot of competing needs here and apple is trying to do that, yes, we need to trust apple, but we trust all kinds of people every day of our lives and this is one more level of trust

i do think apple will do its best to maintain trust with users and i also think that all operating systems, phones, tablets etc are going to have to develop trust with users if they want to maintain their user base and market share

trust is currency

ps. i see your arsenal of apple gear, wow, are you really going to just up and leave apple ?
... I always act in a reasonable way. I will not throw away that Apple stuff, but I'm trying not to just say "ok, just do it". Let them check that stuff on their servers with the famous database and/or find other ways to control what obviously needs to be checked.

It is a first step into something we might not want to have...
My first step will be, not to upgrade to iOS 15 and first see what happens.
 
My Apple watch is something I'm not ready to get rid of, it's too important! So I'll have an iPhone too, but probably not as my primary.
I understand completely. I loved my Apple Watch. (Just upgraded to an SE in November.) My Apple Watch was on my wrist 24x7x52, except when it was charging. But now it's turned off, sitting in a drawer, and there I expect it will remain until I put it up for sale. I'm currently thinking about what I want to wear for a wrist watch.

I think I will miss my Apple Watch most of all :(
 
Yes, interesting, but they have no reason for this to be used at the user end on users' hardware, which is surveillance. If they so wish, use it on their own servers; kicking against that sends out signals that it's nothing to do with child safety and everything to do with being the forerunner to a backdoor.
I don't disagree with you. Just commenting on the technical sophistication of the cryptography they developed. Perhaps that's the reason why they couldn't see the forest for the trees (and completely misjudged how it would be received). It looks to me like they drank their own Koolaid. They have been focused very strongly on doing other things on the device instead of the cloud (which in other contexts is great for privacy) and perhaps didn't fully understand that this is something completely different.
 
Going into everyone's iCloud seems the wrong approach. But I like the idea of being able to turn on a parental control for my child's phone that would screen photos in/out. Then if I am alerted about something, I could view and then report the user if they were trying to send something to my child. Apple could then take steps screening a particular person. I guess they could also screen photos that anyone sends to someone else (since they are shared), but just screening all photos you have stored in your own account does seem like an invasion no matter how good-intentioned it is.
 
Well, there are many security experts and privacy advocates who are questioning it, in opposition to it, etc.

And we have not been standing on the slope as noted above. Rather, we have been sliding down the slope, gaining speed, and will in the not too distant future reach a point of no privacy: phones, surveillance cameras, unknown search, seizure or imprisonment. Hopefully the opposition to arbitrary and overly capricious actions by both government and corporations will increase.

We have reached this point because most people believe that life can be and should be 100% safe, and are willing to give up freedoms rather than protect them. And in concert, they believe being able to send emojis with ease is more important than privacy. Do not misunderstand me, I am not subscribing to tin hat theories, only describing what is being done by legislation, often well intentioned, but with many unforeseen consequences.
You really think we have slid further down the slope since the day before Snowden dropped his bombshell? Back then the NSA had everybody's pants fully down to our ankles. Apple and others were rightly furious and since then governments have been begging. Not begging for new surveillance tools, begging for Apple and co. not to shut down the ones they had. But Apple have retaliated with encrypted phones, self-erasing phones, encrypted iMessages and they are not stopping there. This is a minor concession to some genuine law enforcement/political concerns, but I fully expect the next announcement to be another step back up the slope with e2ee Photos. (Because without that, there is literally no point to this system.)
 
I think I'd rather have safety vouchers uploaded to Apple vs letting them have full access to all of my photos in the cloud, but hey that's just me. I don't collect CSAM, so Apple will never know what's on my iPhone.
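The "safety voucher" idea depends on threshold secret sharing: the server can only reconstruct the key protecting the voucher contents once the number of matching images crosses a threshold. Here's a toy Python sketch of Shamir secret sharing, which has that same threshold property — the prime, the threshold value, and the use of plain Shamir shares are illustrative assumptions; Apple's actual construction (threshold PSI with different primitives and a much higher threshold) is considerably more involved:

```python
import random

P = 2**31 - 1   # toy field modulus (a Mersenne prime)
THRESHOLD = 3   # shares needed to reconstruct; illustrative only

def make_shares(secret: int, n: int, k: int = THRESHOLD):
    """Split `secret` into n shares; any k of them reconstruct it."""
    # random polynomial of degree k-1 with the secret as constant term
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse of den (Fermat)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

With fewer than `THRESHOLD` shares, the polynomial is underdetermined and every possible secret remains equally likely, which is why a handful of false-positive matches reveals nothing.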
 
You said "People who are into pornography of any type are usually addicted to it and collect hundreds and thousands of images...", which is absolutely ridiculous. So I laughed at you.

Except it's not ridiculous. It's the truth. You do realize I said "hundreds AND thousands", not "hundreds OF thousands", right? In other words, most people downloading and sharing pornographic images probably have at least 100, and many have 1000 or more.
 
While I'm "not happy" about this scanning, I'm also not throwing away Apple gear and rushing to get Android (or Linux). I don't buy all the slippery slope arguments, although it is really tempting to dive into the whataboutisms that these arguments represent. I'll see how this starts to shake out before deciding whether I buy a Galaxy Note or iPhone 13.

However, I'm wondering: this is 2021, do people actually ask for a recommendation of whether they should get an iPhone or Galaxy, or iPad or Tab, or Mac or Windows? I used to be the go-to person for that type of information, and it's been 5 years since I've been asked any questions about "which gear is best for me?"
You are assuming that people will have to ask before former Apple fans say anything.
 
You are assuming that people will have to ask before former Apple fans say anything. Not true. This travesty by Apple has made many ANTI-Apple evangelists to their friends and family. After spending so many years urging them to GET Apple products, they now feel they have a duty to tell/warn them.
Many in relation to 1 billion? That probably amounts to a hill of beans. I’ll bet the next fiscal quarter is going to be a blockbuster. People assume, imo, that everybody feels the way they do. Not so.
 
You are assuming that people will have to ask before former Apple fans say anything. Not true. This travesty by Apple has made many ANTI-Apple evangelists to their friends and family. After spending so many years urging them to GET Apple products, they now feel they have a duty to tell/warn them.
I don't even bother trying to convince anyone anymore. It's pointless. Most people have made up their own minds about something and will plug their ears and scream (figuratively) when you try to tell them about anything else.
 
I think I'd rather have safety vouchers uploaded to Apple vs letting them have full access to all of my photos in the cloud, but hey that's just me. I don't collect CSAM, so Apple will never know what's on my iPhone.
Yes, the "I have nothing to hide, so it’s all good" argument…
 
so then you would rather have it on icloud servers and completely in the dark ? having no knowledge of what is being scanned or compared against, having no knowledge of what kinds of images are being scanned ?

I think every major tech company has already been scanning pictures. For years. Providers don’t want to be liable for hosting illegal content.

I wonder how many people actually use iCloud for illegal content like child porn. They must be incredibly stupid.
 
Going into everyone's iCloud seems the wrong approach. But I like the idea of being able to turn on a parental control for my child's phone that would screen photos in/out. Then if I am alerted about something, I could view and then report the user if they were trying to send something to my child. Apple could then take steps screening a particular person. I guess they could also screen photos that anyone sends to someone else (since they are shared), but just screening all photos you have stored in your own account does seem like an invasion no matter how good-intentioned it is.
From what I understand, it doesn't screen photos in. Just out. So if your kid is about to send something that the apple AI feels shows too much skin, it warns them, and if they click "Send anyway", it sends you a copy of the picture.

Which could, possibly, put you in possession of child porn yourself....

I haven't read anything about it scanning incoming nudes, etc.
 
From what I understand, it doesn't screen photos in. Just out. So if your kid is about to send something that the apple AI feels shows too much skin, it warns them, and if they click "Send anyway", it sends you a copy of the picture.

Which could, possibly, put you in possession of child porn yourself....

I haven't read anything about it scanning incoming nudes, etc.
That’s actually two different programs Apple launched at the same time; Apple was trying to blame that for the confusion… however, they completely missed why people were complaining…. I'm not sure they aren't still in the dark
 