What I find interesting is that if law enforcement or the security services wanted to do what Apple is going to do, they would have to get court orders and warrants to search a person's private data, but all Apple has to do is change the wording in its Terms and Conditions, and then it can look through users' private data as much as it wants, with no court order or warrant required.
 
This is just your completely arbitrary way to put it, though.

With new technologies and the vast prairies of the internet, pedophiles have enjoyed new ways to carry on their criminal activity. The number of CSAM incidents reported by Facebook is there to show it.

On the other hand, it seems fair that new technologies may also enable new ways to track them down.

You can’t just cop out and label all of this as “we are all suspects” using last-century reasoning.

The tech behind this is more nuanced than that.
Technology, or rather in this case coding, is point-blank, specific, and straightforward.

Human interpretation of technology can be opinionated. No one is copping out besides Apple. If the government were really concerned about CP, Congress would invest heavily in an online task force that sniffs out pedophiles.
 
What I find interesting is that if law enforcement or the security services wanted to do what Apple is going to do, they would have to get court orders and warrants to search a person's private data, but all Apple has to do is change the wording in its Terms and Conditions, and then it can look through users' private data as much as it wants, with no court order or warrant required.

That is what Apple and every other cloud host already do.
Private data is actually scanned only if it is uploaded to Apple’s servers.
The police need no warrant if you show your private parts while on police station premises.
 
Technology, or rather in this case coding, is point-blank, specific, and straightforward.

Apparently it’s not, since much of the confusion and drama stems from people incorrectly stating what this system actually does.

“Government could invest in other ways to do it” is whataboutism.
 
Apparently it’s not, since much of the confusion and drama stems from people incorrectly stating what this system actually does.

“Government could invest in other ways to do it” is whataboutism.
No, it's not. It's the government's responsibility. The best way to catch pedophiles is to pose as bait.

I don't think the Apple police will be very successful at catching pedophiles, since they announced that the scans will arrive in a not-yet-released update. If the pedophiles are not stupid, they will remove incriminating content from their phones.

You're really missing the point. You keep circling back to CSAM. No one is for CSAM. We're disappointed that Apple is giving up privacy in order to create a utopia. That's never going to happen. Crimes will continue to occur while good, law-abiding citizens become more and more regulated.
 
Why can’t I deposit a suitcase full of cash at my bank without raising a big red flag and possibly being reported? That’s MY money.

Why should I go thru airport security?

Why check my driver license, you think I would drive without a license?

Why check my vaccination status? You think I would spread the China virus?

I’m tired of feeling like a suspect all day. What happened to the presumption of innocence? We law-abiding citizens get regulated more and more, whereas the wrongdoers slip thru the cracks anyway… if only there were ways and figures to scientifically assess the effectiveness of such measures…
 
Why can’t I deposit a suitcase full of cash at my bank without raising a big red flag and possibly being reported? That’s MY money.
The bank is similar to the iCloud Photos servers. Apple should definitely scan for and root out CP on its servers. The money, prior to the deposit, is not deemed illegal. No one is a suspect for having cash on hand.

Why should I go thru airport security?
Likewise, you're utilizing a service. There are rules. iCloud Photos is the equivalent service in Apple's case.

Why check my driver license, you think I would drive without a license?
Driving is a privilege and also a danger to others. Therefore it should be regulated. This is parallel to Apple's iCloud Photos.

Why check my vaccination status? You think I would spread the China virus?
Vaccines save lives. Not sure why this is being brought up. It does not have any similarities with the Apple intrusion. Also everyone should get vaccinated.

I’m tired of feeling like a suspect all day. What happened to the presumption of innocence? We law-abiding citizens get regulated more and more, whereas the wrongdoers slip thru the cracks anyway… if only there were ways and figures to scientifically assess the effectiveness of such measures…
That's the issue. It's always the good people who are burdened. None of the measures you ask about directly regulates the source (in Apple's case, the photos).
 
Better DeInternet, it's for the best.

That's what Google would love to make you think: that they are the internet. I assure you there are a lot of people who use the internet while completely blocking any Google services.

Heck, there are people who do not use closed-source software at all.
 
Tim Cook... privacy Privacy PRIVACY... is a lying hypocrite.
Apple has lost our trust. This time it's obvious to all.
This is not a backdoor; this is a front door that will be opened tomorrow.
 
only those meant to be uploaded to iCloud.
So far.

It’s no different from google photos doing the same thing to photos after you upload them.
It's quite different. The scanner is taking up space on your phone rather than on a server. It's draining your battery, and I once thought my phone was private.
 
I agree. We have no idea how far Apple may take this, and some can imagine the worst, and I have nothing to say to that except “maybe it will happen, maybe it won’t”.

It's quite different. The scanner is taking up space on your phone rather than on a server. It's draining your battery, and I once thought my phone was private.
You will likely never notice the hit to battery life (my guess is that this process will be optimised for iPhone processors). And well, if the thought of this continues to disturb you, perhaps start to look at other brands of phones.

The phone may be yours, but the software continues to be Apple’s to modify as they deem fit.
 
Tim Cook... privacy Privacy PRIVACY... is a lying hypocrite.
Apple has lost our trust. This time it's obvious to all.
This is not a backdoor; this is a front door that will be opened tomorrow.
Yes, somehow it feels like Apple held out quite long but finally couldn't resist the temptation of abusing their position: spying on the most private and personal things that their customers have trusted them to keep safe and secure.
There is a clear line: do not spy/scan/read/compare in any shape or form. It's only yes or no.
Next up is scanning documents and conversations.
The only thing Apple had was trust. Not any more, at least for me.
 
and I have nothing to say to that except “maybe it will happen, maybe it won’t”.
Before this on-device scanning was announced, I would have said it would never happen; now I know it will. Apple saying they will never do more reeks of damage control and says nothing about their actual plans.

You will likely never notice the hit to battery life (my guess is that this process will be optimised for iPhone processors). And well, if the thought of this continues to disturb you, perhaps start to look at other brands of phones.
But we don't know that yet (regarding battery). As for looking at another phone, I already have a second, Android, phone; maybe I should make it my primary. Apple isn't nearly as far ahead in usability as before, and in fact the only reason I can see to keep my iPhone is my Apple Watch. Disturbs me is a good way to put it: Apple talks about privacy, but this is the biggest breach so far, and on-device scanning is unprecedented by anyone. I know this kind of scanning won't catch me at anything, but the very idea is objectionable in the extreme -- it's so 1984 it hurts.

My Intel Mac mini is staying on an older version of the OS, and my M1 MBA is going back to 11 and staying there too (if I keep it at all; it has never worked out for the way I do things anyway). The new 14" MBP with an M1X is probably not in the cards for now, and the new iPhone 13 I was going to buy soon... we'll see. I'm hoping Apple reverses their policy; then all will be well again.
 
Whenever you hear about this sort of crime on the news, it seems to involve heavily encrypted hard drives and the dark web, not iCloud! I can't see this catching anyone, especially now with all the press this is getting. So if Apple doesn't find anything, will they remove this? No. This will creep in one direction only.

If this makes everyone collecting these pictures stay away from Apple's products and services, I do think Apple will be pretty happy.
 
This is what worries me the most:

Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

In addition, any time an account is flagged by the system, Apple conducts human review before making a report to NCMEC. As a result, system errors or attacks will not result in innocent people being reported to NCMEC.

This means that if a photo is wrongly flagged by the system, it will be verified by a human. This is already a huge privacy issue. Notice that they say "not reporting", but a human will be viewing your legitimate private photo.
 
 
This is what worries me the most:

This means that if a photo is wrongly flagged by the system, it will be verified by a human. This is already a huge privacy issue. Notice that they say "not reporting", but a human will be viewing your legitimate private photo.

Only if you have multiple wrongly flagged photos.
Basically impossible.
One isn’t enough.
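To put rough numbers on "basically impossible": if each photo had some small, independent false-match probability, the chance of an innocent account reaching the threshold is a binomial tail. Here is a toy calculation; the one-in-a-million per-image rate, the threshold of 30, and the library size are all illustrative assumptions, not Apple's published internals.

```python
# Toy model: probability that an innocent account crosses the match
# threshold, treating each photo as an independent Bernoulli trial.
# All three parameters are illustrative assumptions, not Apple's figures.

p = 1e-6       # assumed per-image false-match rate
t = 30         # assumed match threshold before any human review
n = 20_000     # photos uploaded by a hypothetical account

# Walk the binomial pmf iteratively to avoid enormous factorials:
# P(k+1) = P(k) * (n - k) / (k + 1) * p / (1 - p)
term = (1.0 - p) ** n          # P(exactly 0 false matches)
tail = term if t == 0 else 0.0
for k in range(n):
    term *= (n - k) / (k + 1) * p / (1 - p)
    if k + 1 >= t:
        tail += term

print(f"P(>= {t} false matches in {n} photos) ~ {tail:.3e}")
```

With these made-up numbers the tail probability comes out around 10^-84, which is the flavour of reasoning behind Apple's "one in a trillion per year" claim, even though Apple's real parameters are not public.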
 
The scanning part is the invasion.

But the Photos app already does that. Very resource-intensive analysis is performed on every photo in there, and I believe in the latest versions the information from this scanning is sent to iCloud.
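For anyone curious what photo "analysis" can look like in code, here is a deliberately crude perceptual hash. It is a toy average-hash, not Apple's NeuralHash and not the Photos app's ML pipeline; it only illustrates the general idea of comparing images by compact fingerprint rather than byte-for-byte (assumes Pillow is installed).

```python
# Toy perceptual hash: shrink to 8x8 grayscale, then set one bit per
# pixel depending on whether it is brighter than the mean.
# NOT Apple's NeuralHash -- just an illustration of the concept.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A photo and a re-compressed copy of it should land within a few bits:
# hamming(average_hash("a.jpg"), average_hash("b.jpg")) <= 5
```

Real perceptual hashes are far more robust to crops, edits, and re-encodes, but the matching step (a small distance between fingerprints) is conceptually the same.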
 
Maybe I am splitting hairs, but I still see no verbiage stating that the matching/scanning does not happen if iCloud Photos is disabled. Nothing happens with a match until the photo is uploaded to iCloud Photos, but nothing clearly states the match does not happen if it is disabled.

"Before an image is stored in iCloud Photos"

You are right. It depends on what Apple means by 'before'.
 
This is what worries me the most:

This means that if a photo is wrongly flagged by the system, it will be verified by a human. This is already a huge privacy issue. Notice that they say "not reporting", but a human will be viewing your legitimate private photo.

How else do you expect Apple to verify that your misflagged image isn’t child pornography?
 
Even after reading everything I could find on this and reading through the few threads here, I am still left with a couple of nagging questions:

1. Why has Apple chosen to scan this way?
2. Why now? What is the overriding reason for this now?

For #1, I have not found anything that remotely explains this. Why not do it the same way everyone else does: server-side?
For #2, I wonder if the push to scan is a result of regulation the EU is driving.
 
Even after reading everything I could find on this and reading through the few threads here, I am still left with a couple of nagging questions:

1. Why has Apple chosen to scan this way?
2. Why now? What is the overriding reason for this now?

For #1, I have not found anything that remotely explains this. Why not do it the same way everyone else does: server-side?
For #2, I wonder if the push to scan is a result of regulation the EU is driving.
1) It prevents Apple from having any knowledge of your photos until they are confident that a photo matches something in the database.
2) Their reasoning is that they had wanted to do it earlier, but hadn't discovered a way of doing it "privately" until now.
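On point 1, the building block Apple describes is a threshold scheme: each matching photo carries a "safety voucher" holding one share of a key, and the server can decrypt nothing until enough shares accumulate. Here is a textbook Shamir secret sharing sketch of that idea; it is a conceptual illustration only, not Apple's actual protocol, and the threshold of 30 and the key value are invented for the demo.

```python
# Textbook Shamir secret sharing over a prime field: any `threshold`
# shares reconstruct the secret, fewer reveal nothing about it.
# Conceptual illustration only -- not Apple's actual cryptography.
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a toy demo

def make_shares(secret: int, threshold: int, count: int):
    # Random polynomial of degree threshold-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the secret.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Each "matching" photo contributes one share of the account's review key.
key = 123456789
shares = make_shares(key, threshold=30, count=100)
assert reconstruct(shares[:30]) == key   # 30 matches: key recoverable
assert reconstruct(shares[:29]) != key   # 29 matches: still opaque
```

Below the threshold the shares are information-theoretically useless, which is the sense in which Apple can claim to learn nothing about accounts that never cross it.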
 
Even after reading everything I could find on this and reading through the few threads here, I am still left with a couple of nagging questions:

1. Why has Apple chosen to scan this way?
2. Why now? What is the overriding reason for this now?

For #1, I have not found anything that remotely explains this. Why not do it the same way everyone else does: server-side?
For #2, I wonder if the push to scan is a result of regulation the EU is driving.

For #1, it's so they can implement this CSAM "protection" while also claiming they still can't access your photos. If they scan them on iCloud, they have to decrypt them, which opens the door (pun intended...) to being able to access/view all of your photos. Other companies simply tell you, "Hey, you're uploading to our servers, so we can do this." Apple, to their credit (or grandstanding?), doesn't like that solution. The way they're implementing this allows them to basically say, "Hey, this is going to iCloud anyway, and this keeps us from seeing all of your photos, and only puts the questionable ones in front of us."

The irony is that in order to claim they can do this without accessing your photos, they've essentially created a way for others (and themselves, if need be) to access and eventually see things that were formerly protected by E2E. In the process of saying, "Hey, we can do this without seeing your encrypted stuff," they've decreased one of the primary benefits of E2E.

#2 - From everything I read, Apple truly was doing the least out of all companies in this regard. So either they felt like they created a solution that was acceptable from an encryption standpoint, or they were getting pressured to come up with this "solution" (see #1 again...). Or maybe a little of both.

Honestly I don't question the accomplishment they've made here, and there is precedent for me to trust that they're doing it with the proper motivation. I simply think it's penny wise and pound foolish, and don't like them having to use my device to do it.
 