First problem with this is "someone on Twitter." Haha. Never trust anything you read on that cesspool. You have to accept AirDrops, and you ALSO have to have it turned on for Everybody for that to even be possible. Switch it to Contacts Only, or only turn it on when needed.
I mentioned something similar on a previous thread except the threat I pointed out is what happens if a hacker were able to hijack your Apple ID. Remember a few years ago when there were the celebrity iCloud breaches? Never mind that individuals and account security don't really go hand in hand well (passwords and such).

Previously the worst case scenario was that you lost access to your account and purchases. Now you'll have individuals hijacking accounts and holding people for ransom with the threat of uploading kiddie porn via a VPN in their location in order to frame them which would set off the triggers and as you said... reported to the authorities, no recourse and you'd end up in jail with your life ruined. Doesn't even have to be a hacker, could be a revenge actor doing it, etc.

Apple can preach about the tech being sound till the cows come home, but the mechanism behind it stinks of bad actors, potential criminality, and interference.
Yes. The account takeover and AirDrop threats are both extremely big attack vectors. This is going to be the new ransomware, except when done for targeted purposes, where the first warning you get will be your door coming off its hinges from the SWAT team breaching it.

If you believe Apple's one-in-a-trillion nonsense on this, you just need to keep reading and paying attention.
 
Yes. The account takeover and AirDrop threats are both extremely big attack vectors. This is going to be the new ransomware, except when done for targeted purposes, where the first warning you get will be your door coming off its hinges from the SWAT team breaching it.
Did you read what I wrote?

Airdrop has safeguards in place. Airdrop isn't a threat unless you let it be. FIRST, you must accept the airdrop. SECOND, unless you have airdrop turned on for EVERYBODY, it's not possible for somebody outside your contacts to send you an airdrop. You can even set it to receiving off.
 
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without the user’s authorization in the name of child welfare??

While I am sure people may agree with this, it seems like one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just a small moral slip away from being an invasion of privacy on a monumental scale. Given what Snowden revealed, the US government has a huge thirst for data collection like this. It’s a short hop to scanning for compromising photos of your political rivals, yes?
No, your phone will be scanned. It’s not scanning iCloud.
 
Dear MR Mods:

Please consider moving this political topic to the politics section. It is VERY difficult to discuss security, crime, and criminology without getting ding'd, warned, penalized, or otherwise punished for political discussion. That's because these topics are, at their very root, political in nature.

Thank you for your consideration of this request.

Your Flight Plan


It sounds like he was able to get into the system.
Yep!
Will CSAM report my Nirvana Nevermind Album Cover in iTunes?
No, but it WILL grab all of your Disney movies and build a profile identifying you either as an 8 year old girl or somebody with a creepy attraction to kid flicks.
Matthew Green, who teaches cryptography at Johns Hopkins University

BAM! 👊 POW! 💥
But people in here will still side with Apple, unbelievable. 🤦‍♂️
You must be reading a different thread than I am.
Too much risk is involved. No idea why Apple is even allowing this in the first place.

The fact that anyone can reverse engineer it, being able to find the code inside the platform... scary stuff!
I agree.

As this whole scanning thing continues to get more publicity, it’s going to be difficult for customers to not have heard about it by the iPhone 13 release, and all the security researchers calling out potentially dangerous flaws are surely going to make the masses doubt it. This is beginning to look really bad for Apple.
I fear that nothing will really change until a couple of Senators or Supreme Court justices get busted without due process. Kind of like what happened with the no-fly lists.
I have to ask, how does it feel to be on the other side of the fence now, with a social justice program that you don’t support, being told it’s for the greater good and to just shut up?

Not so great huh?
To whom are you directing this question? "Other side of the fence?" I and many others here have ALWAYS been on the other side of the fence. You seem to think we're all waffling, but I think we're very consistent here.
I would love to see this entire “feature” scrapped but I won’t hold my breath.
Shouldn't you fight it? I mean really, if it's all that bad and a rotten egg too, why wouldn't you fight it?
Again, no one is going to be reported without a human confirming the results. If you don't post CSAM, you won't be reported to NCMEC. Apple isn't reporting anyone to the police or FBI. That isn't their job.
People misreport on other people all the time, and sometimes maliciously so, just to try to have them damaged or jailed in some way! What makes you think it won't happen in technology too? Or do you need to be unjustly jailed as a victim of false reporting without due process before you will learn this for yourself?
And that's why Apple says there's a one in a trillion chance an account will be accidentally flagged and not that it's impossible to be accidentally flagged.
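For scale, here's a back-of-the-envelope sketch of why a 30-match threshold drives the accidental-flag probability so low. All numbers here are made up for illustration; Apple has not published a per-image false-positive rate, so `p` and `n` below are pure assumptions:

```python
import math

# Back-of-the-envelope only: treat each photo as an independent false positive
# with made-up probability p; the number of matches in a library of n photos is
# then roughly Poisson with mean n * p, and the chance of ever reaching a
# 30-voucher threshold by accident is the upper tail P(X >= 30).
def poisson_tail(n, p, k=30, terms=100):
    lam = n * p
    # Sum terms in log space to avoid overflow in factorials.
    return sum(math.exp(-lam + i * math.log(lam) - math.lgamma(i + 1))
               for i in range(k, k + terms))

# 100,000 photos at an assumed one-in-a-million per-image rate: ~0.1 expected
# false matches total, so 30 simultaneous accidental matches is vanishingly rare.
prob = poisson_tail(100_000, 1e-6)
assert 0 < prob < 1e-12
```

Whether the real per-image rate is anywhere near that assumption is exactly what's in dispute in this thread.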


No.
The hashes to scan for are built into iOS.
If a photo is fingerprinted and does not match a known hash then it's ignored.
ONLY if a hash fingerprint match is made is a voucher with the hash and matched photo created, and that voucher doesn't leave your iPhone, and no one can see the contents of the voucher (or, I think, even that it exists) until you get to 30 vouchers; then the entire lot is sent to Apple for validation by automated systems and, if that system concurs, passed to an Apple validation team. ONLY if that team sees the images actually are CSAM is the content reported to any form of government.
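For anyone who finds that flow hard to follow, here's a toy sketch of the threshold logic. All names and values are made up; the real system uses NeuralHash and threshold secret sharing, not plain string sets and a visible counter:

```python
# Toy sketch of the threshold flow described above (illustration only).
KNOWN_HASHES = {"a1b2", "c3d4"}   # stand-in for the on-device hash database
THRESHOLD = 30                    # matches needed before anything is readable

def scan_library(photo_hashes):
    """Return the matched batch only once the threshold is crossed."""
    vouchers = [h for h in photo_hashes if h in KNOWN_HASHES]  # non-matches ignored
    if len(vouchers) < THRESHOLD:
        return None        # below threshold: vouchers stay sealed on-device
    return vouchers        # at/over threshold: batch goes to human review

assert scan_library(["ffff"] * 1000) is None    # no matches -> nothing happens
assert scan_library(["a1b2"] * 29) is None      # 29 matches -> still sealed
assert scan_library(["a1b2"] * 30) is not None  # 30 matches -> reviewable
```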
You say that like you think it won't be misused. Am I right?
How soon before they start deleting spicy political memes?
It has already begun on just about every platform out there. Did you just wake up from a long nap?
I will always criticize Apple for not catching more zero-days. But I just don't understand why this algorithm running on your devices is seen as more susceptible to attack than any other. I think it is likely that it is actually less likely to be hacked given the amount of security and privacy scrutiny it is receiving.
"I think it is likely that it is actually less likely"

Really? Did you REALLY mean to type that?

Your thinking and your likely/not-likely logic are just not good enough for me.

The ironic thing is that if you are ever framed for a crime, you'll want me on your jury. But you might not even GET a jury, because you'll see no way out through all of the fake but convincing evidence against you, and you'll take your lazy lawyer's advice and plead guilty to a crime you didn't commit.
I mentioned something similar on a previous thread except the threat I pointed out is what happens if a hacker were able to hijack your Apple ID. Remember a few years ago when there were the celebrity iCloud breaches? Never mind that individuals and account security don't really go hand in hand well (passwords and such).

Previously the worst case scenario was that you lost access to your account and purchases. Now you'll have individuals hijacking accounts and holding people for ransom with the threat of uploading kiddie porn via a VPN in their location in order to frame them which would set off the triggers and as you said... reported to the authorities, no recourse and you'd end up in jail with your life ruined. Doesn't even have to be a hacker, could be a revenge actor doing it, etc.

Apple can preach about the tech being sound till the cows come home, but the mechanism behind it stinks of bad actors, potential criminality, and interference.
Yep! Right now they call you as fake IRS agents and tell you they're going to sic the cops on you for not paying your taxes. Can you imagine how rich they'll get if they can fake up some evidence like a hash match and THEN demand money from you?
The remark above seems to be coming from someone who has no idea what they’re talking about.
You seem to be very trusting. That's adorbs!
 
And that's why Apple says there's a one in a trillion chance an account will be accidentally flagged and not that it's impossible to be accidentally flagged.


No.
The hashes to scan for are built into iOS.
If a photo is fingerprinted and does not match a known hash then it's ignored.
ONLY if a hash fingerprint match is made is a voucher with the hash and matched photo created, and that voucher doesn't leave your iPhone, and no one can see the contents of the voucher (or, I think, even that it exists) until you get to 30 vouchers; then the entire lot is sent to Apple for validation by automated systems and, if that system concurs, passed to an Apple validation team. ONLY if that team sees the images actually are CSAM is the content reported to any form of government.
I have no problem with Apple's vision of how this is going to work, but you are making the assumption that this is going to be used the way Apple intends it to be used. The world doesn't work that way (see the Pegasus story for how Apple's vision and their security stack up). Hackers and government agencies don't care about Apple's intentions. If the algorithm used for the hash is figured out, like this individual already did before it's even released (not the final version, but it's only a matter of time), then a person can create a fake image which produces the same hash.

So someone emails you a few times with their logo being an image of harmless trees, and all of a sudden you have the police knocking on your door with a search warrant to seize all your computers and devices. Heck, even if the system is working perfectly, what's stopping someone in a foreign country from sending you 30 images of child porn? Now it's up to you to defend yourself and argue that someone is planting stuff on you. Sure... tell it to the judge.

This doesn't even go into all the possibilities of what NSO (Pegasus) is going to do with this technology. My guess is they'll change who gets notified (insert government agency here) and add their own images with the new Apple algorithm hash creator.

This system is a disaster waiting to happen. I get it. It's great to be a champion at fighting child porn. This isn't it.
 
I mentioned something similar on a previous thread except the threat I pointed out is what happens if a hacker were able to hijack your Apple ID. Remember a few years ago when there were the celebrity iCloud breaches? Never mind that individuals and account security don't really go hand in hand well (passwords and such).

Previously the worst case scenario was that you lost access to your account and purchases. Now you'll have individuals hijacking accounts and holding people for ransom with the threat of uploading kiddie porn via a VPN in their location in order to frame them which would set off the triggers and as you said... reported to the authorities, no recourse and you'd end up in jail with your life ruined. Doesn't even have to be a hacker, could be a revenge actor doing it, etc.

Apple can preach about the tech being sound till the cows come home, but the mechanism behind it stinks of bad actors, potential criminality, and interference.
Is that happening now? They could do all of that now, but instead of automatic tagging, just "narc" the iCloud account after uploading images. If Apple is told CSAM is present, they are obligated to investigate.

In fact, it is easier now. Once this system is in place, they would need the hashes from the database and the image that produced the hash, and those two things would need to be linked.

The danger of the proposed system isn't bad actors, but governments legislating that more images be included in the dataset, that all images be scanned, and that the hashes be exfiltrated even if the photos are not shared to iCloud.
 
We still have to remove our shoes in airport security because of some looney tunes guy with a fuse sticking out of his shoes like 20 years ago (after Bush's 9/11 war scam).

This is a noble cause, but it's only going to go downhill from here.

Welcome to the surveillance globe. Now show your social credit credentials, or we'll just pull them up ourselves. Your choice, global citizen.

We tried to fight this, but here we are.
 
Did you read what I wrote?

Airdrop has safeguards in place. Airdrop isn't a threat unless you let it be. FIRST, you must accept the airdrop. SECOND, unless you have airdrop turned on for EVERYBODY, it's not possible for somebody outside your contacts to send you an airdrop. You can even set it to receiving off.
Strange, I was replying to the second quote and not sure how that quote from your post got in there. Sorry about that.

That said, I stand by AirDrop being a huge vector for attacks. Stories like this are out there all the time:
 
I'm sure a lot of people here do understand what is going on but you can't deny that there are a lot of people also here who don't understand. In this small thread alone we have people asking if album covers will get them in trouble, denying that they authorized Apple's scanning, and those who don't believe that the scanning is disabled if you don't use iCloud.

Yes on the above. Many people. On EVERY SINGLE THREAD having to do with this subject for the last two weeks.

Even after a handful of people here described in great and accurate detail how the system will work, many, many times. People are just predisposed to ignoring the facts and believing the end is near. Astonishing that so many here choose to cling to ignorance.
 
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without the user’s authorization in the name of child welfare??

While I am sure people may agree with this, it seems like one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just a small moral slip away from being an invasion of privacy on a monumental scale. Given what Snowden revealed, the US government has a huge thirst for data collection like this. It’s a short hop to scanning for compromising photos of your political rivals, yes?
They’ll add something to the EULA people will have to agree to in order to upgrade to iOS 15, assuming they haven’t already done so.

Accepting the EULA will be granting Apple permission to do these types of scans.
 
Yes on the above. Many people. On EVERY SINGLE THREAD having to do with this subject for the last two weeks.

Even after a handful of people here described in great and accurate detail how the system will work, many, many times. People are just predisposed to ignoring the facts and believing the end is near. Astonishing that so many here choose to cling to ignorance.
I agree. I don't care if there are so many security experts and users against this. Their persistent ignorance means that they are no experts after all. Dissent must be suppressed by the regime with re-education camps and texts.
 
They promise they won't, but the backdoor is there now.
This.

Makes the whole privacy ad campaign released a few months ago BS.

ETA: on the photo scanning for images, they've already had a feature for some time now that groups photos based on subject (so this could have been something they'd been working towards for a long time), and they even stated recently that the tech behind the face recognition used to group pics had improved a lot, but there are still pretty obvious mistakes within the albums they curated for me.

Keep giving them inches and over time it’s gonna be the whole damn mile they take.
 
They would say this. They have been caught with their pants down here and they are now trying to cover their tracks. If you are going to make code public to be audited, it needs to be the full code end to end.

At this point, I wouldn't be surprised if they do push ahead with this reputation-damaging policy. But I am really hoping they u-turn on this. Although, even if they did u-turn, the trust is damaged and I am not sure I would so easily trust them again. Like when a girlfriend cheats on you. Silly mistake with lasting damage.

I already have a Pixel 5 running CalyxOS ready to go. It will be hard to de-Apple myself but privacy is too important to me.
 
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without the user’s authorization in the name of child welfare??

While I am sure people may agree with this, it seems like one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just a small moral slip away from being an invasion of privacy on a monumental scale. Given what Snowden revealed, the US government has a huge thirst for data collection like this. It’s a short hop to scanning for compromising photos of your political rivals, yes?
This is the core issue that keeps escaping Apple Inc and those who can't see it for what it is--an invasion of your personal property with the justification that it is for a noble cause. If you had a photo album sitting on your coffee table, would Apple have any right to it? Of course not. But because this is your photo album on Apple made devices and services, they (now clearly) feel they have both a duty and a right to scan your content for specific criminal behavior. This offends Apple customers because Apple has been on a massive campaign about personal data and privacy. This move undermined all that work in my mind.

No one has ever been against the cause of punishing the guilty, but we demand due process. It's yet another affront to use such a heinous crime as the linchpin for scanning customer content to detect criminal behavior and then transmitting that information to a third party. Apple keeps assuring us the process is solid, but they should be collectively smart enough to know what the real issue is. They seem to be dodging the point on purpose, which makes me all the more suspicious as to the why and to what end. They are counting on 99.9% of us having "nothing to hide." That just makes the pill easier to swallow.
 
Nicholas Weaver, told Motherboard
I wouldn't trust Nicholas Weaver as far as I can throw him.

This whole thing is his brainchild; he wrote up this idea, almost verbatim, back in 2019 for the Lawfare blog, a national security blog.

Since then, he's been zealously defending this move by Apple. Using his credentials to lend credibility to this whole mess, without actually engaging with his peers in the field on the issues and concerns that have been raised.

His "rebuttals" can typically be summed up as intellectually dishonest at best, and too lacking in substance to withstand a simple sneeze at worst.

Take his actual argument for example:

all people can do with manipulating non-CSAM hashes into CSAM is "annoy Apple's response team with garbage images until they implement a filter" to get rid of false positives. Actually fooling Apple's system would also require access to the hashes provided by NCMEC and it would require the production of over 30 colliding images, with the end result not fooling the human oversight.
This is plainly false, since what has been accomplished is a pre-image attack, meaning the researchers have managed to create a colliding image from the hash alone. The resulting image causes a collision but looks like random noise.

All an adverse actor needs to do is use the noise as a mask over legal pornographic material, and the user is screwed. Apple's human reviewer isn't going to do an entire CSI analysis to make sure the people depicted in the photo (or their body parts) were underage at the time the photo was taken.

They'll ask themselves one question: could this be CSAM? If the answer is yes, then the account gets blocked and a report is filed.

As for his statement that "it would require the production of over 30 colliding images," that's just intellectually dishonest. They don't have to be 30 unique images; it could be 30 copies of the same one. And even if it did require unique images, generating a colliding image is trivial in both effort and time, as has been demonstrated; applying that colliding image to legal porn is even less of a feat.
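To see why collisions in a perceptual hash are cheap in principle, here's a toy "average hash" example. This is NOT NeuralHash, and the 8-value "images" are made up; it just illustrates the general weakness of hashes designed to tolerate image changes:

```python
from statistics import mean

# Toy 8-pixel "average hash": the hash only records whether each pixel is
# brighter than the image's own mean, so completely different images can
# share a hash. Real perceptual hashes are subtler, but face the same issue.
def ahash(pixels):
    m = mean(pixels)
    return tuple(p > m for p in pixels)

target = [200, 10, 180, 20, 220, 30, 190, 15]   # stand-in for a flagged image
forged = [130, 90, 125, 95, 140, 85, 135, 80]   # different content, same pattern

assert forged != target                  # the images are not the same...
assert ahash(forged) == ahash(target)    # ...but their hashes collide
```

A second pre-image here just means matching the bright/dark pattern, which is exactly the kind of freedom that lets an attacker hide a collision inside an unrelated picture.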
 
I’ve had more time to simmer on this whole CSAM/Apple situation since the story broke 10 days ago and I just can’t bring myself to feel like this is ok. I usually give Apple the benefit of the doubt on these types of things but I just can’t get past the hashing being done on-device.
You should keep giving them the benefit, as you clearly don’t understand how iCloud works. Apple never has access to your images. That’s why the hashing is done on your device, since only your device can view the image.
 
I have to ask, how does it feel to be on the other side of the fence now, with a social justice program that you don’t support, being told it’s for the greater good and to just shut up?
Something about apples and oranges... But I won't waste my time arguing on the 'net. ;)
 
This makes it even more concerning because it must mean that they are not self-critical enough to see the flaws in their system or potential abuse. They refuse to allow even the possibility that they have blindspots, because anyone who disagrees is just confused and ignorant.
I’d argue this has been part of the culture at Apple for a very long time: treating users like dumb sheep, believing they know everything better than anyone else, not listening to customer complaints, etc. Antennagate is one of the products of such twisted mindsets. Granted, these mindsets have led Apple from infancy to being the most successful company on the planet, but customers aren’t as dumb as they’d assumed. So this happens.
 