I think that there has been a struggle between the new "woke" hires and the Jobsian old guard, and the wokesters won. Too often, the younger generation views the purpose of corporations as social activism on issues just like this, rather than making insanely great products for customers. I am shocked that more people at Apple don't see this, but apparently they don't, or are just outnumbered and outflanked.

As with all major corporate decisions, this was probably done not for one reason but at the intersection of several, and I'm fairly certain this was one of them, which should make the implications even more concerning.
 
I think that there has been a struggle between the new "woke" hires and the Jobsian old guard, and the wokesters won. Too often, the younger generation views the purpose of corporations as social activism on issues just like this, rather than making insanely great products for customers. I am shocked that more people at Apple don't see this, but apparently they don't, or are just outnumbered and outflanked.
I don't think 'woke' has anything to do with Apple's move. Being authoritarian does. Shame on Apple.
 
Sure, and your privacy is still being exposed, so we're OK now then?
Obviously you didn't read the part where I stated I'm NOT on either side. I was in fact defending a member for stating how they feel about the subject. Go somewhere else with that disrespectful tone. 🙄 👎
 
But isn’t this optional? I must be missing something, but if it’s disabled by default I don’t see the problem.

Building these capabilities into the devices is the issue.

As Ben Thompson put it so well...once the capabilities are built in, all that's stopping a change here is "policy" from Apple (or any local jurisdictions).

Nothing goes "awful" in one fell swoop -- it builds step by step with moves like this
 
No.



1. You'd need to find or generate images that create a collision. You have a better chance of finding a UUID collision, and finding just ONE of those would take roughly 80 years on today's computers.
2. This is why Apple set an extremely high threshold on the number of matches required before an Apple employee can even decrypt the images. Meaning you'd potentially need to find hundreds of collisions.
3. Apple stated that the chance of a mistake is one in a trillion.
4. Assuming you're unlucky enough to be that one-in-a-trillion mistake, Apple will manually review the images in question and correct the mistake.



Apple's iCloud data centers in China are already reviewed by the government. That's why you get a warning if you fly to China with your iPhone and switch your country setting to China: Apple will tell you you're on Chinese servers, which are treated differently.



Chances of winning the Powerball: 1 in 292 million
Chances of being erroneously flagged by Apple: 1 in a trillion

You have a MUCH higher chance of winning the Powerball. And even after being flagged, Apple will review and reactivate your account if it was in error.
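For scale, the two quoted figures can be compared directly. This is a toy back-of-the-envelope calculation that takes both numbers at face value (Apple's one-in-a-trillion claim is its own, unverified figure):

```python
# Toy comparison of the two quoted probabilities, both taken at face value.
p_powerball = 1 / 292_000_000       # quoted Powerball jackpot odds
p_flag = 1 / 1_000_000_000_000      # Apple's claimed false-flag rate per account

ratio = p_powerball / p_flag
print(f"A jackpot win is ~{ratio:,.0f}x more likely than a false flag")
# → A jackpot win is ~3,425x more likely than a false flag
```

By that arithmetic a jackpot win would be roughly 3,400 times more likely than a false flag, assuming Apple's figure is accurate in the first place.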

You're missing the point - NOBODY WANTS IT! We get it, we know what they want to do. NOBODY WANTS IT! Lol
 
The other applies to images stored in iCloud Photos and happens on Apple’s servers.
That is wrong, it's happening on device. Don't get your information from a YouTube guy making money with Apple. It's just not a good idea.
Yeah because Apple hasn't hired anyone with a PhD, right? *cough* John Giannandrea *cough*
Not sure what to address first here.

People at Apple are told what to do, it's not their choice. Plenty of people working at Apple disagree with it. Doesn't matter as it's not their choice. Anyone at Apple who'd come forward about this is the equivalent of Snowden.

Second, about Giannandrea: the highest degree he earned is a Bachelor of Science. He received a Doctorate Honoris Causa, meaning he received it without going through the actual process of education. He never wrote a dissertation; there was no examination. It's for esteem, nothing else. To give you an idea of how much that means: Meryl Streep (the actress) has four of these honorary degrees, including from Yale and Stanford. Oprah Winfrey has one from Harvard, Jack Nicholson from Brown, J.K. Rowling (Harry Potter) has seven, and Orlando Bloom, Ben Affleck and Kanye West have them too. Should we listen to them? No. Talking about facepalm and then this... 😂
 
Of course they are still pushing forward with this. Apple is no different than Amazon or any large company in this sense. They're not going to listen to anyone but themselves because they think they know better than everyone else. Backlash? "Pfft. It's not us who is wrong! The public just needs it dumbed down and explained better!"
 
No.



1. You'd need to find or generate images that create a collision. You have a better chance of finding a UUID collision, and finding just ONE of those would take roughly 80 years on today's computers.
2. This is why Apple set an extremely high threshold on the number of matches required before an Apple employee can even decrypt the images. Meaning you'd potentially need to find hundreds of collisions.
3. Apple stated that the chance of a mistake is one in a trillion.
4. Assuming you're unlucky enough to be that one-in-a-trillion mistake, Apple will manually review the images in question and correct the mistake.



Apple's iCloud data centers in China are already reviewed by the government. That's why you get a warning if you fly to China with your iPhone and switch your country setting to China: Apple will tell you you're on Chinese servers, which are treated differently.



Chances of winning the Powerball: 1 in 292 million
Chances of being erroneously flagged by Apple: 1 in a trillion

You have a MUCH higher chance of winning the Powerball. And even after being flagged, Apple will review and reactivate your account if it was in error.
You misunderstand what those chances mean. To win something in a lottery you need at least two or three correct numbers, and the odds of getting those are not one in trillions but only one in hundreds.

In Apple's case, one picture is enough to trigger an alert. Even though they claim they won't report at that point, that's hard to believe: whether it's one or a hundred child-porn pictures doesn't matter, both are crimes that have to be reported. They would face serious legal problems if they knowingly allowed their users to keep even one picture of this kind, so there will be zero tolerance.
 
Building these capabilities into the devices is the issue.

As Ben Thompson put it so well...once the capabilities are built in, all that's stopping a change here is "policy" from Apple (or any local jurisdictions).

Nothing goes "awful" in one fell swoop -- it builds step by step with moves like this
Makes sense. Weird that Apple’s willing to drop the privacy flag just like that, though.
 
1. You'd need to find or generate images that create a collision. You have a better chance of finding a UUID collision, and finding just ONE of those would take roughly 80 years on today's computers.
2. This is why Apple set an extremely high threshold on the number of matches required before an Apple employee can even decrypt the images. Meaning you'd potentially need to find hundreds of collisions.
3. Apple stated that the chance of a mistake is one in a trillion.
4. Assuming you're unlucky enough to be that one-in-a-trillion mistake, Apple will manually review the images in question and correct the mistake.
https://www.hackerfactor.com/blog/index.php?/archives/929-One-Bad-Apple.html
In addition, review the late-2020 and 2021 peer-reviewed papers on hash-based adversarial attacks and examples. A simple Google Scholar search will yield everything you need.
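To illustrate the general principle those papers exploit: collision difficulty depends entirely on the effective size of the hash space. This toy sketch (plain truncated SHA-256, nothing to do with Apple's unpublished NeuralHash) finds a collision in a deliberately tiny 16-bit space almost instantly via a birthday search:

```python
import hashlib
import itertools

def tiny_hash(data: bytes, bits: int = 16) -> int:
    """Truncate SHA-256 to `bits` bits -- a stand-in for a small hash space."""
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest, "big") >> (256 - bits)

# Birthday search: with a 16-bit hash, a collision typically appears
# after only a few hundred distinct inputs (~2**8).
seen = {}
for i in itertools.count():
    h = tiny_hash(i.to_bytes(8, "big"))
    if h in seen:
        print(f"collision: inputs {seen[h]} and {i} share hash {h:#06x}")
        break
    seen[h] = i
```

A full 256-bit hash makes the same search astronomically expensive; the adversarial-attack literature matters because perceptual hashes behave more like the small space than the large one.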
 
No.



1. You'd need to find or generate images that create a collision. You have a better chance of finding a UUID collision, and finding just ONE of those would take roughly 80 years on today's computers.
2. This is why Apple set an extremely high threshold on the number of matches required before an Apple employee can even decrypt the images. Meaning you'd potentially need to find hundreds of collisions.
3. Apple stated that the chance of a mistake is one in a trillion.
4. Assuming you're unlucky enough to be that one-in-a-trillion mistake, Apple will manually review the images in question and correct the mistake.



Apple's iCloud data centers in China are already reviewed by the government. That's why you get a warning if you fly to China with your iPhone and switch your country setting to China: Apple will tell you you're on Chinese servers, which are treated differently.



Chances of winning the Powerball: 1 in 292 million
Chances of being erroneously flagged by Apple: 1 in a trillion

You have a MUCH higher chance of winning the Powerball. And even after being flagged, Apple will review and reactivate your account if it was in error.

Other perspectives:
- Odds of you dying from a car crash: 1 in 107
- Odds of you being struck by lightning: 1 in 1.2 million
- Odds of you dying from a shark attack: 1 in 3.7 million
- Odds of you dying from a plane ride: 1 in 29.4 million


Think about it. 1 in a trillion. What other event in the world happens with 1-in-a-trillion odds?
The answer is simple: I do not want my very private things reviewed, not even at odds of one in a thousand gazillion gazillions. Until the universe explodes: NO.
 
So how soon until Apple starts scanning all our content for "illegal material?" When does iMessage begin scanning every incoming and outgoing text for keywords and phrases?

People will say you're being hyperbolic. I don't think so at all.

Don't think for a second that movie studios (just picking something random) don't want to enforce copyright down to you even freakin' talking about their content -- let alone sharing clips of it in messages, etc.

Hey, maybe if I start texting a friend about Top Gun 2 in theaters, they can scan it and inject ads and showtimes from AMC, and then Apple can take an advertising cut from them, just because I "talked about it" with a friend in an iMessage.

That sounds "fun"
/s
 
Apple really likes kids for some reason. They are now a preschool company. Why don’t you make me a high chair with Bluetooth, Apple?? 🍎🍎🍎🍎
 
So how soon until Apple starts scanning all our content for "illegal material?" When does iMessage begin scanning every incoming and outgoing text for keywords and phrases?
Just switch platforms (phone, computer, tablet, watch, etc) and your privacy and security will be top notch. Check out Google and Microsoft for your next products.
 
That is true only if you naively believe Apple's estimate of a one-in-a-trillion false-positive rate for reported accounts. I don't.

We're all guilty unless proven innocent in Apple's eyes. The hubris of the company is showing.

Can't really help you there. Cherry picking what you want to believe and what you don't want to believe makes no sense to me.
 
Apple really likes kids for some reason. They are now a preschool company. Why don’t you make me a high chair with Bluetooth, Apple?? 🍎🍎🍎🍎
It goes back to their whole deal with MECC and schools with their Apple II systems. All elementary schools in the '80s had nothing but wall-to-wall Apple //e systems.
 
No.



1. You'd need to find or generate images that create a collision. You have a better chance of finding a UUID collision, and finding just ONE of those would take roughly 80 years on today's computers.
2. This is why Apple set an extremely high threshold on the number of matches required before an Apple employee can even decrypt the images. Meaning you'd potentially need to find hundreds of collisions.
3. Apple stated that the chance of a mistake is one in a trillion.
4. Assuming you're unlucky enough to be that one-in-a-trillion mistake, Apple will manually review the images in question and correct the mistake.



Apple's iCloud data centers in China are already reviewed by the government. That's why you get a warning if you fly to China with your iPhone and switch your country setting to China: Apple will tell you you're on Chinese servers, which are treated differently.



Chances of winning the Powerball: 1 in 292 million
Chances of being erroneously flagged by Apple: 1 in a trillion

You have a MUCH higher chance of winning the Powerball. And even after being flagged, Apple will review and reactivate your account if it was in error.
OK - so you seem to understand this process, so it seems worth asking you. It sounds like Apple is not using an exact match; otherwise it would lead to an arms race between pedophiles and Apple, since Apple would have to add a new hash whenever so much as a pixel was edited in a CSAM image. It doesn't sound like Apple is doing that. It sounds like Apple is doing an approximate match, which must come down to perceptual similarity, assuming the hashing system is some sort of dimensionality reduction/compression that represents, explicitly or implicitly, features of the image. That means false positives are likely to involve a lot of exposed skin (I presume that is something often seen in child porn). So... Apple's harebrained algorithm has a hissy fit over some image in your library, and then some human being at Apple decrypts an image of your partner wearing lingerie. Is that about right?
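For what it's worth, classic perceptual hashing does work by approximate matching. This "average hash" sketch is purely illustrative (NeuralHash's actual design is Apple's own and unpublished): it fingerprints an image and matches within a Hamming-distance threshold, so small edits survive.

```python
# Sketch of perceptual (approximate) hash matching -- NOT Apple's NeuralHash,
# just the classic "average hash" idea on a tiny 8x8 grayscale grid.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values -> 64-bit fingerprint.
    Each bit records whether that pixel is above the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches(h1, h2, threshold=5):
    """Approximate match: fingerprints within `threshold` differing bits."""
    return hamming(h1, h2) <= threshold

# A tiny edit barely moves the fingerprint, so the image still matches:
img = [[(r * 8 + c) % 256 for c in range(8)] for r in range(8)]
edited = [row[:] for row in img]
edited[0][0] += 3   # nudge one pixel's brightness
print(matches(average_hash(img), average_hash(edited)))  # → True
```

The flip side of that robustness is exactly the worry above: two *different* images with similar brightness structure can also land within the threshold, which is why human review sits behind it.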
 
1. You'd need to find or generate images that create a collision. You have a better chance of finding a UUID collision, and finding just ONE of those would take roughly 80 years on today's computers.

You do not know this; you're just assuming. Since nobody knows the algorithm, you cannot verify these assumptions. And Apple could always make mistakes when implementing it.

Assuming you're unlucky enough to be that one-in-a-trillion mistake, Apple will manually review the images in question and correct the mistake.

Or they could make a mistake when reporting a user and confuse that user's name with yours.

And do not forget that the reward for a potential attacker with the resources of a government is very high, so a lot of resources could be invested, e.g. when trying to influence an election or to get rid of a politician.
 