Top Stories: Magic Keyboard With Touch ID, New MacBook Pro and Apple Watches Incoming, and More

By far, the TOP story of this week is Apple's decision to automatically scan photos on your iPhone for illegal content. That should be the story up top; by comparison, all of the rest of these stories feel really insignificant.
I'm not defending Apple on this BUT they are NOT scanning photos ON your phone, only those that are synced to their servers.

Do you have something to hide?
 
A) Too many people in this thread don't understand what hashing is or how it works. It is a one-way process. To use the example they gave in Godzilla: Singular Point - you cannot reconstruct a potato from hash browns. Any change to the file changes the hash. Now maybe this hashing system has various permutations, but you should still only be worried if you have a permutation of known child pornography.
Maybe this might help some understand, with a very general cryptocurrency example: Apple is searching a local copy of a public ledger for specific transaction IDs.
B) Does anyone think this isn't getting done by others already? They use hashes to ensure that files you upload arrive uncorrupted, for one. Why? Even a single byte change radically alters the hash. Again, I am not familiar with this particular hashing function - maybe they have used AI to include hashes for permutations, but that seems excessive and like it would increase the possibility of a collision with an "innocent" photo (in this case one not in the CSAM ledger).
C) Case in point: https://blog.cloudflare.com/the-csam-scanning-tool/

Sorry to go on a rant about it, but it feels like a lot of misinformed people "did their own research" and are making a supermassive mountain over a quantum molehill.
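A minimal sketch of the exact-match idea described above, using plain SHA-256 (not Apple's actual NeuralHash, which is a perceptual hash designed to survive resizing and re-encoding); the photo bytes and the "known" set here are made up for illustration:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    # One-way digest: you cannot reconstruct the input from it,
    # just as you cannot reconstruct a potato from hash browns.
    return hashlib.sha256(data).hexdigest()

# Hypothetical photo bytes, for illustration only.
original = b"...holiday photo bytes..."
tweaked  = b"...holiday photo bytes..!"   # a single byte changed

# Avalanche effect: one changed byte yields a completely different digest.
assert sha256_hex(original) != sha256_hex(tweaked)

# Matching is then just set membership against a list of known digests,
# like looking up specific transaction IDs in a local copy of a ledger.
known_bad = {sha256_hex(b"some known bad file")}
print(sha256_hex(original) in known_bad)   # prints False: no match
```

Nothing about the original photo leaks from a non-match; the only thing compared is the digest.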
 
I'm not defending Apple on this BUT they are NOT scanning photos ON your phone, only those that are synced to their servers.

Do you have something to hide?
People who are into child pornography or other disturbing material end up getting caught eventually. That is law enforcement's job, not Apple's. Apple is a tech company, not law enforcement.

I don’t think people have anything to hide, but we all have our secrets. It’s more about the privacy being exposed to Apple, and who knows what they are going to do with it. I’m 100% sure people around the world take pictures of their passport, ID card, license number, social security number, credit card numbers, any sensitive information you can think of.

Bottom line: it’s just not safe or secure anymore, and if any hacker gets into Apple’s server system, it’s GAME OVER.

Who’s going to be held responsible for that?
 
A) Too many people in this thread don't understand what hashing is or how it works. […]
You do know Apple itself said that after a match is detected, a physical Apple employee would review the photos.

An Apple employee looking at your wife’s pictures, baby pictures, maybe the employee’s ex, family, friends, pets; all that other information is exposed. It is reviewed by an actual human being. Just because it’s a two-trillion-dollar company doesn’t give them the right to invade people's privacy.
 
A) Too many people in this thread don't understand what hashing is or how it works. […]
You should stop trying to sound more competent than all these experts named here: https://appleprivacyletter.com/
 
Given their sensitivity over the Touch ID sensors, how are they allowing the new keyboards to function at all? I assume they work? When I heard of the release, I wondered if someone in marketing was sniffing glue or something. I mean, you used to be able to brick an iPhone by swapping the sensor. Forgive my surprise...
 
My big worry is children: they are naturally curious about their private parts, and they DO take pictures.

I had to erase about 200 "backside" pictures my son had taken on his Nikon camera. He said he could not see his backside and wanted to know what it looked like. 190 were out of focus, but a few were "in focus" and revealing.

As our children sometimes "borrow" their parents' phones and take about 1,000 pictures before you can say "Where is my phone?", it could get messy really fast.

I know Apple claims they only tag previously "known" images. I do not trust that statement at all; it does not make sense, as you would prefer to capture the images at the "source" (the "dealer/maker" level), not at the "user" level.
But those photos are not in the child porn database, so there would be nothing to worry about, except embarrassment.

I took pictures of a pilonidal cyst, a kind of boil on my butt crack, one time and forgot to delete them, and they came up when I was showing friends some vacation pictures, lol. So yeah, people do take pictures, but that is nothing like the dehumanizing child porn in the database and would not trigger anything in Apple's scan. But whatever.

Anyway, I’m now against Apple scanning anything on our devices. It truly can be exploited as a backdoor. I have to believe Apple is going to cancel this stupid plan.
 
Frankly speaking, if I were Apple I'd do the same thing: I wouldn't allow pedophiles, scumbags, and trash people to store their trash on my servers. If you are a pedophile, a scumbag, a psychopathic maniac, you need a doctor. The phone you bought is yours and you can do whatever you want with it, but don't store your trash in my house. People who think this is the end of privacy really need to open their eyes. The online world is full of scumbags and people who will take advantage of whatever they can, so there needs to be a compromise, a balance, to try to protect defenseless people, at a cost. If you want privacy, get off the internet and go analog; even strong cryptography will be broken in the coming years. Stop being so naive.
 
Frankly speaking, if I were Apple I'd do the same thing: I wouldn't allow people to store their trash on my servers. […]
Good point. So are you saying you would be OK with an Apple employee snooping through your phone and looking at your wife’s pictures?

I just think, if Apple wants to enforce this, go ahead, but there are alternative ways it could be done instead of scanning photos.
 
Given their sensitivity over the Touch ID sensors, how are they allowing the new keyboards to function at all? […]
It was simply on purpose. They saw third-party repair shops taking away their repair jobs (*cough* 1:1 swaps) and introduced this "brickage". A third-party Touch ID repair cost 20-30 €/£/$, while a 1:1 Apple swap cost 150-300 €/£/$. Apple then got the broken device, had the Touch ID replaced and refurbished somewhere in China for a few cents, and reused it for another 1:1 swap. Win-win; a win for Apple, I mean. That's why they used to reject the *repairs* if your device looked too damaged: that way their profit calculations wouldn't work out.

The repair volume for standalone keyboards is simply too low; there is no keyboard repair market to profit from. But wait till the Touch ID on an M1 MacBook breaks.
Win-win, a win for Apple again.
 
 