The top story of the week is the Magic Keyboard with Touch ID? What a joke of a site.
The funniest part to me is that Apple's abrupt about-face on its privacy branding, promising to scan all our photos and text messages to share with law enforcement without a warrant (and hilariously calling it a "feature!"), just falls under "and More" in the headline. Hahaha.
 
Well, that was quite a week. I won't reiterate the arguments brought forth against the slippery slope this creates for privacy, but I'm quite disappointed that Apple did a 180° turn on one of its central promises. A few days back, I couldn't wait for this fall's new devices; now I find myself checking how good (or bad) the Samsung offerings are. I probably won't immediately sell my current devices (I don't use iCloud much, and enabling the new scanning "feature" is probably illegal in the EU anyway), but suddenly Android does not look so bad anymore.
 
By far, the TOP story of this week is Apple's decision to automatically scan photos on your iPhone for illegal content. That should be the story up top; by comparison, all of the rest of these stories feel really insignificant.
Did you read this post from jhollington? Quite relevant IMHO.

Actually, Apple's Chief Privacy Officer, Jane Horvath, told a panel at CES early last year that Apple has algorithms in place to scan iCloud Photo Libraries to "help screen for child sexual abuse material."

It's very likely Apple has been doing this for almost as long as other major tech companies that allow photo storage. Google has been doing it since 2008, and Facebook started back in 2011. Much like the new CSAM Detection, however, this is only about scanning for matches to known photos that are already out in the wild.
The goal for this new feature seems to remain the same as before — to catch actual child predators who are using iCloud Photo Library to store their personal collections of filth. The difference is that by moving to on-device scanning, Apple will be able to encrypt users' iCloud Photo Libraries without raising the ire of politicians and law enforcement agencies.
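Worth spelling out what "matching against known photos" means mechanically: at its core it's a fingerprint lookup against a database of fingerprints of already-identified images. Here's a minimal Swift sketch of that idea, purely as an illustration: an exact SHA-256 digest stands in for the perceptual "NeuralHash" Apple describes, there's none of the cryptographic blinding the real system uses, and every name here is made up.

```swift
import Foundation
import CryptoKit

// Illustration only: SHA-256 stands in for Apple's perceptual NeuralHash,
// and a plain Set stands in for the blinded on-device hash database.
// All type and function names here are hypothetical.
struct KnownImageDatabase {
    private let knownDigests: Set<SHA256.Digest>

    init(knownImages: [Data]) {
        knownDigests = Set(knownImages.map { SHA256.hash(data: $0) })
    }

    /// True when the image's fingerprint matches a known entry. A real
    /// perceptual hash would also match resized or recompressed copies;
    /// a cryptographic hash like this only matches byte-identical files.
    func matches(_ imageData: Data) -> Bool {
        knownDigests.contains(SHA256.hash(data: imageData))
    }
}

// The check is described as running only on photos queued for iCloud
// upload, so the gate would sit in the upload path, roughly like this:
func willUpload(_ photo: Data, to library: String, db: KnownImageDatabase) {
    if db.matches(photo) {
        // In Apple's design a "safety voucher" is attached to the upload;
        // nothing is reported from a single match.
        print("match for photo headed to \(library)")
    }
}
```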
 
Dear MacRumors: Can we make this the #1 top story, please?

Apple Reveals New Child Safety Features, Including Scanning Photos for Known Sexual Abuse Material

Apple dropped the bomb on us this week. It's a joke. I really hope Apple reverses its decision.

Bottom line: Do not upgrade to iOS 15. Say goodbye to privacy and your personal data.

All those billboards that once felt reassuring have now become insulting and offensive.
 
I'm simply playing devil's advocate here, so please don't read anything about me personally into this! Anyone who stores child porn (or anything else illegal) unencrypted in the cloud deserves everything their low IQ gets them. That still doesn't exonerate Apple, though. Surely child porn and terrorism are on roughly equal footing, yet Apple took a different stance on the terrorism side.
 
I agree!

They refused to unlock the device of a KNOWN terrorist.

Yet they are doing warrantless searches of the pictures and texts of users who aren't suspected of having committed any crime?

How does Apple reconcile this behavior?
 
Can you please try to understand the difference between (searchable) cloud storage and a fully encrypted device without a backdoor? They provided all the data that was available to them, which is the iCloud data. They did not magically find a way to decrypt a device, and they did not build a backdoor into the whole thing, because backdoors get exploited all the time. This is the same reason Apple AirPort routers have been among the most secure: they are proprietary in the good way.
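The asymmetry is easy to see in code: whether stored data is "searchable" comes down to who holds the key. A rough Swift/CryptoKit sketch of the two situations (hypothetical types, nothing like Apple's actual key management):

```swift
import Foundation
import CryptoKit

// Hypothetical: the provider holds the key, so provider-side decryption
// (and warrant compliance) is possible. Roughly the iCloud backup situation.
struct ProviderStore {
    let providerKey = SymmetricKey(size: .bits256)   // lives on the server

    func store(_ plaintext: Data) throws -> Data {
        try AES.GCM.seal(plaintext, using: providerKey).combined!
    }

    func decryptForWarrant(_ blob: Data) throws -> Data {
        try AES.GCM.open(AES.GCM.SealedBox(combined: blob), using: providerKey)
    }
}

// Hypothetical: the key never exists outside the device (in reality it's
// derived from the passcode inside the Secure Enclave). There is nothing
// the vendor can hand over, short of shipping a backdoor that weakens
// every device at once.
struct LockedDevice {
    private let deviceKey = SymmetricKey(size: .bits256)  // never leaves device

    func store(_ plaintext: Data) throws -> Data {
        try AES.GCM.seal(plaintext, using: deviceKey).combined!
    }
    // Note: no decryptForWarrant(_:) exists here; only the device,
    // unlocked by its owner, can open what it sealed.
}
```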
 
It's really not good at all. Gruber entertains "Mr. Neuenschwander dismiss[ing] those concerns, saying that safeguards are in place to prevent abuse of the system and that Apple would reject any such demands from a government."

Exhibit A: China.

Hypocritical window dressing, in a nutshell. Google's exit from China just keeps looking better.
What do you expect from a paid obedient sheep?
 
Yeah, they refused to unlock it, but they handed over the backups, so same-same… and the NSA (or was it the FBI? I forget!) found another way to get in anyway… it was just a farce.
 
Bottom line: Do not upgrade to iOS 15. Say goodbye to privacy and your personal data.
Unfortunately, that means don’t buy the iPhone 13 either since it’ll have iOS 15 preinstalled.
 
My MBP from 2015 is running on fumes.

- The display is broken, so I have to use an external display.
- Some keys don't work properly.
- The computer gets so hot I need to use an external fan.
- Half of the USB ports don't work and the dongle I have for the HDMI display is glitchy.
- The display on the Touch Bar is broken. It keeps flashing all the time.

I don't want to wait another month for the next MBP.
 
Not possible.
 
I figured out a new way to avoid this privacy mess.

I'm just going to set my iPhone up as new and I will not log in to my iCloud. Bringing back the 2007-2011 days when iCloud didn't even exist. :p
 
Even if you keep iCloud Photos turned off, Apple will still be scanning your photos, although that is not supposed to trigger anything on Apple's side. Down the road, who knows what will happen.

Keeping iCloud turned off is about all you can do at this point.
 
I plan on not setting up my iPhone under an iCloud account. I'm going to act like iCloud doesn't exist and use the iPhone as-is, without logging in and without giving out my information.

Setting the iPhone up as new. No information is given. Good luck scanning any photos or personal data… whatever Apple is trying to achieve. 😂

 
I doubt the company is going anywhere so I don't see it as the end, but it invites a lot of frightening questions.

1. Do Android and Samsung already do this without letting anyone know? Or do they do it openly and Apple is just catching up? I honestly don't know.

2. In the U.S. it's "for the children", but what about people in China and other dictatorships where Apple feels compelled to "comply with local law"? Will Apple be turning over the results of their phone scans to those governments? Will anyone in China with a Winnie the Pooh photo, or a meme about the Uighur genocide on their phone be flagged and reported to the Chinese government?

3. How about in places like Russia where homosexuality is virtually criminalized? Will Apple be reporting who is sexting their same-sex partner to the Russian authorities? You might laugh, but the minute Apple rolls this out you better believe these dictatorships will start requiring Apple to use this technology to "comply with local law" if they want to remain in the market.

4. If I took a topless photo of a woman I dated 5 years ago, and it's still in the cloud and gets flagged, who looks at it? And if whoever on Apple's staff does the looking wrongly judges her to be underage, what recourse do I have when my phone is suddenly locked and the FBI is breaking down my door? Even if the charges are eventually dropped once the ex comes forward to prove she was an adult when the photo was taken, by that point the suspect has almost certainly lost their job and been shunned by the people in their life.

5. Will Apple pay damages for lost wages, pain and suffering, and attorney fees when a false positive leads to law enforcement action? Will Apple's Terms of Service include a provision where you agree not to sue them if they wrongly get you locked up?

6. How are they judging message content? Are they reading and judging fantasy sexts between adults? Are they only looking at messages that go to Apple customers who are under 18? How can I be sure that if I called my adult girlfriend a "bad little girl" in a joke text message, that the FBI won't be reading it the next day and deciding whether it's actionable?

7. Regarding reexamining tech options: not a bad idea. I've been in the Apple ecosystem so long, even to the point of enthusiastically buying Apple stock when I started investing, that I have no idea what's going on with Samsung or Pixel phones or how a Windows start page even looks these days. It's past time I caught up on the outside world. But again, are those companies just doing the same thing too?

1. I know MS, FB, and Google have been doing it for years, though obviously not client-side, as they're not privacy-oriented like Apple. I don't know about Samsung.

2. No. Apple does realise that would be the end of them.

3. See 2.

4. That’s not how the system Apple built works. Nobody will look at the picture of your naked wife.

5. Nobody will be wrongfully accused. Only if you have multiple confirmed known CSAM pics on your phone will a private NGO be informed (rough sketch of the thresholding idea after this list).

6. Nobody will read your messages; they're end-to-end encrypted.

7. It’s very likely they don’t want CSAM pics on their server infrastructure either.
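On point 5, the threshold is the load-bearing part of Apple's claim: as described, a single match reveals nothing, and vouchers only become readable for human review once enough matches accumulate (the real mechanism is threshold secret sharing, and the exact threshold number wasn't part of the announcement). A toy Swift sketch of just the gating logic, with a plain counter standing in for the cryptography:

```swift
// Toy illustration of threshold gating, not Apple's protocol: in the
// real design, threshold secret sharing means the server mathematically
// cannot open any voucher until enough matches exist. A plain counter
// stands in for that here, and all names are hypothetical.
struct SafetyVoucher {
    let photoID: String   // in reality the voucher payload is encrypted
}

struct ThresholdGate {
    let threshold: Int
    private(set) var vouchers: [SafetyVoucher] = []

    init(threshold: Int) { self.threshold = threshold }

    /// Records a match. Returns nil below the threshold (nothing is
    /// visible or reportable); returns the batch once it's crossed.
    mutating func record(_ voucher: SafetyVoucher) -> [SafetyVoucher]? {
        vouchers.append(voucher)
        return vouchers.count >= threshold ? vouchers : nil
    }
}

// Usage: one match (or several, below the threshold) surfaces nothing.
var gate = ThresholdGate(threshold: 30)   // placeholder value
if let batch = gate.record(SafetyVoucher(photoID: "IMG_0001")) {
    print("threshold crossed: \(batch.count) vouchers go to human review")
}
```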
 
Even if you aren't logged into iCloud, Apple is still going to be able to scan your photos, although no alarms will be triggered, according to Apple. With iOS 15, scanning is done on the device, and Apple is alerted when a matching photo is uploaded to the iCloud account.

If you currently have backups of your phone in iCloud, Apple has access to those as well. Ideally, you should turn off iCloud now and back up your phone to your Mac using an app like iMazing. Remove any backups you have in iCloud. If you want to store phone backups in the cloud, create the backup with iMazing, encrypt it using Cryptomator, and then upload it to iCloud for safe storage; Apple won't be able to gain access to that backup.
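To make the "encrypt it yourself first" step concrete: the point is that the key is generated and kept on your machine, so the blob that lands in iCloud is opaque to Apple. A minimal Swift/CryptoKit sketch of that client-side pattern follows; it illustrates the idea, not what iMazing or Cryptomator actually do internally, and the file path is made up.

```swift
import Foundation
import CryptoKit

// Client-side encryption sketch: the key exists only locally, so the
// ciphertext is unreadable to whatever cloud it's uploaded to. This is
// the pattern, not iMazing's or Cryptomator's actual implementation.
func encryptBackup(at url: URL, with key: SymmetricKey) throws -> Data {
    let plaintext = try Data(contentsOf: url)
    // .combined packs nonce + ciphertext + auth tag into one blob;
    // it is non-nil for the default 12-byte nonce used here.
    return try AES.GCM.seal(plaintext, using: key).combined!
}

func decryptBackup(_ blob: Data, with key: SymmetricKey) throws -> Data {
    try AES.GCM.open(AES.GCM.SealedBox(combined: blob), using: key)
}

do {
    // Keep this key in the local Keychain; uploading it alongside the
    // backup would defeat the whole exercise.
    let key = SymmetricKey(size: .bits256)
    let blob = try encryptBackup(at: URL(fileURLWithPath: "/Backups/iphone.backup"),
                                 with: key)
    // `blob` is now safe to park in iCloud or any other cloud drive.
    let restored = try decryptBackup(blob, with: key)
    print("round trip ok: \(restored.count) bytes")
} catch {
    print("backup encryption failed: \(error)")
}
```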
 