******** it is. Apple has abused so much trust over the years and this is right up there with the rest of their lies & broken promises. Apple still needs to have their feet held to the fire over this lest they try and slip it in quietly.

Trust, but verify - Ronald Reagan
 
Reactions: nt5672 and SFjohn
Apple with the 5D chess.

Do something controversial. Then later come out with something that fixes it and adds even better changes on top.

Obviously kidding.
 
It's good that they at least caved in to pressure. It's bad that they even considered doing this.

Now if they'd update the included iCloud space... it might actually be useful. I mean, even the first paid tier ain't good for anything. You can't fully back up a 64 GB iPhone.
 
I'll let the cat say it.

grumpy.jpg
 
Great news but they lost all my trust with this. I will continue to use local backups on my Mac and keep my photos out of iCloud for good.
After messing with iCloud for years, getting frustrated with failed syncs and slow uploads/downloads, I returned to a home server solution. Now I have peace of mind. I think iCloud is the most trapping part of the walled ecosystem.
 
So Apple was planning to use the CPU of the iPhone I bought? Will they pay me for that usage? Combing through personal data, using the iPhone I own for their own benefit, and I need to pay an extra premium for this?

I bought the iPhone to use my own way, not to have its CPU constantly tasked at 20% with Apple's work.

I tell you now, the base of this is not abandoned; it's going to be tucked into a new package and marketed a different way. In the meantime the backdoors are still open, like when it was accidentally exposed recently that everyone could access each other's iCloud Photo Library, introduced with iOS 16. Coincidence?
 
Reactions: bobcomer
Hello everyone-

They said they weren't going to scan iCloud Photos. They didn't say that they won't scan your phone.

If Apple wanted to scan your phone surreptitiously, they could have been doing that for the last 15 years.

I don't get why people think this whole CSAM exercise was just a means of Apple getting a backdoor into people's phones - why would they need to go to those lengths when they control the OS for pity's sake!
 
I've had mixed feelings about this from the beginning, but I wonder if this will become a careful-what-you-wish-for moment... Apple won't handle it in a private, focused way on your device, which cracks the door open for governments to demand a way to do it themselves with blunter instruments.

It's possible this was another AirTag stalking debacle where Apple proactively tried to solve a problem their technology might cause (AirTags for stalking, end to end encryption letting predators operate with impunity), but I suspected this was a proactive way of removing a strawman argument for government demands for greater access. They held the line against the but-terrorism! argument, maybe they can hold it here.

Time will tell.

Hopefully this will at least stop people from making the argument that "Apple wants to put CSAM on our phones", which was always weird.
 
Reactions: SFjohn
So excited to hear this. I’ve been sticking with my iPhone XS and iOS 14 ever since that ill-fated policy was announced. Glad that I can finally move on! Couldn’t have come at a better time for me honestly, I just dropped my iPhone and I think the LCD display connector broke, the screen is completely black but it still rings when called…
 
Reactions: bobcomer
Good news for all the pedophiles out there.
Bad joke, if it is a joke. If it isn't, it would seem to reflect a lack of understanding of the existing tools law enforcement currently uses to find and prosecute those possessing and trading in underage porn.

I just don't understand people who think irreverent remarks and "jokes" about child porn and pedophilia have value as humor. None of it, whether you agree with what Apple has done or not, is funny in any way, regardless of how you spin it.
 
Reactions: murrayp and SFjohn
If Apple wanted to scan your phone surreptitiously, they could have been doing that for the last 15 years.

I don't get why people think this whole CSAM exercise was just a means of Apple getting a backdoor into people's phones - why would they need to go to those lengths when they control the OS for pity's sake!
Your voice reflects sanity, but didn't you know everything tech does is a giant conspiracy? Just ask a fair number of the posters here!
 
Reactions: NightFox and SFjohn
I think this is great. I'm more concerned about the privacy issues than CSAM scanning of a private/personal photo repository. I'd rather just see CSAM removal targeted to public places where they can cause more harm.
 
I wouldn't be so sure they're abandoning it... Probably figured out another way to mass monitor data.
Apple abandoned CSAM scanning of iCloud data; Craig Federighi said as much. Instead they will focus on providing parents with tools to identify photos on their children's devices, but nothing will be reported back to Apple or the authorities.

I'm no expert, but the need to scan for CSAM among private files has always seemed overblown to me and it has backfired on completely innocent Google Photos users in a number of publicized cases.
 
Reactions: SFjohn
For example a parent's pictures of children in a bubble bath could open the door for their entire iCloud to be shared with law enforcement.

Just a point of clarification: the CSAM detection matched against a set of known hashed CSAM material; it was not designed to use machine learning models to identify what might be said material.
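To illustrate the distinction, here's a toy sketch in Python. It uses an ordinary cryptographic hash (SHA-256) as a stand-in for Apple's perceptual NeuralHash, which is not shown here, and a made-up blocklist; the point is only that detection is a lookup against a fixed set of known hashes, not a model judging what an image depicts.

```python
import hashlib

# Hypothetical blocklist: hex digests of known flagged files.
# (Stand-in for the real system's database of known CSAM hashes.)
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-example").hexdigest(),
}

def matches_known_set(file_bytes: bytes) -> bool:
    """Return True only if this exact content's hash is in the known set.

    There is no model inference here: a novel image, whatever it
    depicts (e.g. a parent's bubble-bath photo), can never match.
    """
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES
```

A real perceptual hash would also match slightly altered copies of known images, but the same principle holds: novel photos aren't classified, only compared against the known set.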
 
Reactions: jonblatho