A great misconception is that a person owns an iPhone in the same way they own their house or car. This has never been true. A person owns the data and the part of the iPhone that could be used as a paperweight. And by using Apple's cloud services, there are further exceptions, with CSAM scanning being one, starting sometime in iOS 15.

There are other options, none of them better, but the path is clear.
 
I'd say the chances of a boycott of iOS 15 & its derivatives now stand at around 20%!

That's 2x what it was just a day ago!

Apple's BIG challenge right now is what to do about the iPhone 13 family.

Go with iOS 15, or go with iOS 14.8?

If they decide to go with iOS 14.8, I suspect they will leak it to one of their contacts so Word Gets Out!

IMO, they don't really have much of a choice; they must go with iOS 14.8.

And naturally, I'm assuming it doesn't include what Apple announced last Thursday.
Even if they preinstall iOS 14.8, the software would be on life support, and Apple could pull the plug any day. If you are not okay with what Apple is doing, I do not think preinstalling iOS 14.8 on the device is going to attract those customers.
 
Hmmm. I could see it now: an "If You Want to Date My Daughter" app, where the potential suitor's blood type could be pulled from the Health app, his criminal record pulled from the net, a photo from the Photos app, a complete map of where he's been for the last 7 days from Maps, a word count of the number of times drugs, sex, or profanity were mentioned in his texts/posts, etc. 🤔
You got it. Can I please get that app so I can check on my daughter's "friends"?
 
Except the child pornographers can just turn off iCloud Photo Library and won't have any issues, while the rest of us innocent users are constantly surveilled.

I would also love to see Apple cite actual data about child exploitation instead of just issuing broad proclamations about how this is such a huge problem.

This also doesn’t solve the problem. I want to see the CREATORS in jail or worse. While someone randomly finding these pics and saving them is sick, how about we focus on the creators? And criminals are smart. Now that Apple has talked about this LOUDLY, everyone knows that turning off iCloud Photos will prevent this. Or people just won’t use iPhones for this. But we as innocent individuals are harmed by being scanned without a warrant.
 
They will be scanning ON YOUR DEVICE. Would you let them into your house to search for evidence of a crime, too? Your phone and devices will have become snitch tools to be used against you rather than friends and assistants to help you.
I think this overall point is valid for tech as a whole. Notice that all the algorithms for everything (music, ads, social media, etc.) are less about you and more about giving more money and power to others. Smartphones have gotten very smart in some areas and stayed extremely dumb in others (Siri).

I feel like ten years ago music and news apps were WAY better at catering to my tastes rather than those of others.
 
Helluva way for this to be the 'year of the Linux desktop' ain't it?


I am developing professional software for Linux devices, and I am using Linux for this. But the Linux desktop is the reason why I switched to the Mac in the first place. The usability could not be worse for some things, e.g. printing to PDF and a lot of other things.

But yes, I would switch to Linux when leaving the Mac.
 
This also doesn’t solve the problem. I want to see the CREATORS in jail or worse. While someone randomly finding these pics and saving them is sick, how about we focus on the creators? And criminals are smart. Now that Apple has talked about this LOUDLY, everyone knows that turning off iCloud Photos will prevent this. Or people just won’t use iPhones for this. But we as innocent individuals are harmed by being scanned without a warrant.
CSAM scanning could enable this.

If the content gets flagged, it gets reported to Apple/the government with all the appropriate metadata.

There are databases that have scanned PUBLICLY available pictures in order to match that metadata with perpetrators, e.g. via a Facebook profile picture.

Source: some old DEF CON lecture (cba to look it up, but it's straightforward knowledge; even NCIS covered it at some point ;))
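The kind of correlation described above can be sketched in miniature. This is a toy illustration only: every profile name, serial number, and field here is hypothetical, and real investigative systems would use far richer metadata than a single EXIF field.

```python
# Toy sketch: correlating metadata from a flagged file (e.g. a camera
# serial number found in its EXIF data) against an index built from
# publicly scraped images. All data below is hypothetical.

def build_index(public_images):
    """Map a camera serial number to the set of public profiles
    that have posted photos carrying that serial."""
    index = {}
    for profile, serial in public_images:
        index.setdefault(serial, set()).add(profile)
    return index

def correlate(flagged_serial, index):
    """Return the public profiles whose posted photos share the
    flagged file's camera serial number."""
    return index.get(flagged_serial, set())

# Hypothetical scraped data: (profile, EXIF camera serial) pairs.
scraped = [
    ("profile_a", "SN-1234"),
    ("profile_b", "SN-9999"),
    ("profile_c", "SN-1234"),
]
```

Here `correlate("SN-1234", build_index(scraped))` would point at both hypothetical profiles that posted photos from the same camera, which is the gist of the metadata-matching the lecture apparently covered.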
 
What I don't like here is just that...

Apple should refuse to help governments go through peoples files, for any reason (without just cause and a warrant) and force this to be a political issue in front of voters and politicians.

Corporations with any backbone - as I thought Apple had - should be pushing this up to a public policy debate.

This is a chicken-ish approach by Apple.
They need to implement end-to-end encryption with keys held on the device, so that they cannot do it. Otherwise someone somewhere will demand it, and Apple will be forced to comply in whatever jurisdiction that is.
 
Obviously you didn't read the part where I stated I'm NOT on either side. I was in fact defending a member for stating how they feel about the subject. Go somewhere else with that disrespectful tone. 🙄 👎
Disrespectful why? You're telling me to go somewhere else :)
 
It seems very contradictory and goes against their whole campaign of privacy. I get why it's important, but when you set a precedent for one thing, it opens the door for others, and pretty soon other people are interpreting information that has nothing to do with what they're actually looking for.
 
According to FotoForensics, a private online photo research company, the proportion of CSAM they discover among the more than a million images they process per year is 0.056%.
This isn’t just about abuse. It’s about little kids sending and receiving D-— pics, human trafficking, etc. As I stated already in the thread, it’s a HUGE issue in other parts of the world, including the US; it affects tens of millions of kids.
 
 
There's just no way China is going to continue doing business with Apple if they don't start "playing ball" like the other companies, and loss of the Chinese market would be catastrophic to Apple's bottom line.

Easily explains Apple's surprising about-face.
 
Because this type of crime wins the hearts of the masses, no one can possibly be against it. And that's how they get their foot in the door for more.
My gut tells me Apple made a deal with the NSA to put this in, and the NSA told them to sell it as CP protection.
If you believe the NSA needs this, g*d bless you.
 
There's just no way China is going to continue doing business with Apple if they don't start "playing ball" like the other companies, and loss of the Chinese market would be catastrophic to Apple's bottom line.

Easily explains Apple's surprising about-face.
It's not like their own Web 2.0 isn't already riddled with filtering and content flagging.
 
Clearly Apple has data showing this is a major problem. It also seems the average person is oblivious to what a problem this is, especially in other parts of the world.

I thought this was US only? (The CSAM stuff said it was)
 
Because this type of crime wins the hearts of the masses, no one can possibly be against it. And that's how they get their foot in the door for more.
My gut tells me Apple made a deal with the NSA to put this in, and the NSA told them to sell it as CP protection.
If so, then every company made the same deal. Security and privacy are on the individual alone.
 
Sure, and your privacy is still being exposed, so we're OK now then?


[...]
That's not true either. If one traffics in CSAM and uploads to iCloud, one's photos will be hashed and compared to a database (with some threshold) that, as I understand it, takes up space on the phone. The algorithm doesn't do facial recognition, and the treatment of PII hasn't changed.

So if you (the global you, not you personally) want to be exempt from scanning:
- log out of iCloud
- ditch iPhones/iPads
- don't engage in the trafficking of CSAM.
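The hash-plus-threshold mechanism described above can be sketched roughly. This is a toy stand-in, not Apple's system: the real design uses the NeuralHash perceptual hash and private set intersection, neither of which is reproduced here, and the threshold value below is made up.

```python
import hashlib

# Stand-in for a perceptual hash. Apple's NeuralHash tolerates small
# image edits; an exact cryptographic hash like this one does not.
def toy_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-image hashes shipped to the device.
KNOWN_HASHES = {toy_hash(b"known-example-1"), toy_hash(b"known-example-2")}

# Made-up threshold: match counts below it are never surfaced.
MATCH_THRESHOLD = 30

def count_matches(uploads):
    """Count uploaded photos whose hash appears in the database."""
    return sum(1 for img in uploads if toy_hash(img) in KNOWN_HASHES)

def account_flagged(uploads):
    """Only accounts crossing the match threshold get flagged for review."""
    return count_matches(uploads) >= MATCH_THRESHOLD
```

The point of the threshold is visible even in this sketch: a single match, or a handful, flags nothing; only a collection of matches against the database triggers review.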
 