Oh for sure, me too! I totally agree with you, non-suspicious new account from Apple PR (ouch, how many people got fired for this delivery? Apple is usually incredibly slick; this delivery was not dancing through the minefield, it was more like tripping and falling face-first into their own ****. I guess whatever deal they cut happened at the 13th hour, not leaving much time to work out a coherent storyline ... I mean, their own FAQ/white paper contradicts itself).


It's all just selling dreams, and now, helping bridge the gap between where government cannot go, but big tech companies are free to do as they please; working together, like the super-friends, to help enable the correct narrative so everybody is thinking the appropriate thoughts, and all those antitrust issues become much smaller since it's a big Circle of Friendship and Enlightenment!

:apple: Big Brother on Device - Don't Think Different!

Turning the page from the All Singing, All Dancing, Crap of the World ... this seems to leave CalyxOS, GrapheneOS and the like, which kinda seem to mostly work like a phone and have Signal integrated into them. C'est la vie.



Just saw a video yesterday on CalyxOS; it looks promising as an Android alternative. Not sure how many existing devices are supported, but its needed revenue model may cause it to die quickly. I thought GrapheneOS was dead.
 
What I don't understand is how Apple can get away with hosting known illegal-to-possess child porn images on its servers. The fingerprint matches mean they are identified illegal images. Legally, Apple should reject them the moment they are identified and not permit them to exist on systems they own and control. Or maybe they don't understand that they are blatantly breaking the law themselves, or don't care because they are "the good guys" and consider themselves immune from prosecution. Google rejects them out of hand and doesn't permit them on its servers. What makes Apple special here? And then Apple has a group of non-law-enforcement people looking at those child porn images to see if there are enough of them to report the uploader.
The point is, if it’s on your phone, not uploaded to Apple, it will be scanned too. Local scanning, turning your own device and the people you intimately trust against you, is basically mass surveillance.
 
So he said it is the same software everywhere. Does this mean their search database is the very same globally? Like in China, Russia and the US? Or will there be any local differences?
 
The road to hell is always paved with good intentions.

The Gulag archipelago was not built overnight; it always starts with something benign.
They also don't seem to understand how a virus scanner is still a scanner without scanning a printed photo.
Not to mention it's not about the process but the interaction with an apparently untrustworthy customer base.
So he said it is the same software everywhere. Does this mean their search database is the very same globally? Like in China, Russia and the US? Or will there be any local differences?
I believe that the likes of Belarus, Russia, China and so forth will soon get slightly "adjusted" databases that Apple is required by law to include.

Apple itself will probably be left unharmed by this, because the majority of its target audience doesn't care about anything but getting the newest iPhone and posting selfies with it. The majority of people use WhatsApp, not Signal.
 
People seem to think that if you are against it, you support it.
Diode thinking is easy to do, easy to understand, and easy to execute. Of course, those dudes will never accept it, but that's how simple people tend to be.
One word about 'authoritarian governments': systems like those don't need Apple's CSAM solution to lock up their opponents. They can just plant any image, video, whatsoever on your device with zero-day exploit based tools, and use the plain old 'hand-controlled' police/secret police to conduct a made-up investigation to uncover the evidence they planted before.
These are usually reserved for high-profile targets rather than applying to the general public. But at the same time, they don't need to spend a lot of resources on the general public to achieve the same goal.
That part is not confusing: they’re trying to hammer home the fact that the code for doing the scanning is on your device, which means it can be explored by anyone. Server-side scanning is "scarier" because it means we have no idea what they are actually doing. Of course, I believe they are already doing that for many things.
It's not like on-device scanning is less scary than server-side scanning. At least people can rationalize server-side scanning with "huh, it's on their server and their property, so they have the right to manage it".
Also, people don't want to believe their extremely expensive (think of those $50k Mac Pros) Apple devices are not actually theirs in practice: good luck trying to install anything other than macOS on Apple Silicon Macs.
 
They also don't seem to understand how a virus scanner is still a scanner without scanning a printed photo.
Not to mention it's not about the process but the interaction with an untrustworthy customer.
The virus scanner is working for you, not against you. It's on your side. Apple's kiddie-photo scanner is not on your side.

If your car snitches on you when you are driving under the influence, it's the same thing.

That's not far from your wife being forced to snitch on you when you bad-mouth Uncle Sam or the system.

That's how a normal society transforms into a Gulag.
 
I really don't get the CSAM feature. Do people who distribute child pornography even use something like iCloud to store these illegal photos? Similarly, would they use Telegram or Whatsapp to distribute these photos? I'd think that they'd only use dark web for things like this.
 
Is it "not true" by policy and promise, or by engineering and technical impossibility?

I don't trust those promises: if a snitching system has been engineered, how it's used can easily be changed by policy.
Indeed, and they have promised a lot in the past that they haven't kept. On top of that, they can't promise not to bend to current and future laws.
I really don't get the CSAM feature. Do people who distribute child pornography even use something like iCloud to store these illegal photos? Similarly, would they use Telegram or Whatsapp to distribute these photos? I'd think that they'd only use dark web for things like this.
The dark web largely means using the same internet, just without the restrictions placed on known illegal sources. The resources and means of access stay the same.
 
So he said it is the same software everywhere. Does this mean their search database is the very same globally? Like in China, Russia and the US? Or will there be any local differences?

The way I understood it from the WSJ interview, the database will be universal worldwide, probably something like global antivirus databases?
 
The dark web largely means using the same internet, just without the restrictions placed on known illegal sources. The resources and means of access stay the same.
Ok, but there are tons of different ways to distribute illegal content on the dark web without touching giant tools like Facebook, WhatsApp or iCloud. Why would you risk everything by using WhatsApp or iCloud to distribute your illegal photos? An iCloud account is bound to your entire device; it can get your location from the phone's GPS, and tons of other data, even if Apple says they are not tracking you.

Similarly, a WhatsApp account is bound to a phone number, which again can be used to physically track you. So if you want to distribute something illegal, shouldn't you use something completely anonymous?
 
For me the saddest thing in all of this debate is not the technicalities of it, nor the possible future abuses if Apple goes ahead with this system. The current debate shows really well that our societies are quite fragmented and there is very little trust between people. Any tool or technique can be used for positive or negative goals, but living in constant fear and mistrust just makes life miserable.
 
Ok, but there are tons of different ways to distribute illegal content on the dark web without touching giant tools like Facebook, WhatsApp or iCloud. Why would you risk everything by using WhatsApp or iCloud to distribute your illegal photos? An iCloud account is bound to your entire device; it can get your location from the phone's GPS, and tons of other data, even if Apple says they are not tracking you.

Similarly, a WhatsApp account is bound to a phone number, which again can be used to physically track you. So if you want to distribute something illegal, shouldn't you use something completely anonymous?
I was not talking about visiting the same services, just using the same tools. I was not talking about WhatsApp, Telegram or the like; sorry for the confusion.

Mostly it means using a browser, preferably one like Tor, or through secure means (VPN, hopping, etc.).
 
If you're worried about spyware, Apple is the last of your worries. They just gave us Private Relay, Hide My Email, and a method for protecting our children from criminals.

The biggest of your worries is what will happen if third party app stores and side loaded apps use undocumented APIs that try to find security holes on your device that allow them to capture everything.

Pegasus was just a taster of what could happen.

Pirated iOS apps that are side-loaded will not only reduce income for developers substantially but also come infected with malware for stealing your data.
 
I was not talking about visiting the same services, just using the same tools. I was not talking about WhatsApp, Telegram or the like; sorry for the confusion.

Mostly it means using a browser, preferably one like Tor, or through secure means (VPN, hopping, etc.).
Indeed, so I don't really see any actual reason to scan iCloud for child pornography. Even if you catch one or two in a billion, it's really not worth violating the privacy of everyone else. Instead, focus on other methods to fight this.

Or I'm completely wrong and people use iCloud quite a lot for this kind of stuff.
 
Great point. Let users scan Craig’s phone, Tim’s phone, Eddy’s phone, Phil’s phone, Jeff’s phone, all the time, whenever we choose. I imagine they wouldn’t like it.
And their point would be that no human is looking at your phone, just a system; only photos which match the database would be flagged for human review.
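The match-and-threshold mechanic debated above can be sketched in a few lines. This is a loose simplification, not Apple's implementation: the real system uses a perceptual hash (NeuralHash) and cryptographic safety vouchers, whereas here SHA-256 stands in for the fingerprint, and the database entries and threshold value are made up for illustration.

```python
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    # Stand-in fingerprint. Apple's NeuralHash is perceptual (tolerant of
    # resizing/recompression); a cryptographic hash only matches exact bytes.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-image fingerprints (illustrative values).
known_hashes = {
    image_fingerprint(b"known-image-1"),
    image_fingerprint(b"known-image-2"),
}

# Illustrative threshold; Apple's announced figure was around 30 matches.
REVIEW_THRESHOLD = 2

def flag_for_human_review(photos: list[bytes]) -> bool:
    """Count database matches; only a count at or above the threshold
    surfaces anything for human review. Below it, nothing is flagged."""
    matches = sum(1 for p in photos if image_fingerprint(p) in known_hashes)
    return matches >= REVIEW_THRESHOLD

# One match stays below the threshold: not flagged.
print(flag_for_human_review([b"known-image-1", b"holiday.jpg"]))    # False
# Two matches cross the threshold: flagged for human review.
print(flag_for_human_review([b"known-image-1", b"known-image-2"]))  # True
```

The threshold is the part both sides are arguing about: below it the system reports nothing, but the counting itself still runs on every photo.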
 
Indeed, so I don't really see any actual reason to scan iCloud for child pornography. Even if you catch one or two in a billion, it's really not worth violating the privacy of everyone else. Instead, focus on other methods to fight this.

Or I'm completely wrong and people use iCloud quite a lot for this kind of stuff.
The objective is not for Apple to act as law enforcement. Rather, it is to satisfy Apple's legal obligation to ensure that its servers do not host illegal material. This development will likely pave the way for future E2EE for iCloud Photos: there is no way for Apple to achieve proper E2EE for iCloud Photos if the check is done server-side.
 
The objective is not for Apple to act as law enforcement. Rather, it is to satisfy Apple's legal obligation to ensure that its servers do not host illegal material. This development will likely pave the way for future E2EE for iCloud Photos: there is no way for Apple to achieve proper E2EE for iCloud Photos if the check is done server-side.
They don't have to ensure it; they have to take action when a case becomes known to them.
 
They don't have to ensure it; they have to take action when a case becomes known to them.
The only action Apple needs to take is to report to NCMEC. Apple will only know about it if they verify content uploaded to iCloud Photos. If they don't do that, and an iCloud user is found to store such content in iCloud, and Apple cannot prove that they took measures to detect it, they will get into trouble.
 
It would seem to me that any information forwarded to NCMEC by Apple violates the 4th Amendment.

I base this on having just read the appeals decision summary of August 5th, 2016 that determined NCMEC is a federal agency.

In that case, the 10th Circuit decided that information scanned, seized and sent by AOL to NCMEC, and then reviewed by NCMEC, constituted an unlawful search by a government agency.

If there are any lawyers in this thread, it would be helpful to hear a professional opinion.
 