Haven't you read about Tesla's cabin cameras? They constantly watch you while driving and analyse what you do.
If you turn on data sharing it can be shared with Tesla.

Sounds a lot like how Apple's iCloud Photo Library operates.
I hadn't noted that, but I definitely wouldn't support it either, unless there is an option to fully disable it.
 
  • Like
Reactions: Morgenland
I have no idea what you are on about. You have no idea what ecosystem I may be in, or whether I am in one at all. Also, if I feel that Craig was not credible in this interview, does that mean I never trusted Apple? Logic does not appear to be your strong point.
If an Apple exec says X, his statement is supported by the rest of Apple's engineers and its privacy executive, and you say Craig is lying, then you are basically saying the entire company is not to be trusted. Do you know how logic works? Have you ever studied tautology?
 
No, I just work rationally with what you're dishing out. I can't help it if you use words incorrectly. There's a difference between "forced" and "on by default." Sounds like you were using the word "forced" to over-dramatize it and make it sound worse than what it was. You made it sound like some sort of scandal, which is why I was asking for a link to a news story about it or something. I guess it wasn't.

It's not often that I set up a new iPhone, so I honestly don't remember what's on/off by default in iCloud, but I always go through the different iCloud-compatible apps (Settings > Apple ID > iCloud) and set everything how I want it when I DO set up a new iPhone. I noticed Apple's instructions to set up and use iCloud Photos instruct you to go in and turn it on, which seems like an odd thing to say if they're already on by default.

And based on your last paragraph, it still appears you don't understand that the new parental controls for Messages are not the same thing as on-device CSAM detection for iCloud photos. Two different topics. Nothing is being reported to Apple from Messages. It's only between the parent(s) and the child. And as for "what's next" beyond CSAM detection, you're committing the slippery slope fallacy, as are so many on this forum. Just because something theoretically could be abused doesn't mean it's a bad thing or shouldn't be allowed. This applies to MANY things in life, not just technology.

You're already trusting Apple isn't doing anything nefarious with software functionality already on your phone (e.g. facial recognition), so why all of a sudden the distrust? If you're that paranoid, then why continue to use iPhones?
iCloud Photos are turned on by default whenever you sign on to iCloud.

With FaceID, it works on-device in the Secure Enclave. Nothing is being transmitted to Apple, even if you have iCloud. Obviously that's more trustworthy.

If Apple announced tomorrow that they will be matching your FaceID data with known criminals, I'm sure people would be shocked as well and up in arms.
 
  • Like
Reactions: CriticalThoughtDrop
If an Apple exec says X, his statement is supported by the rest of Apple's engineers and its privacy executive, and you say Craig is lying, then you are basically saying the entire company is not to be trusted. Do you know how logic works? Have you ever studied tautology?
You are a little confused. If Mr. Blue tells a lie this afternoon, that does not mean he has been lying for the past twenty years. If I don't trust Apple today, it does not mean I have never trusted Apple.
 
That doesn't address what I was responding to:

"2. Apple do not have access to the original database only the hashes. If the hash for a photo of BLM was added then Apple wouldn’t have a clue and would just push it to users."
Like I said, Apple did number 2 so that, in the midst of this worldwide mass-scanning deployment, they have plausible deniability. The concern is: why did they decide to design such a complex system in the first place? If they simply didn't want CP on their servers, E2E iCloud encryption would solve it just fine. Apple wouldn't know what users are uploading; done deal.
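To make the "Apple only sees hashes" point concrete, here's a toy sketch (entirely hypothetical; it uses SHA-256 as a stand-in for a perceptual hash like NeuralHash, which unlike SHA-256 is designed to survive resizing and recompression). The matcher only ever compares opaque digests, so whoever runs it cannot tell what the blocklisted entries actually depict:

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash; a real system would use a
    # NeuralHash-style hash robust to minor image edits.
    return hashlib.sha256(data).hexdigest()

# The provider receives only this opaque set from a third party.
# Nothing in it reveals what the original images depicted.
blocklist = {image_hash(b"known-flagged-image-bytes")}

def scan(upload: bytes) -> bool:
    """Return True if the upload matches a blocklisted hash."""
    return image_hash(upload) in blocklist

print(scan(b"known-flagged-image-bytes"))  # True
print(scan(b"vacation-photo-bytes"))       # False
```

This is exactly the property being debated: the matching party can report a hit without ever being able to audit what was put on the list.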
 
  • Like
Reactions: Morgenland
You are a little confused. If Mr. Blue tells a lie this afternoon, that does not mean he has been lying for the past twenty years. If I don't trust Apple today, it does not mean I have never trusted Apple.
If you don't trust the word of a person who is adamant against accusations, you are basically questioning that person's character. This is not about Craig saying trivial things: the WSJ directed accusations against Apple, and Craig responded to those. You are saying those responses to the allegations are lies. That's questioning his character, and thus, by proxy, the intentions of the company itself.
 
  • Like
Reactions: hlfway2anywhere
No, I just work rationally with what you're dishing out. I can't help it if you use words incorrectly. There's a difference between "forced" and "on by default." Sounds like you were using the word "forced" to over-dramatize it and make it sound worse than what it was. You made it sound like some sort of scandal, which is why I was asking for a link to a news story about it or something. I guess it wasn't.

It's not often that I set up a new iPhone, so I honestly don't remember what's on/off by default in iCloud, but I always go through the different iCloud-compatible apps (Settings > Apple ID > iCloud) and set everything how I want it when I DO set up a new iPhone. I noticed Apple's instructions to set up and use iCloud Photos instruct you to go in and turn it on, which seems like an odd thing to say if they're already on by default.

And based on your last paragraph, it still appears you don't understand that the new parental controls for Messages are not the same thing as on-device CSAM detection for iCloud photos. Two different topics. Nothing is being reported to Apple from Messages. It's only between the parent(s) and the child. And as for "what's next" beyond CSAM detection, you're committing the slippery slope fallacy, as are so many on this forum. Just because something theoretically could be abused doesn't mean it's a bad thing or shouldn't be allowed. This applies to MANY things in life, not just technology.

You're already trusting Apple isn't doing anything nefarious with software functionality already on your phone (e.g. facial recognition), so why all of a sudden the distrust? If you're that paranoid, then why continue to use iPhones?
Deflecting won't get us anywhere. In every country outside the United States, OS components are still installed by local providers. Apple has always disclosed this so far, and you had to agree to it before you could use the phone. This is exactly where the services will build in their interfaces; you know that very well.

And I have loved Apple for decades, and there are plenty of ****ed-up systems besides Apple, so let me defend Apple against the scheme being imposed on it. Plus, I'm an Apple shareholder.
 
  • Like
Reactions: CriticalThoughtDrop
If you don't trust the word of a person who is adamant against accusations, you are basically questioning that person's character. This is not about Craig saying trivial things: the WSJ directed accusations against Apple, and Craig responded to those. You are saying those responses to the allegations are lies. That's questioning his character, and thus, by proxy, the intentions of the company itself.
Exactly. Today. But that does not mean I never trusted him and the company, which is what you stated. That ends this topic.
 
No, I just work rationally with what you're dishing out. I can't help it if you use words incorrectly. There's a difference between "forced" and "on by default." Sounds like you were using the word "forced" to over-dramatize it and make it sound worse than what it was. You made it sound like some sort of scandal, which is why I was asking for a link to a news story about it or something. I guess it wasn't.

It's not often that I set up a new iPhone, so I honestly don't remember what's on/off by default in iCloud, but I always go through the different iCloud-compatible apps (Settings > Apple ID > iCloud) and set everything how I want it when I DO set up a new iPhone. I noticed Apple's instructions to set up and use iCloud Photos instruct you to go in and turn it on, which seems like an odd thing to say if they're already on by default.

And based on your last paragraph, it still appears you don't understand that the new parental controls for Messages are not the same thing as on-device CSAM detection for iCloud photos. Two different topics. Nothing is being reported to Apple from Messages. It's only between the parent(s) and the child. And as for "what's next" beyond CSAM detection, you're committing the slippery slope fallacy, as are so many on this forum. Just because something theoretically could be abused doesn't mean it's a bad thing or shouldn't be allowed. This applies to MANY things in life, not just technology.

You're already trusting Apple isn't doing anything nefarious with software functionality already on your phone (e.g. facial recognition), so why all of a sudden the distrust? If you're that paranoid, then why continue to use iPhones?
Why on earth are you going on about iCloud settings? The point is that the software is on your hardware rather than on iCloud.
Apple could have chosen to run all the checks via iCloud and its servers, and the fact that they haven't makes it far more of a concern: whether you use iCloud or not, that code is still within your system and still capable of being modified at any time. And if there was no intention of doing that, then why are they seemingly so intent on installing the software on customers' hardware rather than doing it via iCloud?
 
Like I said, Apple did number 2 so that, in the midst of this worldwide mass-scanning deployment, they have plausible deniability. The concern is: why did they decide to design such a complex system in the first place? If they simply didn't want CP on their servers, E2E iCloud encryption would solve it just fine. Apple wouldn't know what users are uploading; done deal.

We're not communicating.

"2. Apple do not have access to the original database only the hashes. If the hash for a photo of BLM was added then Apple wouldn’t have a clue and would just push it to users."

Apple does not want the *reference* original database of CP photo jpegs on their premises (whether that be a computer file, physical binder, etc) to begin with. Which is why Apple can't determine if a BLM photo was nefariously slipped into the hash table by a higher authority.
 
  • Like
Reactions: Morgenland
If you don't trust the word of a person who is adamant against accusations, you are basically questioning that person's character. This is not about Craig saying trivial things: the WSJ directed accusations against Apple, and Craig responded to those. You are saying those responses to the allegations are lies. That's questioning his character, and thus, by proxy, the intentions of the company itself.

Unfortunately, the world is not that simple.
Sometimes it is definitely a matter of life experience that makes a credible assessor of a system.

 
As I wrote before, it's not about Dr. Fauci or Brilliant (they are your tangent*); it's about the actual capabilities of microbiology research and observation.

It’s mostly about the hardware. “How good are the scopes?”

Just like Dr. Fauci's vision-corrective glasses, the specs of the specialized hardware needed to see at the microbiotic scale in question are the crux of all viral research and discussion.

If the methods used are reliable with a high degree of accuracy, then I’m sure Dr. Fauci deserves his reputation.

But if the methods are not precise, and there are no other methods to verify, then that reputation should be tempered.

*Before cheerleading scientists by reputation alone (which is unscientific), you might consider establishing a foundation: is Dr. Fauci competent at TEM or another direct observational research method at the viral scale?

Was that skill even in his wheelhouse, or are you making incorrect assumptions about epidemiologists/virologists and their field? Are there different types of epidemiologists?

You might also want to check out Dr. D.A. Henderson for setting your virology-expert benchmark: the man credited with leading the efforts to eradicate smallpox (Brilliant was just one of many who worked under him). When Brilliant largely left the field to chase Silicon Valley VC money and then started working for Google on investment grants, Henderson continued to do research and publish on viruses and outbreaks (omitted from mention on his Wikipedia page, go figure).

What are your background and credentials (academic, work-related, research papers, etc.) that qualify you to assess the methods and precision you question and cast doubt on above?

Fauci and Brilliant (among others) are my "tangent" because, as an electrical engineer, I'm not qualified to assess precision and methods with respect to infectious-disease research and control. Therefore I have to rely on the words and papers of epidemiologists/virologists who have a demonstrated, verifiable track record of positive results over decades.

So please tell me about yours.
 
  • Like
Reactions: hlfway2anywhere
I wonder if the real reason they want it on individual hardware, via the operating system, is to identify the source IP or even unique device-identifying data, which would be available via the system report and elsewhere. Apple really has opened a can of worms by seeking to incorporate this in the operating system when they could do it on their servers, so they can't be surprised at the suspicion and concern it causes. For me, those concerns would be allayed immeasurably if they announced they were scrapping the idea of building it into system software and instead kept their CSAM checks on the iCloud server, so everyone would know their own hardware was sacrosanct.
 
We're not communicating.

"2. Apple do not have access to the original database only the hashes. If the hash for a photo of BLM was added then Apple wouldn’t have a clue and would just push it to users."

Apple does not want the *reference* original database of CP photo jpegs on their premises (whether that be a computer file, physical binder, etc) to begin with. Which is why Apple can't determine if a BLM photo was nefariously slipped into the hash table by a higher authority.
The fact that they have no access to the original database is irrelevant: by having the software on customers' own equipment, they have access to unique identifying data, and it's still surveillance.

Now, if they want to keep their hash idea (and in my experience this will make it harder for the agencies employed to stop child abuse and child pornography), they can use their own iCloud to do it; the fact that they want to introduce it via the operating system sends up a massive red light of concern.

Remember, System Integrity Protection does not apply to Apple, so they can amend that part of the software any time they choose, whatever they suggest their 'aim' is.

The fact that they intend to do it via system software doesn't give one confidence they won't modify it. Take it off the system software and keep the checking on iCloud, and a lot of the fears subside.
 
We're not communicating.

"2. Apple do not have access to the original database only the hashes. If the hash for a photo of BLM was added then Apple wouldn’t have a clue and would just push it to users."

Apple does not want the *reference* original database of CP photo jpegs on their premises (whether that be a computer file, physical binder, etc) to begin with. Which is why Apple can't determine if a BLM photo was nefariously slipped into the hash table by a higher authority.
True. That's the plausible deniability. Apple simply has no knowledge, since they only see hashes. Thus it really sends a chill, at least for me, that a giant tech company has set up a worldwide mass-scanning system in which they themselves are not accountable for the implications. "We're just matching hashes."
 
  • Like
Reactions: CriticalThoughtDrop
Just got a chance to take this screenshot. This is what an "optimized storage" image looks like before it is downloaded. If you have a slow connection and your target image is very big (20 MB or more), the effect lasts longer.
[Screenshot: Screen Shot 2021-08-15 at 01.28.02.png]

And this is what the same photo looks like when it is being downloaded.
[Screenshot: Screen Shot 2021-08-15 at 01.53.00.png]

Zoom in for dramatic effect.

If this is what Apple has to work with, any human can easily identify the major features of a given photo without the need for much detail. Even if the "thumbnail" res goes down a bit, recognizing features is not hard to do.

Granted, Apple may not use the same "thumbnail" I use here, but imagine this is what Apple will receive for "questionable" photos. They don't take up much space and can be uploaded near-instantly too.
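The underlying point here, that a heavily downscaled derivative still preserves recognizable structure, is easy to demonstrate. A toy sketch in pure Python (assuming nothing about Apple's actual "visual derivative" format): downscale a tiny grayscale grid by block-averaging and note that the coarse shape survives.

```python
def downscale(img, factor):
    """Block-average a 2D grayscale image (list of lists) by `factor`."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [img[yy][xx]
                     for yy in range(y, min(y + factor, h))
                     for xx in range(x, min(x + factor, w))]
            row.append(sum(block) // len(block))
        out.append(row)
    return out

# An 8x8 image with a bright square in the top-left quadrant.
img = [[255 if (y < 4 and x < 4) else 0 for x in range(8)] for y in range(8)]
small = downscale(img, 4)  # a 2x2 "thumbnail"
print(small)  # The bright region is still obvious: [[255, 0], [0, 0]]
```

Even at a 16:1 reduction per axis, the position of the bright region is unambiguous, which is exactly why a low-resolution derivative is enough for a human reviewer to identify a photo's major features.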
 
Number 2 is intentional to protect Apple themselves. They basically have plausible deniability when things go south as all they have are hashes. Really smart of Apple.

So it’s not a case of trusting Apple (which would be stupid enough) but trusting Apple and every government that can affect Apple’s share price.

The most stupid thing imaginable. How could you possibly do worse?
 
OK, thanks, I will. Please post a link from a credible source that states that's been happening since 2019.
Here is Apple’s own privacy policy from May 10th, 2019. The day before, they didn’t have the “scanning part” in it.


“[…]We may also use your personal information for account and network security purposes, including in order to protect our services for the benefit of all our users, and pre-screening or scanning uploaded content for potentially illegal content, including child sexual exploitation material.[…]”
 
  • Like
Reactions: m.dricu
So it’s not a case of trusting Apple (which would be stupid enough) but trusting Apple and every government that can affect Apple’s share price.

The most stupid thing imaginable. How could you possibly do worse?
Not even sure what goes where anymore. :D
 
Here is Apple’s own privacy policy from May 10th, 2019. The day before, they didn’t have the “scanning part” in it.


“[…]We may also use your personal information for account and network security purposes, including in order to protect our services for the benefit of all our users, and pre-screening or scanning uploaded content for potentially illegal content, including child sexual exploitation material.[…]”
Here is another that is now highly questionable:
"Apple is committed to protecting the security and privacy of our users. ......"

Or this advert for the iPhone specifically citing privacy.
 
  • Like
Reactions: 09872738
iCloud Photos are turned on by default whenever you sign on to iCloud.

With FaceID, it works on-device in the Secure Enclave. Nothing is being transmitted to Apple, even if you have iCloud. Obviously that's more trustworthy.

If Apple announced tomorrow that they will be matching your FaceID data with known criminals, I'm sure people would be shocked as well and up in arms.
Why should the Chinese still buy expensive iPhones now?
I will be watching the stock market closely. Of course, analysts need time first, so I am still calm these days.
 
Why should the Chinese still buy expensive iPhones now?
I will be watching the stock market closely. Of course, analysts need time first, so I am still calm these days.

Apple Inc. is the technology department of the CCP. No thinking Chinese person has been buying Apple for years, and certainly has no reason to change their mind now.
 
Like I said, Apple did number 2 so that, in the midst of this worldwide mass-scanning deployment, they have plausible deniability. The concern is: why did they decide to design such a complex system in the first place? If they simply didn't want CP on their servers, E2E iCloud encryption would solve it just fine. Apple wouldn't know what users are uploading; done deal.

Therefore, by Apple employing E2E encryption on their servers, people could upload massive amounts of CP to their heart's content. And that would be a-ok because Apple wouldn't know about it?
 