How would WhatsApp, a company which doesn’t even make phones, even go about doing such a thing? They scan for images on their servers because that is the only thing they control.

Apple controls the hardware and the software, so of course they would be able to accomplish a particular task in a manner that other companies can’t.

Each is simply leveraging its own area of expertise to tackle an existing problem.
It could be added to their app’s code to scan photos. I would imagine the application is granted access to the photos on your device, since most people share them through it.
 
I think Apple has trade-in programmes as well. You may want to try your luck at the nearest Apple store.
The problem is I do not want to trade it in for another Apple device or an Apple gift card, since I do not want their products if they think it is okay to install surveillance software on their devices. Not to mention this seriously alters what they advertised, and the terms of service under which I originally bought my device. I seriously suspect Apple is underestimating the legal issues they are going to face over this.
 
How would WhatsApp, a company which doesn’t even make phones, even go about doing such a thing? They scan for images on their servers because that is the only thing they control.

Apple controls the hardware and the software, so of course they would be able to accomplish a particular task in a manner that other companies can’t.

Each is simply leveraging its own area of expertise to tackle an existing problem.
I imagine they'd share the API which gives access to the hashes as they are created.
 
It’s not, as long as the legal guardian gives consent.
You might want to actually read Snapchat’s terms of use. The very first line says “No one under 13 is allowed to create an account or use the Services.” Nowhere does it say anything about legal guardian consent. Parents need to be held responsible for this too.
 
The problem is I do not want to trade it in for another Apple device or an Apple gift card, since I do not want their products if they think it is okay to install surveillance software on their devices. Not to mention this seriously alters what they advertised, and the terms of service under which I originally bought my device. I seriously suspect Apple is underestimating the legal issues they are going to face over this.

An argument can be made that this is Apple looking out for the privacy and security of their users, as evidenced by the way they scan users’ devices for sensitive content without human intervention.

In short, Apple has come up with a way to identify CSAM images uploaded to one’s iCloud Photos without actually having to probe their photo libraries in a way that violates privacy.

The more I read and reread Apple’s statements, and as I continue to follow these threads, the more I am convinced that Apple has developed an effective CSAM detection mechanism, and has sufficient safeguards built in to deter bad actors.

So far, it passes the smell test for me.
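One of the safeguards in Apple’s public summary is a match-count threshold: no account is reviewed until multiple images match. A rough sketch of why a threshold matters, using purely hypothetical numbers (the per-image false-match rate and the threshold of 30 below are illustrative assumptions, not Apple’s published figures):

```python
import math

def poisson_tail(lam: float, t: int, terms: int = 200) -> float:
    """P(X >= t) for X ~ Poisson(lam), summed term by term starting
    at k = t, to avoid the catastrophic cancellation of 1 - CDF."""
    # log of P(X = t), computed in log-space to avoid underflow
    log_pt = -lam + t * math.log(lam) - math.lgamma(t + 1)
    pk = math.exp(log_pt)
    total = 0.0
    for k in range(t, t + terms):
        total += pk
        pk *= lam / (k + 1)  # recurrence: P(X = k+1) = P(X = k) * lam / (k + 1)
    return total

# Hypothetical numbers: a library of 100,000 photos and a per-image
# false-match rate of 1e-6 give an expected 0.1 false matches.
expected_false_matches = 100_000 * 1e-6
# Chance a completely innocent library ever crosses a 30-match threshold:
print(poisson_tail(expected_false_matches, 30))
```

Under these assumed rates, the tail probability is vanishingly small (on the order of 10⁻⁶³), which is the statistical intuition behind Apple’s “one in a trillion per year” claim, whatever their actual internal parameters are.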
 
An argument can be made that this is Apple looking out for the privacy and security of their users, as evidenced by the way they scan users’ devices for sensitive content without human intervention.

In short, Apple has come up with a way to identify CSAM images uploaded to one’s iCloud Photos without actually having to probe their photo libraries in a way that violates privacy.

The more I read and reread Apple’s statements, and as I continue to follow these threads, the more I am convinced that Apple has developed an effective CSAM detection mechanism, and has sufficient safeguards built in to deter bad actors.

So far, it passes the smell test for me.
All companies are going to make self-serving statements to avoid liability. We will have to wait for this to work its way through the courts.
 
Is it?
Personally, if I had a kid under the age of 13, and I had that restriction of sending or receiving explicit photos turned on, I would want it to be universal across all apps, first party and third party.
I wouldn’t want my kid sneaking onto Snapchat to send stuff or receive stuff they couldn’t send or receive on iMessage.
Just don't give them a phone.
 
Well, the Swamp has noticed that way too many people have awakened to the master plan of turning the West into the third world, and the slow-boil frog strategy is not as efficient anymore. They don't even try to hide it anymore. On the financial side, you see what PayPal and the ADL are doing, blocking financial services to anyone criticizing the government. At almost the same time, Apple does this. On top of that there is the forced vax pass, where they want to force the uncomplying out of normal life, for a pandemic supposedly so bad that you mostly need to hear about it on TV.

The state must declare the child to be the most precious treasure of the people. As long as the government is perceived as working for the benefit of the children, the people will happily endure almost any curtailment of liberty and almost any deprivation.
Adolf Hitler



And by the way, the CSAM database is probably created by hashing Tim Cook's personal photo library.
 
An argument can be made that this is Apple looking out for the privacy and security of their users, as evidenced by the way they scan users’ devices for sensitive content without human intervention.

In short, Apple has come up with a way to identify CSAM images uploaded to one’s iCloud Photos without actually having to probe their photo libraries in a way that violates privacy.

The more I read and reread Apple’s statements, and as I continue to follow these threads, the more I am convinced that Apple has developed an effective CSAM detection mechanism, and has sufficient safeguards built in to deter bad actors.

So far, it passes the smell test for me.
So, for how long have you been working for Apple?
 
If one of those 3rd party apps happens to be from a developer that's 47% owned by a Chinese tech company... which is controlled by a Chinese state corporation with direct ties to the Chinese communist party... well... they'll never use this as a backdoor into scanning iPhones of U.S. users to detect unflattering images of Chinese communist leaders... right?
You're kidding, right? How did you come to that conclusion? I really hate statements like this based on nothing, no facts. My workplace is full of people like this, and I don't know how they do it all the time.
 
If one of those 3rd party apps happens to be from a developer that's 47% owned by a Chinese tech company... which is controlled by a Chinese state corporation with direct ties to the Chinese communist party... well... they'll never use this as a backdoor into scanning iPhones of U.S. users to detect unflattering images of Chinese communist leaders... right?

If the Chinese government were able to infiltrate NCMEC, include their own Winnie the Pooh photos to be flagged as child pornography, and have all this go undetected for any period of time, the US government would have bigger problems on its hands.

That’s simply not how this feature works.
 
Apple is using the database provided by NCMEC, plus they are the ones compiling and submitting the reports to said organisation, so other organisations would simply not be able to get around Apple to spy on users.

Apple has also said that CSAM detection can’t be used to target a specific individual, since the same set of hashes is stored in the OS of every iOS device. If you are worried that the government of another country may try to force Apple to build a new detection system to identify other kinds of content, I don’t think there is anything Apple can say which would alleviate your concern.

It ultimately comes down to consumer trust. Which is why Apple is being upfront and truthful with customers right here, right now. And that is why Apple still has my trust for the moment.

I imagine any entity willing to do the extent of adding a completely unrelated photo to this database without detection (especially a foreign government), would already possess the means to infiltrate your smartphone using a myriad of other alternatives. For example, China already possesses easier and more practical means of doing so via Wechat.

So it still comes down to “it’s technically possible, but very unlikely” territory.

When I was a kid, no one snuck into my house daily and scanned my family's photo albums and for me, that's the issue.

These companies designed a product (and services) that the entire globe now thinks it can't live without (it totally can) and then trojan-horsed surveillance methods into them, little by little, inch by inch, so as not to rock the boat or draw too much attention. But now, with everyone believing that life without a smartphone is an impossibility, they're starting to go full-bore and flat out announcing it.

It's like, how far do these people need to go? Is there even a breaking point anymore? I'm not so sure there is. And to me--that's frightening.
 
When I was a kid, no one snuck into my house daily and scanned my family's photo albums and for me, that's the issue.

These companies designed a product (and services) that the entire globe now thinks it can't live without (it totally can) and then trojan-horsed surveillance methods into them, little by little, inch by inch, so as not to rock the boat or draw too much attention. But now, with everyone believing that life without a smartphone is an impossibility, they're starting to go full-bore and flat out announcing it.

It's like, how far do these people need to go? Is there even a breaking point anymore? I'm not so sure there is. And to me--that's frightening.

Nobody at Apple is looking through the photos on your iPhone either. That’s what this technology is all about. It’s happening on your phone, and it stays on your phone.
 
I am so lucky.

I have an old iPhone. I was planning to expand my investment in Apple by buying an iPad, MacBook, Apple Watch and a new iPhone. Lucky for me, I have not done that yet. Now I can save a few thousand dollars. If I had gone too deep into Apple, I would have regretted it. It is easy for me to walk away now because I just have an old iPhone, and it will be my last.
 
I agree with you 100%. However, Apple is not obligated to scan our iPhones. There has to be an alternative way to fight child pornography.

Privacy is being exposed at its fullest, especially now that third parties will be involved. Imagine if the information or photos get leaked by a third party? Who's held responsible for that?
Apple doesn't care about children. There's something sketchier.
 
I am so lucky.

I have an old iPhone. I was planning to expand my investment in Apple by buying an iPad, MacBook, Apple Watch and a new iPhone. Lucky for me, I have not done that yet. Now I can save a few thousand dollars. If I had gone too deep into Apple, I would have regretted it. It is easy for me to walk away now because I just have an old iPhone, and it will be my last.
Don't. I regret it.
 
So... by third party, you mean the government? Right Apple?
Ugh... the worst part of this is you know the other big tech companies aren't far behind in doing the same stuff Apple is doing... How could they not? "Unlike Apple, we don't scan your photos." "You mean, you don't scan for child abuse?" "Oh... uh... *jumps on the Apple bandwagon*" Makes me wonder how doxxing will evolve now. If you just hate a person, get a burner phone and spam the heck out of them with that stuff? Would it trigger the detection on their device? I hope Apple thought of that.
Third-party apps: this applies to hash detection. Like many here, you don't understand how a hash scan against a database works. Others like Google have already been doing this since 2014-2015.
It is automated hash matching; it is mathematics, not a scan. Please read up on it and stop spreading disinformation.
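A minimal sketch of what database hash matching means in practice. Real systems (Apple's NeuralHash, Microsoft's PhotoDNA) use perceptual hashes that tolerate resizing and re-encoding; SHA-256 below is only a stand-in to show that the "scan" is a set-membership lookup on fingerprints, not an inspection of image content:

```python
import hashlib

# Hypothetical database of fingerprints of known flagged images.
# In a real deployment these would be perceptual hashes supplied
# by an organisation like NCMEC, not cryptographic digests.
known_hashes = {
    hashlib.sha256(b"flagged-image-bytes").hexdigest(),
}

def matches_database(image_bytes: bytes) -> bool:
    """True iff the image's fingerprint appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_database(b"flagged-image-bytes"))   # True
print(matches_database(b"family-holiday-photo"))  # False
```

The point of the illustration: the matching code never "looks at" the photo; it only compares a fixed-length fingerprint against a pre-supplied list.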
 
  • Like
Reactions: slineaudi
Apple is using the database provided by NCMEC, plus they are the ones compiling and submitting the reports to said organisation, so other organisations would simply not be able to get around Apple to spy on users.

Apple has also said that CSAM detection can’t be used to target a specific individual, since the same set of hashes is stored in the OS of every iOS device. If you are worried that the government of another country may try to force Apple to build a new detection system to identify other kinds of content, I don’t think there is anything Apple can say which would alleviate your concern.

It ultimately comes down to consumer trust. Which is why Apple is being upfront and truthful with customers right here, right now. And that is why Apple still has my trust for the moment.

I imagine any entity willing to do the extent of adding a completely unrelated photo to this database without detection (especially a foreign government), would already possess the means to infiltrate your smartphone using a myriad of other alternatives. For example, China already possesses easier and more practical means of doing so via Wechat.

So it still comes down to “it’s technically possible, but very unlikely” territory.
I agree with you - from what we know about the implementation, if it is correctly implemented, any abuse (e.g. by a government) is extremely difficult to stage. All the scenarios I can think of involve at least infiltration of Apple, sophisticated cryptographic attacks (e.g. birthday attacks to create two pictures with identical hashes), and a lot of criminal energy. There are easier ways to spy on people.

Now the « slippery slope » argument is different - nobody knows how this can evolve.
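The birthday-attack cost mentioned above can be sketched with the classic birthday bound. The 96-bit hash width below is an assumption for illustration, not NeuralHash's documented output size:

```python
import math

def birthday_collision_probability(num_samples: int, hash_bits: int) -> float:
    """Approximate chance that any two of num_samples uniformly random
    hash_bits-bit hashes collide (the classic birthday bound)."""
    space = 2.0 ** hash_bits
    return 1.0 - math.exp(-num_samples * (num_samples - 1) / (2.0 * space))

# For a hypothetical 96-bit hash, even odds of *some* collision among
# random images require on the order of 2**48 candidates:
print(birthday_collision_probability(2**48, 96))  # ~0.39
```

And even after generating such a pair, an attacker still has to plant one image in the vetted database and the other on the target's phone, which is why the post above calls for infiltration of Apple on top of the cryptographic attack.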
 
Third-party apps: this applies to hash detection. Like many here, you don't understand how a hash scan against a database works. Others like Google have already been doing this since 2014-2015.
It is automated hash matching; it is mathematics, not a scan. Please read up on it and stop spreading disinformation.
"The knife attacks in China are ok, because there are school shootings in the USA" kind of logic.
 
I am so lucky.

I have an old iPhone. I was planning to expand my investment in Apple by buying an iPad, MacBook, Apple Watch and a new iPhone. Lucky for me, I have not done that yet. Now I can save a few thousand dollars. If I had gone too deep into Apple, I would have regretted it. It is easy for me to walk away now because I just have an old iPhone, and it will be my last.

If you want those devices, get them. You can't be scared based on what a company may do with your information. Hell, nothing is secure anymore with any app we use, never mind devices. The best way to avoid being tracked is not using any phone at all.
 
Too Big Brother for me. It seems with each passing day, Apple gets closer to the 1984 dystopia they railed against in their own ad. What the heck happened, Apple? How did you allow yourself to get here?

Protecting children is a noble goal, but once you write the architecture into the software, it WILL be exploited and misused at some point. And now that governments are aware of the capability, what will Apple do if countries start demanding other uses for the technology beyond child safety? Will they comply “because it’s the law”? Or will they leave a market like China, where they have their products assembled?
 