Yes, the "I have nothing to hide, so it's all good" argument…
If there's no CSAM on your device, then how would Apple ever see any of your images? I don't feel bad for anyone that has that kind of stuff in their possession. This fact makes this system 100% private to those that don't own CSAM.
 
This fact makes this system 100% private to those that don't own CSAM.
^
this

It’s literally impossible to escalate to human review if you don’t have “X” CSAM pics.

Currently X=30, but Apple said they're committed to tweaking "X" in order to always keep the chance of being erroneously flagged at 1 in 1 trillion.
 
^
this

It’s literally impossible to escalate to human review if you don’t have “X” CSAM pics.

Currently X=30, but Apple said they're committed to tweaking "X" in order to always keep the chance of being erroneously flagged at 1 in 1 trillion.
They also said they would never make it less than 30, but may end up adjusting it to be more.
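For a sense of how a threshold like X relates to a "1 in 1 trillion" target, here is a rough back-of-the-envelope sketch in Python. The per-image false-positive rate and library size are made-up illustrative numbers, not Apple's figures, and treating every photo as an independent trial is a simplification.

```python
from math import lgamma, log, exp

def log_binom_pmf(n: int, k: int, p: float) -> float:
    """log of C(n, k) * p^k * (1-p)^(n-k), via lgamma for numerical stability."""
    log_comb = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
    return log_comb + k * log(p) + (n - k) * log(1 - p)

def account_flag_probability(n_photos: int, fp_rate: float, threshold: int) -> float:
    """P(at least `threshold` false matches among n_photos independent photos)."""
    total = 0.0
    for k in range(threshold, n_photos + 1):
        term = exp(log_binom_pmf(n_photos, k, fp_rate))
        total += term
        if term < total * 1e-17:      # remaining terms no longer affect the sum
            break
    return total

def smallest_threshold(n_photos: int, fp_rate: float, target: float = 1e-12) -> int:
    """Smallest X such that the account-level false-flag probability <= target."""
    x = 1
    while account_flag_probability(n_photos, fp_rate, x) > target:
        x += 1
    return x

# Assumed inputs, for illustration only (not Apple's published numbers):
print(smallest_threshold(n_photos=100_000, fp_rate=1e-6))
```

Under those made-up inputs the required threshold comes out well below 30, which is consistent with the claim that 30 is a deliberately conservative choice.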
 
Except it's not ridiculous. It's the truth. You do realize I said "hundreds AND thousands", not "hundreds OF thousands", right? In other words, most people downloading and sharing pornographic images probably have at least 100, and many have 1000 or more.

I say again: lol, wut

First of all, please show me any data showing that "People who are into pornography of any type are usually addicted to it." That's a legitimately ridiculous thing to say.

Second of all, who downloads porn? Let alone images? How old are you?
 
As a side note, today Apple stock closed at an all-time high of $151… looks like investors are failing to see what in some circles has been spoken of as the greatest privacy scandal in tech and an existential threat to Apple's business… oops.
Forum says it’s a big summer drama.
Market says it’s a nothingburger.
Choose wisely.
 
I say again: lol, wut

First of all, please show me any data showing that "People who are into pornography of any type are usually addicted to it." That's a legitimately ridiculous thing to say.

Second of all, who downloads porn? Let alone images? How old are you?

Google it. Lots of info on pornography addiction. I'm starting to wonder if we live on the same planet. If you don't agree with "addiction" then you can at least surely agree that most people who download porn aren't downloading less than 30 images and then never download any more. Also, isn't this whole conversation about downloaded porn that people are uploading to iCloud? 🤦‍♂️
 
While I'm "not happy" about this scanning, I'm also not throwing away my Apple gear and rushing to get Android (or Linux). I don't buy all the slippery slope arguments, although it is tempting to dive into the whataboutisms those arguments represent. I'll see how this shakes out before deciding whether I buy a Galaxy Note or an iPhone 13.

However, I'm wondering: it's 2021, do people actually still ask for recommendations on whether they should get an iPhone or a Galaxy, an iPad or a Tab, a Mac or Windows? I used to be the go-to person for that kind of advice, and it's been 5 years since anyone has asked me "which gear is best for me?"
Ask yourself the following questions:
If Apple suggests NeuralHash is less invasive than other companies' surveillance, ask yourself why Apple isn't putting the check on its own servers like other companies do, since that would still convey the improved 'privacy' they claim NeuralHash confers.

Instead, why would they require 1,000,000,000+ users to download software from their servers, the same number of users to install that software on their own hardware, with electricity and processing costs for both parties, rather than Apple just putting one piece of software on its own servers?

They belatedly tried to sell the benefits of NeuralHash, but it would convey the same privacy 'benefits' on their servers as on your devices.

The fact that they are doing this on YOUR hardware when they don't have to demonstrates, in my opinion, that it will end up being far more intrusive. It has that access because it bypasses System Integrity Protection on Apple devices, and they are not going the route of using YOUR hardware for no reason, especially as the costs of that are far in excess of having it on their own servers.
 
But what was stopping Apple from doing 'surveillance' before now? They already had the capability to detect all sorts of document contents years ago, and their capabilities have only grown more advanced since. Why start now?

What is the benefit to Apple of implementing the CSAM feature, other than being compliant with US laws? This is effort spent that absolutely does not help them sell more devices.

From Apple's perspective, Apple has the power to push back against government agencies if their requests are not compliant with local laws, at least in the US, right? So what is Apple's reason to work with the government to snoop on the users of the devices they sell? So that they can get more tax breaks? It doesn't make any sense to me.
Theoretically it would make sense if they were pressured by various governments threatening to rule against them in the Apps cases. We know they've been asked to create backdoors previously and have refused, apart from perhaps Apple products bound for China, where they have acceded to certain requests.

The EU and many other countries are currently 'investigating' Apple's hold over app sales, threatening to open it up, and no doubt threatening very large fines...
 
I totally agree with you.

Some people are saying this also provides NSA backdoor access, apparently via the FISA Court, without explaining how.

That doesn't make any sense to me. Smells more like an "anything in the world is potentially possible" sort of thing. What are your thoughts?
Sorry, but it's complete rubbish to say it's safer on the users' equipment! Of course it's not; it creates the potential for modification and access to over 1,000,000,000 users' unique identifiers, plus anything else the software might be modified for.

On the server it can only serve the general function, at present NeuralHash, which, as Apple suggests, is designed to be anonymous anyway.

It doesn't even make financial or ecological sense to have it on 1,000,000,000+ devices, as that means that number of downloads, that much processing time, electricity, etc., whereas installing it at the cloud end requires one piece of software.

It would be the same NeuralHash tools, so anyone suggesting it conveys more privacy on your own device just doesn't understand Apple software, nor the fact that System Integrity Protection stops other companies from changing system files, whereas Apple, as the author, bypasses that.

When the costs of putting it on individual hardware are greater, and they are, then it's there for a reason, and I do not believe it should be on users' hardware.
 
Unless the photos are in the database there will be no match. In other words, if you make a child porn video or a series of photos and store those photos on your phone, and they are uploaded to iCloud, they will pass through just fine.

The system does nothing about NEW child porn/CSAM images, only matching known images stored in the CSAM database; in fact an image must be in at least 2 source databases to qualify for inclusion.

The married couple presumably would be in no databases at all.
So Apple's Manual Review process will not make judgement calls on the subject that had a false positive? It will either match existing pictures or not?
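To illustrate the matching rule described in the quoted post, that a photo can only ever match hashes already vouched for by at least two source databases, here is a minimal sketch. The hash strings and "agency" sets are placeholders, and the real system uses blinded NeuralHash digests and a private set intersection protocol rather than plain set membership.

```python
# Toy sketch of "match only against known hashes present in two databases".
# All values are made-up placeholders; the real system uses perceptual
# (NeuralHash) digests and a blinded, encrypted on-device database.

database_a = {"a1f3", "9c2e", "77b0", "d41d"}   # hashes from agency A (hypothetical)
database_b = {"9c2e", "d41d", "0b55"}           # hashes from agency B (hypothetical)

# Only hashes vouched for by BOTH sources go into the shipped match set.
on_device_set = database_a & database_b          # {"9c2e", "d41d"}

def matches_known_csam(photo_hash: str) -> bool:
    """A photo can only count toward the threshold if its hash is already in the set."""
    return photo_hash in on_device_set

print(matches_known_csam("9c2e"))   # True:  a known database image
print(matches_known_csam("e9a7"))   # False: a brand-new photo can never match
```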
 
If there's no CSAM on your device, then how would Apple ever see any of your images? I don't feel bad for anyone that has that kind of stuff in their possession. This fact makes this system 100% private to those that don't own CSAM.
Many have been duped into thinking that is what this is about; it has zero percent to do with why people are upset about this.
 
Ask yourself the following questions:
If Apple suggests NeuralHash is less invasive than other companies' surveillance, ask yourself why Apple isn't putting the check on its own servers like other companies do, since that would still convey the improved 'privacy' they claim NeuralHash confers.

Because the whole point is to not decrypt innocent users' pics on-server, and what you just described is exactly what makes the system “less invasive”? 🤔
 
Many have been duped into thinking that is what this is about; it has zero percent to do with why people are upset about this.
People are upset about a number of different things and some people are not even sure about what they’re upset about. Other people are being unreasonable or unfair to Apple compared to what the rest of the industry does routinely. It’s a multifaceted debate. Stop with this “we’re being gaslighted into thinking this is about X or Y”. I’ve read enough comments sections in the last few days to know people have been upset about a lot of different things, some without merit, some with some merit.
 
So Apple's Manual Review process will not make judgement calls on the subject that had a false positive? It will either match existing pictures or not?
“A” false positive can’t escalate to manual review.
Need 30 at least.
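Apple's technical summary describes this gate in terms of threshold secret sharing: below the threshold, the server cannot reconstruct the key needed to decrypt any of the safety vouchers. Below is a toy Shamir-style sketch of that principle in Python; it illustrates the math of "nothing is recoverable until you hold 30 shares" and is not Apple's actual voucher construction.

```python
import random

# Toy Shamir threshold secret sharing: any THRESHOLD shares reconstruct the
# secret, but fewer reveal essentially nothing.  Illustration only.

PRIME = 2**61 - 1          # a convenient prime field for modular arithmetic
THRESHOLD = 30             # mirrors the "need at least 30 matches" gate

def make_shares(secret: int, n_shares: int, t: int = THRESHOLD):
    """Split `secret` into points on a random polynomial of degree t-1."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares) -> int:
    """Lagrange interpolation at x = 0; only meaningful with >= THRESHOLD shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

account_key = 123456789                      # stand-in for a decryption key
shares = make_shares(account_key, n_shares=40)

print(reconstruct(shares[:30]) == account_key)   # True: 30 shares recover the key
print(reconstruct(shares[:29]) == account_key)   # False (overwhelmingly likely): 29 do not
```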
 
So you would rather have it on iCloud servers and completely in the dark? Having no knowledge of what is being scanned or compared against, no knowledge of what kinds of images are being scanned?

Also, calling this a backdoor is really an inaccurate description of what is happening.

Apple is placing a database on our phones in full view of the users and the authorities. It is completely unable to be modified or changed, since the hash is provided for everyone to see, and we know that Apple obtained the images from 2 different databases, so we know it is authentic CSAM and not any other kind of material.

It is figuratively sitting in the glare of a large spotlight... to call this a backdoor is really entirely inaccurate.
Of course it should be on the iCloud servers, but I'm sorry, your reply demonstrates you don't know how Apple software works. Apple suggests it is the NeuralHash tools that make it more private, not the fact that it's on YOUR hardware; that cannot be more private, because your system has unique identifiers that can be accessed by anyone exempt from System Integrity Protection (for some reason predictive text keeps changing Integrity to Information). Apple, as the software originator, bypasses that protection, allowing modification.

If it's by virtue of NeuralHash that this is considered less intrusive, that's not because it's on your hardware; as Apple suggests, NeuralHash would operate the same on iCloud... EXCEPT that the software is on your machine all the time, even if you don't use iCloud, there to be modified at any time of Apple's choosing.

On iCloud there isn't the overhead of 1,000,000,000+ software downloads and the same number of users installing them, so no increased server traffic, increased processing use, increased electricity, or time lost to users installing it...

If you are uploading pictures to iCloud, then the only check via the NeuralHash tools would be on iCloud, where it should be, because then you have a choice: you don't lose processing power, and you don't pay for the electricity or the increased battery use that you would on your own system, IRRESPECTIVE of whether you intend to use iCloud.

Being on every user's hardware in no way makes for a safer system; it's the opposite. It is far more dangerous, so much so that none other than Apple's own employees have kicked off about it being on users' hardware and the serious implications of that.

Multiple media outlets have pointed out that it's far more dangerous installed on users' equipment, and I wonder how long it will be before a class action for slowing down users' equipment, rather like the iPhone throttling class action. The impact might be minute, but when it may scan ALL your photos, irrespective of whether you intend to upload them to iCloud, it also has the potential for much more serious modifications. If you are not using iCloud, why should you have software on your system checking photos, or anything else, whether it's intended to go to the cloud or not? And again, Apple's exemption from System Integrity Protection makes it so easy for such tools to be modified to target virtually anything.

If Apple are singing the praises of NeuralHash, then it will work no differently on iCloud should they choose to put it there, which is where they should put it. And if you believe iCloud too will access data outside the remit of what Apple are claiming, then think how much more damage it could do installed on your system, with the capability of being modified on an individual user basis.
 
Isn't there tons of free porn on p0rnhub? I'm sure they rake in plenty of ad revenue. You can google "porn addiction" and find tons of info. Seems like a lot of you are making an issue of the word "addiction." I thought this was common knowledge, but apparently not (and of course many addicts can't admit they're addicts). Let me put it this way. If someone is into porn enough to download videos or images of it to their computer, do you think they have less than 30 videos/images or 30 or more? That's all I'm saying - that Apple's 30 image threshold is WAY conservative and is simply a safeguard to dramatically decrease the chances of accounts being falsely flagged.
Let's just be real here. You can get addicted to pretty much anything. Porn addiction is like alcohol addiction, or video game addiction.

It also depends on timing. Does someone have 30 (or even 100, for example) different items downloaded, but acquired over the course of 10-20 years? Internet in this country (US) is generally, on average, still very bad. Even though I had 500 Mb/s internet, I still downloaded everything from iTunes/Apple TV for offline viewing because it was better that way. Even when I had hundreds to a thousand dollars of high quality routers, modems and infrastructure in my house, streaming on Netflix or iTunes or even YouTube was a pain most of the time. I have Gigabit internet now, which has improved things dramatically.

I am not saying people can't be addicted. But people can be addicted to pretty much anything, and it's an equally serious problem. If it impacts your life and your work, then it's a problem. I know a couple of people who are in SEVERE debt due to their gambling addiction. I know some people who almost lost their jobs due to their video game addiction because they were sneaking off to play their games while at work.

I mean, I have a 24 TB NAS with about 8 TB of movies/games/TV shows downloaded from iTunes, Steam and other sources. Because I had to deal with crappy internet for the longest time, it was beneficial to keep this stuff downloaded. Plus it saves on the monthly bandwidth!
 
I think every major tech company has already been scanning pictures. For years. Providers don’t want to be liable for hosting illegal content.

I wonder how many people actually use iCloud for illegal content like child porn. They must be incredibly stupid.
Yes, and NONE of the others are doing it via software installed on the USER'S own hardware, which is why so many media organisations and tech companies are kicking off about that aspect of it, and even Apple employees have emailed their concerns about it being installed on our hardware.
 
People are upset about a number of different things and some people are not even sure about what they’re upset about. Other people are being unreasonable or unfair to Apple compared to what the rest of the industry does routinely. It’s a multifaceted debate. Stop with this “we’re being gaslighted into thinking this is about X or Y”. I’ve read enough comments sections in the last few days to know people have been upset about a lot of different things, some without merit, some with some merit.
….. just to be clear, there is only ONE issue: Apple is installing software on YOUR hardware to scan YOUR data. It's not about anything else….. Anyone who says otherwise is missing the big picture. It's 100% different from anything else being done in the industry. The closest breach would probably be the Alexa thing, though that's not as serious a breach as what Apple is doing; both are invasions of privacy, though.
 
Apple tested their system against 500,000 porn images with zero false positives. The probability that anyone will have on the order of 30 matches seems pretty low.

Also, Apple knows things about them. If the credit card used for their account is in their own name, they're probably over 18. You provide a birthday to Apple with your Apple ID, and no young person lies about their age downwards.
Let's assume for a moment that one of my pictures of me standing in a hallway gets falsely flagged. Would another picture, almost identical but taken 1 second later, also be flagged? Maybe my eyes were closed and they asked me to re-take the picture.

Just curious. I know a lot of people like to take "burst" pictures, so they end up with a lot of pictures in a short time. Could it flag all of those as false positives?
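For rough scale, the "500,000 test images, zero false positives" figure quoted above can be turned into a conservative upper bound on the per-image false-positive rate and checked against the 30-match threshold. A minimal sketch, where the photo-library size is an assumption and the independence assumption is exactly what burst shots would violate:

```python
from math import exp, factorial

TESTED_IMAGES = 500_000
# "Rule of three": with 0 hits in N independent trials, a ~95% upper bound
# on the underlying per-image rate is roughly 3 / N.
fp_rate_bound = 3 / TESTED_IMAGES                  # about 6e-6 per image

LIBRARY_SIZE = 20_000      # assumed personal photo library, not an Apple figure
THRESHOLD = 30

# Poisson approximation to the binomial tail: P(at least 30 false matches).
lam = LIBRARY_SIZE * fp_rate_bound                 # expected false matches, ~0.12
prob = sum(exp(-lam) * lam**k / factorial(k)
           for k in range(THRESHOLD, THRESHOLD + 40))
print(f"P(>= {THRESHOLD} false matches) is roughly {prob:.1e}")

# Caveat: this treats photos as independent trials.  A burst of near-identical
# frames is not independent; if one frame falsely collided with a database
# hash, a visually near-identical frame plausibly would too.
```

On this model the account-level number is astronomically small, but the burst-photo question is a fair one, because correlated near-duplicates are the main way the independence assumption breaks down.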
 
Because the whole point is to not decrypt innocent users' pics on-server, and what you just described is exactly what makes the system “less invasive”? 🤔
They already decrypt users' pics on-server all the time, for example when you view them in a web browser on icloud.com, or to hand them over to law enforcement when legally required. It would be another story if they used E2E encryption, but they don't. As it is, scanning on the device isn't any more private, but opens the door to an entirely new level of surveillance.
 
Yes, and NONE of the others are doing it via software installed on the USER'S own hardware, which is why so many media organisations and tech companies are kicking off about that aspect of it, and even Apple employees have emailed their concerns about it being installed on our hardware.
Touchdown, that’s exactly the issue…. Kinda compels you to correct people who are not getting this…. Nobody cares if they scan your photos on iCloud… the issue is they are entering your home to scan them…. Huuuge difference
 
Of course it should be on the iCloud servers, but I'm sorry, your reply demonstrates you don't know how Apple software works. Apple suggests it is the NeuralHash tools that make it more private, not the fact that it's on YOUR hardware; that cannot be more private, because your system has unique identifiers that can be accessed by anyone exempt from System Integrity Protection (for some reason predictive text keeps changing Integrity to Information). Apple, as the software originator, bypasses that protection, allowing modification.

If it's by virtue of NeuralHash that this is considered less intrusive, that's not because it's on your hardware; as Apple suggests, NeuralHash would operate the same on iCloud... EXCEPT that the software is on your machine all the time, even if you don't use iCloud, there to be modified at any time of Apple's choosing.

On iCloud there isn't the overhead of 1,000,000,000+ software downloads and the same number of users installing them, so no increased server traffic, increased processing use, increased electricity, or time lost to users installing it...

If you are uploading pictures to iCloud, then the only check via the NeuralHash tools would be on iCloud, where it should be, because then you have a choice: you don't lose processing power, and you don't pay for the electricity or the increased battery use that you would on your own system, IRRESPECTIVE of whether you intend to use iCloud.

Being on every user's hardware in no way makes for a safer system; it's the opposite. It is far more dangerous, so much so that none other than Apple's own employees have kicked off about it being on users' hardware and the serious implications of that.

Multiple media outlets have pointed out that it's far more dangerous installed on users' equipment, and I wonder how long it will be before a class action for slowing down users' equipment, rather like the iPhone throttling class action. The impact might be minute, but when it may scan ALL your photos, irrespective of whether you intend to upload them to iCloud, it also has the potential for much more serious modifications. If you are not using iCloud, why should you have software on your system checking photos, or anything else, whether it's intended to go to the cloud or not? And again, Apple's exemption from System Integrity Protection makes it so easy for such tools to be modified to target virtually anything.

If Apple are singing the praises of NeuralHash, then it will work no differently on iCloud should they choose to put it there, which is where they should put it. And if you believe iCloud too will access data outside the remit of what Apple are claiming, then think how much more damage it could do installed on your system, with the capability of being modified on an individual user basis.

How is it different from Spotlight search indexing my data?
Why can’t that be fan-fictioned into a mass surveillance tool?
Are you really making a fuss about the software being downloaded to 1 billion devices as part of an iOS update? Are you really making a fuss about the battery life implications? Have you read the license terms of the software and operating systems you use? Do you audit your closed-source software at every update? How do you know they're not doing something nefarious by virtue of being on your device?
 
So Apple's Manual Review process will not make judgement calls on the subject that had a false positive? It will either match existing pictures or not?
Well, Apple won't see the images at all until there is a hash match, and at least 30 matches.

At that point a person from Apple will review the images.

But it should never get to that point at all.

To your point: yes, it either is an image in the database or it isn't. There should be no "judgement" on Apple's part, since the CSAM hashes are designed to stay fixed even when an image is altered.

So there is no way any of your pictures, even your nude baby-in-the-bath or romping-on-the-lawn-in-the-sprinkler-naked pictures, will get flagged, because they cannot produce a hash that will match the CSAM material.
 
It's in the technical document.

"Visually similar images" means images that were modified but have the same content. They gave an example of an RGB photo converted to a black-and-white photo; the NeuralHash would still produce a match. However, two photos taken of the same object from different angles are different content and would therefore produce a different NeuralHash.
See, this is what I have always found a bit confusing. As a developer, I would definitely like to study the code and play around with what we will call a "black box". I would love to put an image in a mock CSAM database and adjust it in Photoshop to see what the threshold is. It's just developer curiosity, and it helps me understand how something works more than human words ever could. How can the same subject at a different angle not be a match? The edits must be VERY VERY SMALL for a match to occur at this point. If I took the Photoshop warp tool and modified part of the image slightly, would that be enough for the image not to match?
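For exactly that kind of experiment, a toy perceptual hash is enough to get a feel for the behaviour the quoted document describes. The sketch below uses a simple "average hash" with Pillow in Python; it is not Apple's NeuralHash, and the file names are placeholders. The expectation is that a black-and-white conversion of the same photo lands at a small Hamming distance, while a different shot of the same subject lands far away.

```python
from PIL import Image

# Toy "average hash" (aHash), a stand-in for experimenting with perceptual
# hashing.  This is NOT Apple's NeuralHash; file names are placeholders.

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale, then set one bit per pixel
    depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

original   = average_hash("holiday_photo.jpg")             # placeholder file
grayscale  = average_hash("holiday_photo_bw.jpg")          # same photo, desaturated
other_shot = average_hash("holiday_photo_other_angle.jpg") # same subject, new angle

print(hamming(original, grayscale))    # expect a small distance (visually similar)
print(hamming(original, other_shot))   # expect a large distance (different content)
```

Small edits tend to move an average hash only a little, which is the point of perceptual hashing, but a robust answer to "how far can I push it?" really does require the actual model, which is why it feels like a black box.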
 