Better times
And now we have this. Make no mistake, this is the start of where we will end up. Someone or something watching your phone for a dodgy photo is only the beginning. The devices can already track movements and exercise; how crazy would it be for it to be mandated that we have to exercise to meet certain quotas for insurance and work? What if the photo scanning was able to detect a burger, or detect two men standing next to each other, etc.?

 
Absolutely!
Instead we have Apple doing the bidding of law enforcement.

It's clear as day - as you've highlighted very nicely - what is really in play here.

All the rest of the "new releases" and "interviews" from Apple are just attempts to push some dirt over the heaping pile of poop here.
Are you just making this up as you go along? The bidding of third parties...doing this because someone told them to?

You must be a high level Apple employee and in the room when this was discussed.

OR...PERHAPS...Apple is 100% liable under US law for their servers hosting or transmitting child pornography and MUST report such activity. They are meeting this requirement by providing a more secure and private way of identifying it with this method, versus just scanning every single photo you have taken and uploaded to iCloud.

Every internet service is required to do this, either by scanning all photos or by forwarding user-submitted complaints/reports.

The fact that Apple has figured out a way to do this while still protecting their customers, versus a mass grab of every image you upload, is pretty impressive IMHO.
 
There’s no confusion. It’s spyware: software I can’t control, scanning systems I own, on data I own, that regardless of whether it’s a true positive or a false positive will be used to harm me. That’s literally spyware.

We could have done mandatory cryptographic hash matching of terrorists’ names, emails, and phone numbers on user systems after 9/11, but that’s incompatible with a free society. So is this. “Why” isn’t the issue, nor is the technical implementation. At its core, this is unethical software.

The problem is the simpletons who equate “spyware” with thinking there are actual humans combing through all of the data and pictures on your devices. That isn’t what’s happening here, obviously. And I have serious qualms with people who think it is. That’s my only beef with the term “spyware” and those who don’t know the nuanced nature of it.

Otherwise I’m staying out of this. Living in this country, in the year 2021, after all the crap that’s gone down here in the land of the (somewhat) free, I have no naive expectation of privacy and never have. That’s a quaint notion. Good luck with that. There are other battles I’d rather fight; you guys can have at this one.
 
No @svenning. I've done a bit of recreational work in this area and this is my knowledge...

The device could make a blueprint describing the image's shapes, lines, and colors. That is coded into a hash that is compared with the hashes of other images. Someone could change the pixels to a certain degree, but think of it this way... imagine a circle with a few pixels missing. Your mind can fill in the blank and still understand that it's a circle, right?

Image processing and machine learning work the same way. Computers have been trained to analyze shapes and patterns and fill in the altered or missing content to be extremely close to the original.

But I'm guessing that for privacy, the original photo hash is one-way and cannot be decrypted to restore or attempt to recreate the original photo.

But I have no idea what Apple is actually doing.
So really, the most honest answer is: we don't really know. The range of alterations one could apply to an image is endless. Will changing a pixel trick the system? Will mirroring the image trick the system? Will changing the hue slightly trick the system? Will cropping or resizing trick the system? Will re-compressing the image trick the system? Will inverting the colors trick the system? We don't know without knowing what Apple uses to hash and match the images.

There's no robust image matching algorithm that can't be tricked. Machine learning isn't magic either. If you don't want to be flooded by false positives, you have to accept that you'll lose some.

Also, I don't think the strategy here is to catch them all, but rather to catch the dumbest ones, who actually store that stuff on their iCloud. The assumption may be that somebody who is dumb enough to do that won't go through the trouble of making the next step any harder. So honestly, I don't expect Apple to have implemented anything remarkable.
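To make the "hash that survives small edits" idea concrete: this is NOT Apple's NeuralHash (which hasn't been fully published), just a toy "average hash" over a hand-made 4x4 grid of brightness values, showing why nudging a single pixel need not change the hash at all:

```python
# Toy "average hash" (aHash), one of the simplest perceptual hashes.
# Purely illustrative -- not Apple's NeuralHash.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255).
    Returns a bit string: 1 where a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x4 "image": bright square on a dark background.
img = [
    [10, 10, 10, 10],
    [10, 200, 200, 10],
    [10, 200, 200, 10],
    [10, 10, 10, 10],
]

# Nudge one pixel. The above/below-mean bit pattern is unchanged,
# so the perceptual hash is identical -- unlike a cryptographic hash,
# where any single-byte change scrambles the entire digest.
tweaked = [row[:] for row in img]
tweaked[1][1] = 190

print(average_hash(img) == average_hash(tweaked))             # True
print(hamming(average_hash(img), average_hash(tweaked)))      # 0
```

Of course, this toy hash is trivially defeated by mirroring or inverting colors, which is exactly why the questions above can't be answered without knowing the real algorithm.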
 
Matching is not scanning; the issue there is that 99% of users don't know what a hash is. It's not the photos, just numbers.
That is semantics. No one thinks an Apple employee is going to their house to manually review their phone. Apple created a software mechanism to categorize photos: “naughty” or “nice.” If you have too many “naughty” things, they call the cops. That is oppressive.
 
Why was this not brought up at WWDC?

If this is such a noble cause with such user privacy in mind when built...
Why was it completely omitted from discussion?

That alone is unbelievably worrying.

It feels like they want/wanted to slide this in real quick right before actual iOS15 and macOS Monterey releases and just sort of not talk much about it.
 
Maybe Apple should use or build intermediary servers whose whole purpose is to scan things meant to go up to iCloud servers.

Don't do scanning on our devices when the goal is to compare our own content to third-party databases.

Thinking outside the box. I like that!
 
Only if you have 30 images which match known CSAM images from an independent database. In that case your privacy will be at risk... but you are a pedophile. Remember, a match is not a scan; a hash is not an image.

Oh I know. I meant the rest of us will be subject to the entire process, while the pedophiles just turn it off.
 
They can. And the government, if push came to shove, could also view photos on iCloud servers by government mandate.

Which system would you prefer? Apple scanning a table of hashes against hashed photos on your phone, or the government looking at your photo JPEGs on cloud servers (Apple's and others')?
The “push coming to shove” and the “government mandate” are both covered by the 4th Amendment in the US; they would need a warrant, and I’d prefer they get one.
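The hash-table matching being debated above can be sketched in a few lines. This is a hypothetical illustration, not Apple's system: SHA-256 stands in for a perceptual hash, the blocklist entries and photo names are made up, and the 30-match reporting threshold comes from Apple's announcement:

```python
import hashlib

# Hypothetical sketch of matching a photo library against a list of
# known hashes, with a reporting threshold. Not Apple's implementation;
# SHA-256 is a stand-in for a perceptual hash.

THRESHOLD = 30  # Apple's announced threshold before human review

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Blocklist of known-bad hashes (illustrative placeholders).
known_hashes = {digest(f"known-image-{i}".encode()) for i in range(100)}

def count_matches(photos):
    """photos: iterable of raw image bytes. Only hashes are compared;
    the photo contents themselves are never inspected."""
    return sum(1 for p in photos if digest(p) in known_hashes)

# An innocent library produces zero matches and is never flagged.
library = [f"vacation-{i}".encode() for i in range(500)]
matches = count_matches(library)
print(matches, matches >= THRESHOLD)   # 0 False
```

The design choice the thread is arguing over is only *where* this loop runs: on the device before upload, or on the server after.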
 
I’m surprised Apple hasn’t told MR and other sites to stop covering this topic. :rolleyes:
 
Only if you have 30 images which match known CSAM images from an independent database.

"Independent database"...
So, scanning my own photos, on my own device, and comparing them to a third-party "fully independent" database.

It's literally surveillance on your own device.


Also - up to 30 matches now?
Hunting for cockroaches with shotguns at this point...
 
Why was this not brought up at WWDC?

If this is such a noble cause with such user privacy in mind when built...
Why was it completely omitted from discussion?

That alone is unbelievably worrying.

It feels like they want/wanted to slide this in real quick right before actual iOS15 and macOS Monterey releases and just sort of not talk much about it.

I wouldn't be shocked if what Apple is doing was negotiated with the government, down from the government demanding access to Apple servers, and that Apple is not permitted to fully discuss it.
 
I feel like this needs more than a press release to explain in more detail, but it's hardly a topic you want to discuss live with an iPhone launch.
 
It isn't necessarily THIS system; it is the precedent that it sets for other systems. That is the point that Apple seems to be missing.
This. Exactly this. All they did is now open themselves to "Well if you believe in protecting children why won't you crack open this accused person's iPhone? It's to protect the children." Once you open the door for one thing, no matter how worthy, you've opened Pandora's Box. Isn't my pet concern as important as the one you've already implemented a solution for? This is extremely troubling from a Privacy standpoint.
 
The law states that if you download three images you have committed a felony, not 30.

Besides, your ISP has to report you if they detect you're doing this.
Mind quoting that law? Is that a federal statute?
 
The pushing coming to shove and “government mandate” are both protected by the 4th amendment in the US, therefore they would need a warrant, and I’d prefer they get one.

Yes. In a perfect world.
 
This. Exactly this. All they did is now open themselves to "Well if you believe in protecting children why won't you crack open this accused person's iPhone? It's to protect the children." Once you open the door for one thing, no matter how worthy, you've opened Pandora's Box. Isn't my pet concern as important as the one you've already implemented a solution for? This is extremely troubling from a Privacy standpoint.

Just THINK of all the criminal capture uses (thought crimes, data crimes, copyright crimes, normal old school crimes)..

Setting the precedent of "oh no, we have privacy oriented ways to......RIFLE THROUGH ALL YOUR DATA" is atrocious and will never end in the uses demanded of it.

It's.
Insane.

Folks, don't quit on this.
We have to make a LOT of noise on all this.
 