The problem is that they installed surveillance software on the devices, and the users have no way to verify what it is doing, since iOS is closed source. So all we have is Apple's word on what the software is doing. Do you not see a problem with that, and how easily it can be abused?
You're missing my point. If Apple wanted to (or was forced to) install surveillance software on our devices, they could have done so at any point in time, and they wouldn't even be required to make a public announcement about it. You're right, iOS is closed source, so they could do this under the radar.

Apple's word means absolutely nothing in either case. Do you not see a problem with that, and how all iOS users could have always been easily abused?
 
Except complaining about tools solves nothing. They should have complained about the people involved in creating those tools. We have yet to live in an age where tools build tools themselves.

People are always the issue behind this. Tools just serve their purpose and nothing more.
Yet there are 33 pages of people complaining about this new tool, which is an infeasible target for abuse.

Also, we are already in the age where tools build tools themselves. I do research involving applying ML and program synthesis techniques for synthetic chemistry, and have read dozens of papers contradicting your assertion.
 
How has it been misrepresented or misunderstood?

Wow, really? Just read through the thousands of comments on the gazillion threads about it on this forum. You have people who think they're going to be thrown in prison for uploading their baby's "first bath" picture to iCloud, or that Apple employees will be perusing their photo libraries on a regular basis, looking through each of their pictures. You have people just certain that Apple is preparing the way for dictatorial governments to throw you in jail for having dissenting political images on your phone, when of course this has absolutely nothing to do with that. Slippery slope arguments are only valid if the negative consequences the arguer claims will happen are logically inevitable, which of course isn't true here; therefore it's a logical fallacy. You also have people confusing the on-device CSAM detection with the parental safety measures being implemented in Messages for inappropriate photos (they're two entirely separate things).
 
Yet there are 33 pages of people complaining about this new tool, which is an infeasible target for abuse.

Also, we are already in the age where tools build tools themselves. I do research involving applying ML and program synthesis techniques for synthetic chemistry, and have read dozens of papers contradicting your assertion.
Yet I highly doubt the machine learning foundations and the first few iterations of those tools were built by tools themselves.

Also, tools have no conscience, so blaming tools hurts nobody. And Apple has been blamed as well, unless you choose to ignore it.

But I know my comments carry no weight, so feel free to ignore any and all parts of them. I won't mind.
 
Maybe you need to learn about consent? I don't like that feature, but I am willing to use my phone despite it. I do not wish to consent to warrantless searches on my phone.

For the record, there are a lot of other things that I wish would change about iOS, but I am still willing to consent to use my iPhone despite them. That still doesn't undermine my stance here.

Consent is a very important concept. I’d be concerned if it’s something you can’t grasp.

But you'll have to consent to it or not use Apple's service if you get some version of iOS 15. Apple will force you to make a choice: accept the new terms or turn off iCloud Photo Library.
 
Yet I highly doubt the machine learning foundations and the first few iterations of those tools were built by tools themselves.

Also, tools have no conscience, so blaming tools hurts nobody. And Apple has been blamed as well, unless you choose to ignore it.
You're right that the first iterations did not build themselves; claiming otherwise would imply these tools followed some Darwinian notion of evolution from primordial bits.

Apple can be blamed for any of the tools it makes --- my continued refrain hasn't been "don't blame Apple!", but rather, "blame Apple for the appropriate thing!" Again, if anyone is worried about surveillance-capable tools performing scans locally on their device, they don't need to look at iOS 15 -- in fact, nothing in iOS 15 is worse than what's already in iOS 14, 13, 12...
 
Except the child pornographers can just turn off iCloud Photo Library and won't have any issues, while the rest of us innocent users are constantly surveilled.

I would also love to see Apple cite actual data about child exploitation instead of just issuing broad proclamations about how this is such a huge problem.
Not only that, but they claim they're only matching hashes from known pre-existing CP images, which means what they're actually doing is incentivizing the production of new images and the exploitation/trafficking of more children in order to produce images which aren't in the database and can't be caught until the next time the hashes are updated. They're making the problem of exploitation *worse* while introducing an attack vector to every iPhone. That's just stupid.
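
To make that limitation concrete, here's a toy sketch of how match-against-a-known-database detection behaves. This is not Apple's NeuralHash; the hash function, threshold, and pixel values are all invented for illustration:

```python
# Toy perceptual-hash matching: only images whose hash is already in the
# known database can ever be flagged. Newly created images are invisible to it.

def toy_phash(pixels):
    """Crude perceptual hash: one bit per pixel, set if above the mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(int(p > mean) for p in pixels)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Stand-in for the database of hashes of *known*, pre-existing images.
known_hashes = {toy_phash([10, 200, 30, 220, 40, 210, 20, 230])}

def is_flagged(pixels, max_distance=2):
    h = toy_phash(pixels)
    return any(hamming(h, k) <= max_distance for k in known_hashes)

# A slightly re-encoded copy of a known image still matches...
print(is_flagged([12, 198, 33, 219, 41, 212, 22, 228]))   # True
# ...but brand-new material has no hash in the database, so it sails through.
print(is_flagged([200, 10, 220, 30, 210, 40, 230, 20]))   # False
```

Whatever one thinks of the incentive argument, the mechanism itself is limited to re-finding material someone has already catalogued.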
 
They did not have the ability to scan photos on the device and report back to Apple HQ. Not to mention I am sure this scanning will be expanded on and the option to turn it off will not exist.

They had the power to do it all along. How do we know? They created the CSAM detection system!

Also, the Photos app (or rather a background service) analyzes all the photos in your photo library and syncs every photo to iCloud (which Apple has access to) by default.

They have also provided iCloud backups, which scan your files and create a copy in iCloud (again available to Apple).

Apple could at any time make small changes to Photos and iCloud backups and create much bigger privacy and security issues than the CSAM Detection system.

A lot of people seem more worried about what Apple can do, and I'm just pointing out all the horrible things Apple could do very easily with minor changes to current technology.

They don't need the CSAM Detection system to take away almost all your privacy and security of using an iPhone.
 
Clearly Apple has data that this is a major problem. Also, the average person seems oblivious to how big a problem this is, especially in other parts of the world.
No, clearly Apple thinks they can use the outrage against this heinous crime to defuse any outrage over compromising the security of iOS. They've presented no evidence that this is a major problem or any real justification for implementing on-device scanning and compromising the security of the Messages app.
 
You're right that the first iterations did not build themselves; claiming otherwise would imply these tools followed some Darwinian notion of evolution from primordial bits.

Apple can be blamed for any of the tools it makes --- my continued refrain hasn't been "don't blame Apple!", but rather, "blame Apple for the appropriate thing!" Again, if anyone is worried about surveillance-capable tools performing scans locally on their device, they don't need to look at iOS 15 -- in fact, nothing in iOS 15 is worse than what's already in iOS 14, 13, 12...
Yup. Machine learning turning into self-evolving machines that can act and think like humans. But that's a topic for another day.


You know about exponential growth, right? This whole machine learning thing could explode just like that. Apple may be the last company to start doing CSAM-related things, but everybody follows Apple nowadays, which means such a move will have a ripple effect very soon, and BOOM, everyone will probably think twice before taking any photos, videos, and more.
 
Apple could at any time make small changes to Photos and iCloud backups and create much bigger privacy and security issues than the CSAM Detection system.
In fact, I assume they have done so already. It’s just that those tools have yet to find any reason to suggest prosecuting me for any “potential wrongdoing” at the moment.
 
The technology increases child abuse. How much liability will Apple hold as a result?
Exactly. It incentivizes the creation of *new* child porn and the exploitation of more children in order to produce images whose hashes are not in the current database. It makes the problem *worse* and hurts *more* children. Why would anyone want to do that????
 
But there's a primary difference between the intentionality behind the backup software and the intentionality of the CSAM tool.


We might just have to disagree there. There is no other technology built into iOS whose sole intention is to find and report criminal activity, even if Apple cannot see it at first.

To the first point: some here said "under any circumstances," which would exclude intentionality.

Agree, but isn't a lot of the discussion here about Apple being forced to use the technology for a different intention?

If I was running a powerful, oppressive government I would not use the CSAM detection system to control the populace.

Here are what I would force Apple to do:
1. iCloud backup mandatory on all phones
2. iCloud Photo Library mandatory on all phones
3. No user has to pay for iCloud storage
4. Real time access to iCloud for law enforcement esp. the secret police
5. Improve Photos (or use the AI being introduced in Messages) to recognise photos of activities I want to curtail.
6. Don't worry too much about false positives
7. Remove CSAM Detection system

I would then publicly go out and say we have made the CSAM Detection system illegal because of privacy issues.

#5 would be much better at getting a large number of photos matched since we are not looking for particular photos but categories of photos.

#6 It's better that 99 innocent people go to forced labour camps than that one guilty person gets away.
 
Google the chances of winning the lottery. Then read Apple's document on the chances of your account being flagged erroneously. Compare the two numbers and then re-read your reply to see how nonsensical it is.

I'm glad you proved my point though. You absolutely did not read how the technology works.
Apple's statements about the odds of erroneous flagging are meaningless without independent verification. Since the details of the technology are proprietary, that's impossible to do. They should open-source the scanning/flagging code so independent experts can validate it.
 
No.



1. You'd need to find or generate images that would create a collision. You have a better chance of finding a UUID collision, and it would take roughly 80 years on today's computers to find just ONE.
2. This is why Apple set an extremely high threshold on the number of matches before an Apple employee can even decrypt the images. Meaning you'd potentially need to find hundreds of collisions (a rough sketch of the math is below).
3. Apple stated that the chances of a mistake are one in a trillion.
4. Assuming you're unlucky enough to be that one-in-a-trillion mistake, Apple will manually review the images in question and correct the mistake.
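
Here's a rough sketch of why a match threshold matters. The per-image false-match rate and photo count below are placeholders I made up, not Apple's actual parameters:

```python
from math import exp, factorial

def flag_probability(n_photos, per_image_fp, threshold, extra_terms=60):
    """P(at least `threshold` false matches among n_photos photos),
    using a Poisson approximation to the binomial (fine for tiny rates)."""
    lam = n_photos * per_image_fp                      # expected false matches
    return sum(exp(-lam) * lam**k / factorial(k)
               for k in range(threshold, threshold + extra_terms))

# Placeholder numbers purely for illustration (NOT Apple's real parameters):
n = 10_000      # photos in a library
p = 1e-6        # assumed per-image false-match rate
for t in (1, 10, 30):
    print(f"threshold {t:>2}: ~{flag_probability(n, p, t):.3g}")
```

Requiring many independent matches before anything is even reviewable is what pushes the combined odds toward figures like one in a trillion, whatever the true per-image rate turns out to be.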



China iCloud data centers are already being reviewed by the government. This is why you get a warning if you fly to China with your iPhone and switch your country setting to China: Apple will tell you you're on Chinese servers, which are treated differently.



Chances of winning the Powerball: 1 in 292 million
Chances of erroneously being flagged by Apple: 1 in a trillion

You have a MUCH higher chance of winning the Powerball. And even after being flagged, Apple will review and reactivate your account if it was in error.

Other perspectives:
- Odds of you dying from a car crash: 1 in 107
- Odds of you being struck by lightning: 1 in 1.2 million
- Odds of you dying from a shark attack: 1 in 3.7 million
- Odds of you dying from a plane ride: 1 in 29.4 million


Think about it. 1 in a trillion. What other event in the world happens with 1-in-a-trillion odds?
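
Taking the figures quoted above at face value, here's the side-by-side comparison (the 1-in-a-trillion number is Apple's claim, not an independently verified rate):

```python
# Compare the quoted odds directly against the claimed false-flag rate.
odds = {
    "dying in a car crash":        1 / 107,
    "being struck by lightning":   1 / 1_200_000,
    "dying from a shark attack":   1 / 3_700_000,
    "dying on a plane ride":       1 / 29_400_000,
    "winning the Powerball":       1 / 292_000_000,
}
claimed_false_flag = 1 / 1_000_000_000_000   # Apple's stated figure

for event, p in odds.items():
    print(f"{event:28s} is ~{p / claimed_false_flag:>14,.0f}x more likely")
```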
That "1 in a trillion" figure is just PR bafflegab unless it's independently verified.
 
So if Apple were required to use it, then what? Near-exact matches of child abuse photos would be flagged and then sent to some government agency for review? I'm not against that.

But my original assertion, that this can't be used for finding e.g. the set of people who attended a Trump rally, still stands. It would only find near-exact matches of hashes of photos stored within iOS's internal database.
It can be used for finding, for example, lots of near-exact matches of movie frames, since that's a straightforward addition to handling HEIF files. In other countries there are movies which are illegal to obtain that aren't CP, plus there will inevitably be requests to use it as an anti-piracy measure.

There's also the issue of images that violate national security, as that might be defined by different countries - Tank Man, Winnie the Pooh meme templates, pictures of classified government actions that were classified to cover up something that shouldn't have been, that sort of thing.
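
To spell out why that extensibility is the worry: in a hash-matching design the matcher itself never has to change; what gets flagged is decided entirely by whichever hash list it is handed. A hypothetical sketch (the class name and hash strings here are invented for illustration):

```python
# Hypothetical: the matching logic is content-agnostic, so swapping or
# extending the hash list changes what gets reported with zero code changes.
class HashMatcher:
    def __init__(self, banned_hashes):
        self.banned = set(banned_hashes)

    def should_report(self, image_hash):
        return image_hash in self.banned

csam_list     = {"hash_of_known_csam_1", "hash_of_known_csam_2"}
expanded_list = csam_list | {"hash_of_banned_meme", "hash_of_movie_frame"}

matcher = HashMatcher(expanded_list)
print(matcher.should_report("hash_of_banned_meme"))   # True: flagged purely
                                                      # because it was listed
```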
 
But you'll have to consent to it or not use Apple's service if you get some version of iOS 15. Apple will force you to make a choice: accept the new terms or turn off iCloud Photo Library.

You are so close to reaching the crux of the matter, but you slip at the end. Turning off iCloud Photo Library does not prevent the surveillance software from being installed on your personal device. You should know this if you read the press release, so you’re either ill-informed, or you’re being intentionally misleading.

So the REAL choice is: accept the new terms and consent to surveillance software being installed on your personal device that conducts warrantless searches, or stop using all Apple products. This is why people are upset.

Imagine if a landlord told a tenant that they must consent to cameras being installed in their home. The landlord *promises* to only turn the cameras on while the tenant uses the Internet. The tenant has no recourse to stop the cameras from being installed. They must make a choice: consent to living with the cameras and *hope* that they aren’t being abused, or find a new place to live.
 
It can be used for finding, for example, lots of near-exact matches of movie frames, since that's a straightforward addition to handling HEIF files. In other countries there are movies which are illegal to obtain that aren't CP, plus there will inevitably be requests to use it as an anti-piracy measure.

There's also the issue of images that violate national security, as that might be defined by different countries - Tank Man, Winnie the Pooh meme templates, pictures of classified government actions that were classified to cover up something that shouldn't have been, that sort of thing.
I'll concede to these possibilities. Apple says they won't do anything aside from CSAM, but that's just what Apple says. In any case, while I wouldn't equate CP with pirated movies, both are still illegal to own, and the point of "don't do illegal stuff and you don't need to worry" still stands.

Let me ask a more pointed question: if Apple were required to use any of the existing tools that were already on your phone prior to iOS 15, and that could easily find any of those things, then what? Prior to CSAM detection, the same issue existed; CSAM detection does not increase the surveillance vector at all. The point is, this method is not the best method to surveil you or anyone else. They have the capability of scanning your entire device, without all the cryptography backflips and safeguards, with a lot more freedom.
 
You are so close to reaching the crux of the matter, but you slip at the end. Turning off iCloud Photo Library does not prevent the surveillance software from being installed on your personal device. You should know this if you read the press release, so you’re either ill-informed, or you’re being intentionally misleading.

So the REAL choice is: accept the new terms and consent to surveillance software being installed on your personal device that conducts warrantless searches, or stop using all Apple products. This is why people are upset.

Imagine if a landlord told a tenant that they must consent to cameras being installed in their home. The landlord *promises* to only turn the cameras on while the tenant uses the Internet. The tenant has no recourse to stop the cameras from being installed. They must make a choice: consent to living with the cameras and *hope* that they aren’t being abused, or find a new place to live.

Very well articulated @briko
 
You are getting distracted.
It doesn’t matter if the search is done using hash-matching algorithms, bloodhounds, or black magic. The technical implementation is completely irrelevant. A search is a search.

All that matters is a search is happening, and it’s happening on my personal device, and a warrant for the search was not obtained. End of story.
Apple doesn't need a warrant as it is not a law enforcement agency.
 
Apple doesn't need a warrant as it is not a law enforcement agency.

Yes, and I have previously acknowledged that the constitution does not protect us from the tyranny of corporations. That is why it is so important that we advocate for these rights while we can.

When we stop advocating for our privacy, we will lose our privacy.

I’m not even trying to avoid having my photos scanned. I literally have nothing to hide, so I’m ready and willing for that to be done server-side. But on principle, I must object to this implementation. This will open Pandora’s box.
 
You are so close to reaching the crux of the matter, but you slip at the end. Turning off iCloud Photo Library does not prevent the surveillance software from being installed on your personal device. You should know this if you read the press release, so you’re either ill-informed, or you’re being intentionally misleading.

So the REAL choice is: accept the new terms and consent to surveillance software being installed on your personal device that conducts warrantless searches, or stop using all Apple products. This is why people are upset.

Imagine if a landlord told a tenant that they must consent to cameras being installed in their home. The landlord *promises* to only turn the cameras on while the tenant uses the Internet. The tenant has no recourse to stop the cameras from being installed. They must make a choice: consent to living with the cameras and *hope* that they aren’t being abused, or find a new place to live.
I'm just curious what was preventing <insert evil actor here> from forcing Apple (potentially Apple itself is the evil actor) to do this before this tool was implemented. iOS 14 includes powerful scanning techniques for all of your photos (and the rest of your phone) that far outperform this tool in terms of what can be found and how easy it would be to alter. If <evil actor> is attempting to surveil, why not just use the more powerful existing tools, and why on earth would you make it public?
 
Yes, but if it happens to you, it will be 100% for you.

Anyway, this 1 in a trillion that people keep repeating is not the problem; the problem is the actual software installed on our devices and future uses of it.

On a second note, Apple was never famous for always delivering properly written software, so please allow many of us to challenge the number that Apple chose to release.

I am not a developer, but are you claiming that all those tech experts who express concerns do not know what they are talking about? Don't you think we should listen to them carefully?
I will not claim to have read all the "tech experts'" posts. However, I would like it if they, or even you, could point to an example of good software that has never failed. Software can have flaws, but the spirit/goal of what Apple is presenting is to make it without flaws and to fix them if they come up. I doubt they spent a few minutes writing some code without testing before they published their results. That's just not how it works at big companies.

I think your issue, and many others', is not that the software could be flawed but that you do not trust Apple, or anyone for that matter. I will be the first to admit I did not expect this from Apple. Obviously they feel it's a big enough issue to take all the bad publicity over it and still try to make it work.

I'm more optimistic, in that I truly think they are trying to do exactly what they say they are trying to do. Also, 1 in 1 trillion: yeah, I'll take those odds all day long, every year of my life. If I did get targeted I'd be okay, because I'm not in the market for what they are looking for, so it would be about as bad as getting a parking ticket or being pulled over for having a tail light out. Which, by the way, is much more likely to happen, and I'd be pretty civil if it did. I'm not everyone, though, so everyone will feel differently.

As for some of the tech experts I have read so far (so, not all of them), I can't find a single one yet who can prove it will fail or that it will be abused. But hey, I could be wrong. I just don't think Apple is trying to be malicious at all on this. In this case it is society that has the dirty mind.
 
I don't need or want the park to be private.

That's where we disagree. I want that to be private. But the analogy doesn't work, as there's nothing private about the park, while there's definitely a lot that's private about my photos and my device backups.
 