I'm not seeing the pushback; it's like people are defending child abusers (scum). Aren't they all using Android anyway? (That's a joke.)
So you’re ok with the cops coming into your house whenever they feel like it, because they say they’re looking for pedophiles under your couch?

I mean, are you protecting pedophiles if you object?
 

If you don't need a local scanning system, don't build one.


The first comment contains the most important point. Well done.

Unfortunately, they DID build the local scanning system and their defense of it (attempted distraction from it) tells us Apple has a need for it. That need is, primarily, government mandated surveillance. Any future use for advertising is just gravy.
 
I don't agree with this concept either.

A banker cannot look in my safety deposit box, even if it is stored in the bank.

This is exactly the same thing (in my mind at least).

That's what privacy means. You either have it or you don't.

Turns out, we don't, no matter what Apple says. Oh well.
You make Apple's point for them.
You can guarantee a banker will do their due diligence before letting you store your illegal stuff in their safe deposit box.

You don't expect them to search it after you have deposited it.

Same principle: Apple is doing its due diligence on your device, and the result doesn't leave your device, before you deposit your images on their servers.

If the due diligence does meet the criteria to leave your device, you deserve it.
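If it helps, here is that analogy reduced to a sketch as I read it: a purely local check that gates the upload. Every name here is hypothetical; this is not Apple's actual pipeline, just the shape of the argument being made.

```swift
import Foundation

// Hypothetical sketch only: none of these names are Apple's real APIs, and the real
// system is far more involved. The point is simply that the check runs locally,
// before anything is handed to the server.
struct LocalDiligence {
    let knownBadHashes: Set<String>   // stand-in for a hash database shipped to the device

    /// Returns true if the photo may be uploaded; the comparison never leaves the device.
    func clearForUpload(_ photoData: Data) -> Bool {
        let digest = String(photoData.hashValue)   // placeholder for a perceptual hash
        return !knownBadHashes.contains(digest)
    }
}

let diligence = LocalDiligence(knownBadHashes: [])
let photo = Data([0x01, 0x02, 0x03])
if diligence.clearForUpload(photo) {
    print("Photo passes the on-device check and can be uploaded.")
}
```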
 
You can control it by turning iCloud Photo Library off.

How do you control the iCloud backup feature? It scans your device. How do you know what it backs up and where it's delivered? How do you know the data it scans and copies cannot be used to harm you?
You don't. I turned off both iCloud Photos and iCloud Backup. I turned off the Messages feature too. I'm back to encrypted backups on my M1 Mac, which I will likely keep on Big Sur.
 
And that is only if the feature works.

So much of what is wrong with this is that their risk assessment for false positives is crap.

Here's a more-than-one-in-a-trillion scenario (rough sketch below):
1. A list of photos that are collisions with CSAM in the database will leak.
2. Reprehensible people will start using AirDrop to put those photos on people's phones (many people leave it on).
3. All of them will get flagged.
4. Apple's review process will get overwhelmed and the 20-something employee will just forward them all to NCMEC.
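The rough sketch mentioned above, just to make the threshold mechanics concrete. Everything here is hypothetical (the counter, the threshold value of 30, the type names); it only illustrates why planted matches are the worry.

```swift
import Foundation

// Rough sketch of the scenario above, not Apple's implementation. Assumes a
// per-account match counter and an arbitrary review threshold of 30.
struct AccountFlagger {
    let reviewThreshold = 30   // arbitrary, hypothetical threshold
    var matchCount = 0

    /// Count a photo whose (perceptual) hash collides with the known-CSAM list.
    mutating func recordMatch() {
        matchCount += 1
    }

    var needsHumanReview: Bool { matchCount >= reviewThreshold }
}

var account = AccountFlagger()
// If an attacker can plant enough colliding images (e.g. via AirDrop plus auto-save),
// the threshold is crossed without the owner ever seeking out such material.
for _ in 0..<35 { account.recordMatch() }
print(account.needsHumanReview)   // true
```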
What are you talking about? You can't secretly AirDrop pictures to other people's phones. If those devices are locked, you don't even see them, and even if they're unlocked, the Photos app opens and shows the picture.
 
I think their legal team calculated the odds (probably very high) of them getting sued, investigated, or otherwise impacted in an extremely negative way if any government or agency accuses them of being an accessory to child pornography distribution. Thus, we get these countermeasures whether we like it or not.
I'm not supporting them or accusing them, by the way.
Probably a sign of things to come regarding cloud storage, and of possible international law changes and enforcement on this subject.
 
Apple had to know this would cause PR problems given their previous privacy-based marketing.

I wonder if this limited first step into on-device scanning and government reporting is meant to soften the negative PR of the announced or unannounced government mandates that they know are coming.
My thoughts exactly. The idea that people can live in a complete digital bubble in this day and age, especially as these devices are used for planning terrorist attacks, CP, and trafficking, needs a reality check. Governments, and not only in oppressive countries, are asking for access. Apple/Google/MS have antitrust cases against them across the globe. They are trying to buy some time/good graces with the regulators. Also, how are all of you going to deal with EVs, which send EVERYTHING data-wise home? I don't think this is a good development, but Apple could not have kept this going forever.
 
There is no general auditing of consumer software. And even when it's done in some cases by businesses, it's usually for things that need to be highly secure.

Releasing the source code won't do anything about the biggest problem some of these people have with Apple: trust.
There are several other areas of iOS that could be misused to an even larger degree, and no one is demanding those be open-sourced.

If you choose to use closed software, you really have to trust the software developer. In this case, Apple.
It's Apple that decided to slightly change their business model from "what happens on your iPhone stays on your iPhone."
Tech experts are expressing concerns, and it's tech experts who asked Apple to release the code. Apple employees are expressing concerns too, as are many serious media outlets. We are not used to Apple issuing so many PR releases so frequently, so not only is Apple changing, they are doing things they were not doing in the past. So once again, why not publish the code? Is there a reason you are against it? At least that's the feeling I get from your reply.
 
The ongoing antitrust point might be valid. So the users pay for Apple's better political standing with their privacy?
 
This issue is clear. It's my data. If you say my data will be private, then you CAN'T look at it.

Apple has never said that the data can never be looked at by anyone under any circumstances. They are saying you have to authorise such things.

When you decide to use iCloud Photo Library, your photos are going to leave your device. Since the inception of iCloud Photo Library, you have authorised Apple to make a copy of your photo and transfer the copy to iCloud. Now they have added another requirement: when we make a copy of your photo, we also want to inspect its content in a certain way.

This is how good data protection should work: you ask for permission in advance, you tell the person how you're going to treat the data, and you tell them how they can refuse to give permission and what the consequences of refusing are.

Then a person can make an informed choice.
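To make that concrete, here is a toy sketch of such a consent flow. Every name here is invented; it's just the principle of informing first and letting the person decide, with the consequences spelled out.

```swift
import Foundation

// Toy sketch of the consent flow described above; all names are made up.
enum ConsentDecision: Equatable {
    case granted
    case refused
}

struct FeatureConsent {
    let purpose: String               // how the data will be treated
    let consequenceOfRefusal: String  // what happens if the person says no

    /// Tell the person everything up front, then let them decide.
    func ask(decision: ConsentDecision) -> Bool {
        print("Purpose: \(purpose)")
        print("If you refuse: \(consequenceOfRefusal)")
        return decision == .granted
    }
}

let iCloudPhotos = FeatureConsent(
    purpose: "Copies of your photos are uploaded and matched against known hashes.",
    consequenceOfRefusal: "Photos stay on the device and are never matched."
)
print(iCloudPhotos.ask(decision: .refused))   // false: no upload, no matching
```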
 
Thanks Apple for clarifying that I just don't understand what's happening! :rolleyes: Here I thought you were implementing a feature to scan and report any matched data to authorities. But that's not the case and I just don't understand that you're implementing a feature to scan and report any matched data to the authorities, if I choose to store it on iCloud.

I for one can't wait for the next release of iPolice! Now with more scanning everywhere!

*SIGH*
 
But if you believe they are lying then what could they do to convince you otherwise since you have no actual evidence they are lying? Surely you can see the illogic of your stance?
Craig contradicted his own past comments in this interview. It doesn’t take a genius to dissect it. I’ll leave it up to you to see the contrast in the commentary.
 
Somebody else mentioned this in another thread.
What will happen if a bad state-sponsored actor intentionally sends you 30-50 photos matching the database over WhatsApp/iMessage? iMessage, WhatsApp, and many other messaging services default to automatically downloading any sent photos, and iPhones store them in the photo library, which will then be automatically uploaded to iCloud without the user's intervention. What then?

I can already expect some critical journalists/politicians to suddenly be arrested for owning CP. It's true, private companies do things more efficiently. Great job Apple!
There isn’t even an option to automatically download iMessage attachments into your photo library. So no, it’s not on by default. It’s fun to imagine scary stories though, isn’t it?
 
Maybe they will scan for the unvaxxed. It seems like the media and government are slowly moving towards dehumanizing that group. Sooner or later they will be called terrorists and the government will make their lives hell.
 
You can control it by turning iCloud Photo Library off.

How do you control the iCloud backup feature? It scans your device. How do you know what it backs up and where it's delivered? How do you know the data it scans and copies cannot be used to harm you?
You and I have very different definitions of what it means to have control over systems and data that are owned by me. As long as you believe that a ToS change, plus my having to disable a feature I paid for when I bought my devices, amounts to a legitimate contract that can revoke what Tim himself calls a fundamental human right, we have no space for agreement. I believe you will be on the wrong side of history here. Good day.
 
Well...you are completely wrong in your statement...read how it works again until you get it right.
I have looked at the explanation from Apple several times, and I believe I am correct.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.

Does this mean Apple is going to scan all the photos stored on my iPhone?
No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.
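Read literally, the quoted text describes two gates: the matching only touches photos headed for iCloud Photos, and Apple only "learns" about an account once it holds a collection of matches. A rough sketch of that description, with hypothetical names and an arbitrary threshold (the quoted FAQ gives no number, and this says nothing about where human review happens):

```swift
import Foundation

// Rough sketch of the gating the quoted FAQ describes. Names and structure are
// hypothetical, not Apple's API or implementation.
struct Photo {
    let matchesKnownCSAMHash: Bool
    let queuedForICloudPhotos: Bool
}

struct Account {
    let iCloudPhotosEnabled: Bool
    let photos: [Photo]
}

/// Per the quoted text: nothing is matched unless iCloud Photos is in use, and Apple
/// only "learns" about an account once it holds a collection of matching images.
func appleLearnsAbout(_ account: Account, reviewThreshold: Int) -> Bool {
    guard account.iCloudPhotosEnabled else { return false }
    let matches = account.photos.filter { $0.queuedForICloudPhotos && $0.matchesKnownCSAMHash }
    return matches.count >= reviewThreshold
}

let account = Account(
    iCloudPhotosEnabled: false,
    photos: [Photo(matchesKnownCSAMHash: true, queuedForICloudPhotos: false)]
)
print(appleLearnsAbout(account, reviewThreshold: 30))   // false: iCloud Photos is off
```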

I believe the descriptions are deliberately disingenuous. They state the system is designed to keep dodgy photos off iCloud, hence the on-device scanning, and they carefully say this feature only applies to photos that the user chooses to upload to iCloud Photos, to make it sound as if only photos in the cloud will be scanned. But they have already made it clear that the scanning will take place on the iPhone, prior to uploading.

The whole point of the system is to keep the photo off the cloud, i.e. to stop it being uploaded if it matches. If a number of photos match, Apple will conduct a human review. But remember, the photo has not been uploaded to iCloud: the system has detected and matched it, and, being designed to stop it being uploaded to iCloud, will not upload it. So that human review would then have to happen on the device.
 
Persecution complex at its finest right here, folks.
Pretty sure not. Here in Canada they are forcing you to show papers in order to eat in a restaurant, but not if you work in a restaurant. Basically, a person who works in a restaurant can work no problem, but if he wants to eat in the same restaurant he works in, it's illegal without being vaxed. Think what you want, but this has nothing to do with the bug. It's about obedience, and they are ramping up the hatred and hysteria against those who do not bend the knee.
The vax does not protect you as much from COVID, but it sure does protect you from the mob.
 