I suspect that it’s a political move to help stave off monopoly regulations that would impact their profits.
I’m not sure that you can escape it with any platform that is available, but we should boycott all such services and platforms and hopefully they take notice when they start losing money.
It also helps to be more politically responsible and elect people who will stand up for our interests at all levels of government.
I agree that it might be a political move, or something of that order. This smells fishier than any first-layer explanation accounts for.
 
Not trying to be pedantic here, but from what I read, there will be no 'scanning' of photos on device. Only photos that are about to be uploaded to iCloud Photos will be hash-checked; otherwise no hash checking is done at all. To me, that doesn't meet the definition of scanning.
If this is true, then it is okay. So the iPhone only "translates" all your images into hashes, and once the photos are outbound to iCloud, the server makes the comparison?

But if it is the phone making the comparison on device before the images are even uploaded (when we switch on the iCloud Photos toggle), then that is not okay.
 
Apple is installing software that scans your data but DOES NOT RELAY the results of said scan to Apple UNTIL you have uploaded the pictures to their servers ANYWAY.

It’s a bit more nuanced than “they’re scanning my local data”; we’ll agree to disagree about this till the end of time.
I wonder: if a match was already identified, and then you delete said photo, and then you turn on iCloud Photos, will the match still be reported to Apple? Will the system record that you previously had a match, even if you deleted the photo before you toggled on iCloud Photos?
 
If this is true, then it is okay. So the iPhone only "translates" all your images into hashes, and once the photos are outbound to iCloud, the server makes the comparison?
No, the hash database is on the phone and it's checked there -- then it's uploaded.
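To make that concrete, here's a rough sketch in Python of the flow as I understand it from Apple's published description (illustrative only, not Apple's code): the hash database ships on the device, the comparison runs on the device, and nothing leaves until the upload itself. SHA-256 and every name below are placeholders; the real system uses NeuralHash against a blinded database.

```python
import hashlib

# Placeholder for the known-hash database that ships on the device
# (the real system stores blinded NeuralHash values, not SHA-256).
ON_DEVICE_HASH_DB: set = set()

def send_to_server(photo: bytes, voucher: dict) -> None:
    # Placeholder for the actual iCloud Photos upload.
    pass

def queue_for_icloud(photo: bytes, icloud_photos_enabled: bool) -> None:
    if not icloud_photos_enabled:
        return  # photo stays local; nothing is hashed or compared
    digest = hashlib.sha256(photo).hexdigest()      # hashing happens on device
    matched = digest in ON_DEVICE_HASH_DB           # so does the comparison
    voucher = {"hash": digest, "matched": matched}  # encrypted in the real design
    send_to_server(photo, voucher)  # the result leaves the device only with the upload
```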
 
I wonder: if a match was already identified, and then you delete said photo, and then you turn on iCloud Photos, will the match still be reported to Apple? Will the system record that you previously had a match, even if you deleted the photo before you toggled on iCloud Photos?
I think it only does the match if iCloud Photos is on, so there wasn't any previous match, since iCloud was off and no matching was done. And since it was deleted before you turned on iCloud, it's never checked. For now, anyway.
 
No, the hash database is on the phone and it's checked there -- then it's uploaded.
Thank you! Hmmm, the way Apple does this is worrying. But I might be understanding this wrong.

It feels like having a friend I share my personal secrets with, and that friend will report me to my mum or teacher if I mention smoking 30 times.

And it feels like owning a car that uses its speedometer against me, reporting me to the police whenever the car gets on the highway (iCloud) if I sped more than 30 times in my garage or on the street in front of my house.

Am I feeling this correctly? Do you feel the same?
 
Local "news" just did a story about this. They got every single aspect of what Apple's going to do completely wrong. Unsurprisingly, it was essentially a puff piece, talking-up Apple and what a wonderful thing they were doing :rolleyes:
 
I think it only does the match if iCloud Photos is on, so there wasn't any previous match, since iCloud was off and no matching was done. And since it was deleted before you turned on iCloud, it's never checked. For now, anyway.
For now hahahaha
 
Local "news" just did a story about this. They got every single aspect of what Apple's going to do completely wrong. Unsurprisingly, it was essentially a puff piece, talking-up Apple and what a wonderful thing they were doing :rolleyes:
I think they're not allowed to side with us; otherwise people would go after the local news, saying they don't think of the children.
 
Thank you! Hmmm, the way Apple does this is worrying. But I might be understanding this wrong.

It feels like having a friend I share my personal secrets with, and that friend will report me to my mum or teacher if I mention smoking 30 times.

And it feels like owning a car that uses its speedometer against me, reporting me to the police whenever the car gets on the highway (iCloud) if I sped more than 30 times in my garage or on the street in front of my house.

Am I feeling this correctly? Do you feel the same?
Unfortunately I feel the same. :(
 
I wonder: if a match was already identified, and then you delete said photo, and then you turn on iCloud Photos, will the match still be reported to Apple? Will the system record that you previously had a match, even if you deleted the photo before you toggled on iCloud Photos?

Nope.
It can’t be reported to Apple because there’s “nobody” doing the reporting.
The safety voucher for a photo is attached to that photo. If you don’t upload the photo, Apple can’t access the voucher. (Actually, even if you upload it, Apple can’t read the voucher unless you cross the threshold of ~30 matches; until then it’s encrypted gibberish.)

That’s why this is such a nothing-burger and for all intents and purposes equivalent to server-side search.
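For anyone wondering what “encrypted gibberish until ~30 matches” means mechanically, here is a toy sketch of the threshold idea using textbook Shamir secret sharing in Python. This is not Apple’s actual construction (which layers NeuralHash and private set intersection on top of it); the names and the toy threshold of 3 are mine, and Apple’s reported threshold is around 30.

```python
import random

PRIME = 2**127 - 1  # field size for the demo; any sufficiently large prime works

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0; needs `threshold` genuine shares."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

# One share per matching photo's voucher: below the threshold, the server
# cannot reconstruct the decryption key, so vouchers stay unreadable.
key = random.randrange(PRIME)
shares = make_shares(key, threshold=3, count=5)  # toy threshold; Apple's is ~30
print(recover(shares[:3]) == key)  # True: 3 matches unlock the key
print(recover(shares[:2]) == key)  # False: 2 matches reveal nothing
```

The design point is that each matching photo contributes one share, so until enough matches accumulate, the server holds nothing it can read.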
 
Thank you! Hmmm, the way Apple does this is worrying. But I might be understanding this wrong.

It feels like having a friend I share my personal secrets with, and that friend will report me to my mum or teacher if I mention smoking 30 times.

And it feels like owning a car that uses its speedometer against me, reporting me to the police whenever the car gets on the highway (iCloud) if I sped more than 30 times in my garage or on the street in front of my house.

Am I feeling this correctly? Do you feel the same?

It’s not really the same. For me, this is about a technology company looking at its products, seeing where those products could facilitate new crimes or criminal methods against vulnerable groups, and acting responsibly to avoid becoming a cause of increased crime. So..

AirTags could facilitate stalking, burglary, robbery, kidnapping etc. So Apple added safety measures.

Encrypted messaging could facilitate grooming, sexual exploitation/extortion etc., so Apple added safety measures.

Encrypted photo cloud storage could be used to facilitate child pornography, making it impossible to detect, so Apple added safety measures.

Your iPhone doesn’t facilitate illicit smoking or speeding so it’s not Apple’s responsibility to add those safety features.
 
I wonder in case a match was already identified, and then you delete the said photo, and then you turn on icloud photo. Will the match still be reported to apple? Will the system record that you have previously matched even if you deleted the photo before you toggle to icloud photo button?
Good question. Since the "safety certificate" or whatever they're calling it may already be created, who knows?
 
After all is said and done, Apple is between a rock and a hard place. If Apple doesn't move forward, then anti-CSAM advocates will be opposed to Apple. If Apple proceeds with their plans, then privacy advocates will be opposed to Apple.

And in any event, the proverbial cat is out of the bag. Apple has shared with the world the ability to scan images on users' phones by accepting government hash files. Even if Apple were to perform some sleight of hand to appease all parties concerned, governments would simply demand a "peek".

And after the recent elections in Russia, where Apple (and Google as well) caved for profits, Apple will do whatever is asked by whatever government, privately or publicly.
 
The only way forward I see is to keep end-user devices clean of spying once and for all.
Apple could formally guarantee this, and a lot of the damage done would heal over time. Paying customers' interests should finally be put first, ahead of possibly well-intended private campaigns. No illusions: governments will still be able to spy on you whenever they want, and Apple and everybody else will have to respect local laws, including totalitarian ones.

The most polite way to describe this mess would be to say it was naive.
 
The only way forward I see is to keep end-user devices clean of spying once and for all.
Apple could formally guarantee this, and a lot of the damage done would heal over time. Paying customers' interests should finally be put first, ahead of possibly well-intended private campaigns. No illusions: governments will still be able to spy on you whenever they want, and Apple and everybody else will have to respect local laws, including totalitarian ones.

The most polite way to describe this mess would be to say it was naive.
That is an extremely polite way to refer to this mess.
 
Don’t worry if the police show up at your house unannounced and start searching through your entire home. You have to meet certain criteria before you’re arrested. Oh, and I forgot to mention: they’ll be searching 24/7, 365!!!
This is already happening. Take, for example, online mothers' groups on Fakebook, where mothers share photos of a baby's or child's skin rash looking for help and advice from other mothers; not long after, the cops come calling at the door.
 
It’s not really the same. For me, this is about a technology company looking at its products, seeing where those products could facilitate new crimes or criminal methods against vulnerable groups, and acting responsibly to avoid becoming a cause of increased crime. So..

AirTags could facilitate stalking, burglary, robbery, kidnapping etc. So Apple added safety measures.

Encrypted messaging could facilitate grooming, sexual exploitation/extortion etc., so Apple added safety measures.

Encrypted photo cloud storage could be used to facilitate child pornography, making it impossible to detect, so Apple added safety measures.

Your iPhone doesn’t facilitate illicit smoking or speeding so it’s not Apple’s responsibility to add those safety features.
By that logic, Winchester and Remington facilitate murder....

It's not Apple's job to ensure that their products are being used "properly," nor is it their job to decide what "proper" is. If I want to buy the latest iPhone and use it for skeet shooting (as the skeet), that is MY business; I bought the phone. Period.
 
Hmmm. Not that I am one to easily surrender to conspiracy theories, but based on the attached link, methinks Apple is getting ready to activate the dormant CSAM framework buried in its OSes. The reason will be, "see, it is the law; we always follow the law..."

Perhaps Apple knew this all along and decided to attempt to limit the negative feedback by waiting until governments started to require CSAM removal.

https://9to5mac.com/2022/05/11/appl...e-back-as-eu-plans-a-law-requiring-detection/
 