People like you are the reason human rights fail, sooner or later... in a few words, that describes your obsessive and pathetic devotion to a company.
Cool story. I’m just not dumb. I’m aware of what they do on my device; I’ve read the EULA and the Privacy Policy. I’m also aware of how robust this CSAM technology is. Maybe the research paper was too complicated for you, which would fit, given your silly responses.

If you don’t like or trust what Apple is doing, there are plenty of open-source options on the market. Anyone using proprietary closed systems whilst whining about this development like a little child needs to evaluate their life choices. Enjoy your open-source solutions, and don’t let the door hit you too hard on the way out.
 
How long before Apple starts scanning our email? This is definitely a slippery slope. Tim Cook is an ******* and a hypocrite (I’ve never embraced him), and I no longer trust Apple when it comes to privacy, which is sad.

I also wonder why this is happening now, and why Apple suddenly came up with this. Who at Apple came up with this idea? Why? I too wonder if their hand is being forced, and the whole story is not being told.

Having said that, it’s time for me to go back to using my 35mm film SLR camera. And probably my Super 8mm film movie cameras and projectors as well. Before all this techo-digital crap that has been vomited out of Silicon-Valley, we didn’t have to worry about privacy. Now we worry about it constantly. Silicon Valley ushered in Big Brother, and nobody paid attention.
I believe Apple already scans iCloud email for CSAM as they had 265 CSAM reports last year and that was from iCloud email.
 
I believe Apple already scans iCloud email for CSAM as they had 265 CSAM reports last year and that was from iCloud email.
You “believe”, or you know? I have NEVER heard of that. Otherwise, it would have caused the same uproar that this surveillance of people’s photos has.
 
Cool story. I’m just not dumb. I’m aware of what they do on my device; I’ve read the EULA and the Privacy Policy. I’m also aware of how robust this CSAM technology is. Maybe the research paper was too complicated for you, which would fit, given your silly responses.

If you don’t like or trust what Apple is doing, there are plenty of open-source options on the market. Anyone using proprietary closed systems whilst whining about this development like a little child needs to evaluate their life choices. Enjoy your open-source solutions, and don’t let the door hit you too hard on the way out.

“Evaluate their life choices”

What a joke of a comment.
 
Anyone using proprietary closed systems whilst whining about this development like a little child needs to evaluate their life choices. Enjoy your open source solutions, don’t let the door hit you too hard on the way out.

What an incredibly immature way to converse with people.
Come. On. Let's do better than devolving to that type of comment please.
:confused:
 
“Evaluate their life choices”

What a joke of a comment.
If you use a closed system and expect total privacy, then yes, you need to evaluate your choices. What all the crybabies here are looking for is called open source.
 
I believe it was Mark Twain who said, "Half of the results of a good intention are evil."

Yep!
Complete quote below:

And no such thing as an evil deed. There are good impulses, there are evil impulses, and that is all. Half of the results of a good intention are evil; half the results of an evil intention are good. No man can command the results, nor allot them.
 
If the hashes are hard-wired into iOS or any Apple software, it will end in tears for Apple. It's always hard to backtrack on what you thought was a good idea but isn't, but the worst thing anyone can do is then try to justify a crap idea and go forward with it.

I've already alerted certain authorities in the UK, EU, and USA, as I consider this so serious that it warrants action within most countries, the exceptions of course being countries run by dictators. I spoke to an MP here in the UK last night and explained my concern, which he agreed with: in the past, Apple have consistently stood on a pedestal, in public and in the courts, to explain their refusal to cooperate with law enforcement and counter-terror organisations on the basis of upholding the PRIVACY of the individual and opposing SURVEILLANCE.

What we are seeing here is a typical situation where a company grows so big, it wields more power than many countries, in common with Facebook, Amazon etc., and where then they start to develop megalomania.

This, ironically, is how we get the Epsteins: power corrupts, and absolute power corrupts absolutely.

It's always done in the name of good, but it seldom is.

I met with my fellow partners yesterday and we will no longer recommend Apple equipment, nor use Apple equipment of any sort should this situation continue.

The agencies in the UK and the USA who I have worked with in the past will confirm they have not been in contact with Apple over this, and indeed could never sanction releasing hashes to Apple or any other company. Indeed, on the occasions when these agencies have requested assistance over multiple threats, Apple have always refused.

Even Apple's answer after this issue gained so much attention was NOT accurate: they refer to comparing hashes, but in order to do that you have to have hashes to compare with... you would need a complete database from crime agencies in order to do this. Apple do NOT have access.
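To make concrete the point that comparing hashes requires a database to compare against, here is a minimal Python sketch of database-driven matching with a reporting threshold. Everything here is hypothetical: real systems of this kind use perceptual hashes (Apple's is called NeuralHash) rather than the cryptographic SHA-256 used below, and the database entries and threshold are made up for illustration.

```python
# Minimal sketch of hash-database matching (illustrative only; the database
# contents, hash function, and threshold are hypothetical, not Apple's
# actual pipeline).
import hashlib

# A provider-supplied database of hashes of known images (hypothetical values).
known_hashes = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

REPORT_THRESHOLD = 2  # number of matches before anything is flagged


def count_matches(images: list) -> int:
    """Count how many images hash to an entry in the known database."""
    return sum(
        hashlib.sha256(img).hexdigest() in known_hashes
        for img in images
    )


uploads = [b"known-image-1", b"holiday-photo", b"known-image-2"]
matches = count_matches(uploads)
flagged = matches >= REPORT_THRESHOLD  # only flag above the threshold
```

The sketch illustrates the poster's point: without access to the `known_hashes` database, there is nothing to compare uploads against, and the matching step cannot run at all.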

Crime agencies, however we may feel about some of their excesses, do a job we understand, which ironically includes surveillance on occasion, hopefully for the greater good, but even then they are subject to checks and balances. It is so hypocritical, then, for Apple to believe THEY ARE NOW THE LEGITIMATE WORLD POLICE FORCE.

Would I even trust facial recognition now on an Apple device? Will the next idea be storing facial-recognition data to do with as they wish? Will Touch ID include keeping copies of fingerprints, to do with what they wish?

I understand that certain government security agencies eavesdrop on subjects of interest, use surveillance, and invade privacy to fight crime etc., and even there I have significant reservations. But I do not expect a private company, with no authority or mandate from the customer or the electorate, to do so, let alone engage in obfuscation and try to justify that action in the name of saving children, with software that does not do what they say, will not do what they say, and where they don't even have access to the hashes they are purportedly comparing photos with. Their actions will make it harder for the agencies employed to do that work in our name: paedos and criminals will take preventative action and go much deeper into encryption, VPNs, the dark web etc., which will make Apple's back-of-a-fag-packet idea useless and in the process make it even more difficult for the agencies dedicated to that work, agencies who are under the jurisdiction of governments, for the most part freely elected ones.

This idea was not to protect children; it was to establish the principle of APPLE being empowered to invade privacy and engage in surveillance, an ultimate irony considering their public stance on those same subjects.

A very slippery slope they should never go down.

My company guards our privacy and our customers' data, so how could we possibly trust Apple if this idea goes ahead? I will not be posting more on this particular point, as I hope it will become sub judice if enough people and democratic governments take action to thwart this.

I'm sure some dictatorships and authoritarian states will be very pleased at Apple's actions, and the cynic in me wonders whether this could be the result of those countries exerting pressure on Apple to fit in with their own surveillance states.

Difficult for me, as I've had a good relationship with Apple since 1976.
 
So a perv can have 3 GB of child porn on his phone but needn’t worry as long as he doesn’t upload it to iCloud?
Yep, this is more about mitigating any legal or governmental action as a result of Apple storing images on its servers than it is about anything else.

Well, Judge... we tried to stop it getting there... we just set the threshold too low... it should have been any images at all rather than a collection greater than a certain size...
 
Maybe Apple needs some sort of customer council if they are not aware of the privacy and legal implications beforehand. Fighting child porn is not a free ticket to do whatever they want, ignoring all existing legal boundaries and customer respect elsewhere. I still hope they abandon the entire idea, and I don't understand why they have thrown away their claimed reputation for supporting customer privacy.
 
They weren't "already" using our devices to do it.

I don't understand why those that are ok with this can't understand that concept. Maybe it's a lack of appreciation of what true ownership means?

I'm still not convinced whether I'm OK with this or not, as, to be honest, I do fully understand how and what they're doing, and why they're doing it. But they're using my device to do what they can just as easily do in the cloud. Yes, it's less private in the cloud. But I (and anybody else using iCloud) have accepted that already.

And if we're being 100% honest, if they're already scanning in the cloud, isn't this a step backwards from the standpoint of helping out the NCMEC? Now people with CSAM are aware of a way they can keep their filth better hidden.
If they were already scanning for CSAM content on iCloud servers, one must ask why they need to move this software onto Apple devices.
 
Great link to the Jason Snell piece.
His head on this is in exactly the right spot.

He is also very right to point out how the announcement of E2EE for iCloud is totally absent here. A lot of people would be hanging their hat on the combination of that change and the scanning tool... but that didn't get announced.

Additionally, I think there are a lot of us who, encryption or not, are simply against these kinds of tools being installed on our devices.
 
Law enforcement has every tool and right to look for criminals. But Apple, being a private company, wants to treat every customer as a suspect. I accept that their servers are scanned, but I don't accept that my device is scanned for no reason by a private company.
 
So a perv can have 3 GB of child porn on his phone but needn’t worry as long as he doesn’t upload it to iCloud?

That fact alone doesn't feel talked about enough.

It's like --- if we are SO concerned about CSAM (as all the defenders of this scanning change purport to be)...
How long before the scanning is hitting all photos on a device?

It's "to save the kids" after all, right?


***I'm absolutely against CSAM. But I'm against a LOT of awful things... it doesn't mean I want to suddenly search everyone's devices looking for all of it.
 