
KaliYoni

macrumors 6502a
Feb 19, 2016
608
1,074
That’s not the point. The point is Apple has a back door now. 10 or 20 years ago, things we see today as abhorrent were commonplace. What’s it going to be like 20 years from now, when perhaps what you like today will be taboo?
I'll deal with it in 20 years' time, if the problem becomes real. I'm busy enough with today's privacy and security threats, such as ISP and mobile phone provider tracking, "Login-with-XXX" tracking, unwanted surveillance by apps, security breaches at credit reporting agencies, Facebook cookies/beacons/embeds, adware, malware, etc etc etc, to worry about theoreticals.

YMMV
 

ian87w

macrumors 601
Feb 22, 2020
4,685
6,715
Indonesia
Apple confirmed to MacRumors that the company will consider any potential global expansion of the system on a country-by-country basis after conducting a legal evaluation.
To me, this is Apple pretty much advertising its capabilities to other countries by implying that “we can customize these things to suit your lawmakers’ requirements on a per-country basis.”

Not really a good statement, Apple.

And again, for those blindly agreeing to this: try living in a country that has strict censorship. This is a Pandora’s box, and Apple just advertised to other governments that it can be tailored on a country-by-country basis.

This really sends a chill, especially given the recent political turmoil in Asia (Hong Kong, Thailand, Myanmar, etc).

Meanwhile, can we get a word from the… let’s say… the Vatican? Just asking. I mean, let’s be honest: the issue of child abuse lies deeper in the governments and institutions themselves. This mass encroaching scan seems very sus when particularly specific points are largely ignored.
 

ian87w

macrumors 601
Feb 22, 2020
4,685
6,715
Indonesia
Other countries already knew “apple can do this.” Nothing has changed in that regard. And some other countries already require apple to do stuff you’d object to. Again, nothing has changed.
This is different. In the past, each country/institution had to make specific requests to Apple and then wait for Apple to do something about it, and oftentimes Apple refused or said it was impossible (e.g. the FBI request).

This is Apple actually advertising a mass feature (a backdoor) that will be pushed to every iPhone in the world, and now Apple is telling the world: “hey, we can do this, and it can be tailored to your needs. Come get some.”

This makes Apple the actual bad actor. Although I’m glad it’s out in the open so we know about it, it paints a gloomy digital future.
 

ian87w

macrumors 601
Feb 22, 2020
4,685
6,715
Indonesia
Is this why Apple will maintain iOS 14 alongside iOS 15? I was quite suspicious when Apple said it would still maintain and support iOS 14. This was never done before, as Apple tends to force supported devices to upgrade by not supporting the older iOS on eligible devices. So maybe someone at Apple is still trying to do better. Let’s hope.
 

DontGetTheCheese

macrumors 6502
Nov 22, 2015
294
199
Apple doesn't give 2-F's about "the children", they just don't, and we shouldn't pretend otherwise. This is about one thing: Apple trying to do a very delicate dance between doing business with repressive regimes (hi, China and friends) and keeping law enforcement off their backs. They picked "the children" because it's the easiest thing to sell to the public. Heck, some people will actually think this is a good thing.

It's a business decision, everything else is PR BS, because that's all Apple is. They aren't your friend, they don't care about your privacy, they don't care about you, except insofar as you buy their products.

They just aren't honest about that like the competition is.
 

poked

macrumors 6502
Nov 19, 2014
267
147
The stock drop in AAPL speaks for itself. If you want real-time consequences, there they are. After all, Apple doesn’t give a fck about the consumer; they care about their stockholders. So sorry to those who bought the stock hoping for a good investment. While I agree fundamentally with this policy on the CP side and personally have no issue with a scan, I can’t condone the consequences this could have for the average person, especially with backdoor capabilities. That paints a target on Apple’s back like no other.
 

antiprotest

macrumors 68000
Apr 19, 2010
1,784
1,937
There
Is
No
Back
Door

A backdoor would be if Apple could scan non-iCloud data: read your notes not on the cloud, scan your selfies not on iCloud.

Apple does not have the ability to scan non-iCloud data.

Repeat:

Apple does not have the ability to scan non-iCloud data.

Slower:

Apple. Does. Not. Have. The. Ability. To. Scan. Non-iCloud. Data.

This means: if you turn iCloud Photos off, Apple can’t scan anything.

slower this time :

If. You. Turn. iCloud. Photos. Off. Apple. Can’t. Scan. Anything.

got it ?
Got. It.
There. Is. No. Back. Door.
It. Is. A. Front. Door.
A. Continuously. Revolving. Automatic. Front. Door. To. The. Millions. Upon. Millions. Upon. Millions. Who. Have. iCloud. Enabled.
That's. A. Seriously. Invasive. Rectal. Insertion. SIRI.
 

boswald

macrumors 6502a
Jul 21, 2016
667
875
Florida
Can you point to any law that directly states privacy is a right?
If that has to be a law, we’ve failed. We’ve been screwed over many times by the government, for example with illegal wiretapping, NSA surveillance, and who knows what else, but only because we let it happen. We have no power to challenge the government, but we can vote with our wallets. We shouldn’t have to settle because “well, at least it’s better than Android.”
 

xWhiplash

Contributor
Oct 21, 2009
4,780
3,388
It's not literally 1 in a trillion. Hash collisions are more like 1 in 2^256 if it's SHA-2, as far as we know. Bitcoin is testing that theory constantly.
This just seems like a major contradiction. How can Apple’s system get a match on a cropped, color-adjusted, pixel-modified, distorted image, yet claim false positives will be so low? There must be some leeway in the hash matching if a unique Photoshop edit can still be caught.
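For what it's worth, the "leeway" comes from this being a perceptual hash, not a cryptographic one like SHA-2. Apple's NeuralHash details aren't fully public, so here's a minimal toy sketch of the general idea using a simple "average hash" stand-in: hash bits record whether each pixel is brighter than the image's mean, so a global brightness or color edit barely changes the hash, while genuinely different content lands far away in Hamming distance.

```python
# Toy illustration of why perceptual hashes tolerate edits (NOT Apple's
# actual NeuralHash algorithm, which uses a neural network; this "average
# hash" stand-in is an assumption chosen for simplicity).

def average_hash(pixels):
    """Return hash bits: 1 where a pixel exceeds the mean brightness."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing hash bits: small = perceptually similar."""
    return sum(x != y for x, y in zip(a, b))

original = [10, 200, 30, 220, 15, 210, 25, 230]   # toy 8-pixel "image"
brightened = [p + 20 for p in original]            # mild levels/color edit
different = [200, 10, 220, 30, 210, 15, 230, 25]   # unrelated content

h0, h1, h2 = map(average_hash, (original, brightened, different))
print(hamming(h0, h1))  # 0: the edit shifted the mean too, bits unchanged
print(hamming(h0, h2))  # 8: every bit differs for the unrelated image
```

A cryptographic hash would flip about half its bits on any one-pixel change; a perceptual hash is deliberately stable under edits, which is exactly why the false-positive math is about near-collisions in hash space, not SHA-2-style collisions.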
 

trainwrecka

macrumors 6502a
Apr 24, 2007
500
632
Earth
I’ve nothing to hide, but this just doesn’t seem right to me.

I’m not updating any existing device to iOS 15 until this rollout is stopped. I don’t want my photos scanned, and I don’t want it happening to my children’s messages. I ensure my children are safe myself. There’s a level of trust, and these sorts of forced policies just don’t sit right with me.

No iOS 15 for my devices, and I'm canceling iCloud. I don't want my personal information scanned against a govt database for any purpose. There are better ways to find and arrest these people, that don't involve invading my privacy.
 

LV426

macrumors 65816
Jan 22, 2013
1,310
1,170
We must do our best to remove these images from circulation. This is a good idea. But no, people are crapping on it because FREEDUM!

There are ways and ways, and this is a bad way. It will also ultimately serve no useful purpose, because your average paedophile will just turn off iCloud Photos once this becomes common knowledge. They will carry on getting their sick material from the usual sources. Meanwhile, everyone else will continue to have their phone content lazily scanned and evaluated by this new on-device spyware, waiting for the hair trigger when government agencies insist on further on-device monitoring. Governments are itching for that capability; it’s a stated objective of the GCHQ spy agency in the UK.

If you are a UK citizen, something to bear in mind is that the government can now insist (by law) that phone manufacturers provide technical means to access private content. The phone manufacturer must comply, and the phone manufacturer would commit an offence if they disclose that such an order has been made. It’s unofficially called “the snooper’s charter”.
 

hagar

macrumors 65816
Jan 19, 2008
1,253
2,716
Why does everybody talk about the government? When the system detects multiple known child abuse pics in an iCloud library, Apple will disable the account and raise an alert at NCMEC, a private NGO. Not the government.

Also, I love it when people say they’re going to switch platforms because of this CSAM stuff, while ignoring that other companies have had these systems in place for years.

Google, Facebook and Microsoft put similar systems in place: Google implemented a "PhotoDNA" system in 2008 with Microsoft following suit in 2009. Facebook and Twitter have had similar systems in place since 2011 and 2013 respectively.

If nothing else, Apple is late to the party.
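As an aside, the "multiple known pics" part is a threshold scheme. Apple's published design reportedly uses threshold secret sharing so its servers can't decrypt any match details until the threshold is crossed; the plain counter below is a deliberately simplified sketch of just the flagging policy, and the threshold value of 30 is the figure Apple later cited, treated here as an assumption.

```python
# Simplified sketch of threshold-based flagging (assumption: a plain counter;
# Apple's actual design reportedly uses threshold secret sharing so nothing
# below the threshold is even readable server-side).

MATCH_THRESHOLD = 30  # figure Apple reportedly cited; an assumption here

def should_flag(match_results):
    """Flag an account only once the count of known-database matches
    crosses the threshold; isolated false positives stay below it."""
    return sum(match_results) >= MATCH_THRESHOLD

print(should_flag([True] * 5 + [False] * 100))  # a few matches: not flagged
print(should_flag([True] * 31))                 # over threshold: flagged
```

The design intent is that one or two accidental near-matches in a library never surface to anyone; only an account accumulating many matches against the known database triggers review.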
 

retta283

Cancelled
Jun 8, 2018
2,846
2,894
Victoria, British Columbia
The people creating and consuming the content in question will continue to do so largely unaffected by this change; most are desperate enough to become wickedly resourceful and adaptive in order to 'survive'. You can make it harder for them, but you will not stop them. It's a tragedy, but one that needs consideration I don't believe it has been given yet.

Therefore, all this does is set a dangerous precedent. Do you really think that if the CCP sees this and decides to ask Apple to let them use this tech with a puppet company's database, or else they will ban Apple from China, that Apple would refuse? They will cave as they always have. There are few morals or standards in the business of mega-corps. Technology is no longer the Wild West it once was, and this is the result. Hijacked by the State.
 

bsolar

macrumors 65816
Jun 20, 2011
1,150
892
Why does everybody talk about the government? When the system detects multiple known child abuse pics in an iCloud library, Apple will disable the account and raise an alert at NCMEC, a private NGO. Not the government.

They talk about the government because law enforcement will definitely get involved. From Apple's documentation, emphasis mine:

NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.

Not to mention the tendency of governments to want to extend the scope of surveillance once in place.
 

dmelgar

macrumors 68000
Apr 29, 2005
1,573
129
I’ve been defending Apple for a long time as the one company concerned about privacy.
Now that reputation is gone. Hard to ever get it back.
Sounds like storage on the phone and in iCloud cannot be trusted.
Lots of good intentions. But you either have to be for privacy or not.
Apple is not.
 

Expos of 1969

Cancelled
Aug 25, 2013
3,271
6,181
Many questions and comments about how Apple will do it, with varying theories and opinions. Equally or more important is WHY Apple has decided to get into this. It has nothing to do with their core business. Are they being coerced into it by the government, or does Tim view it as another perceived feather in his cap for safeguarding his customers? In either case: very worrying, ill-conceived, and it won't accomplish much.
 

cosmichobo

macrumors 6502a
May 4, 2006
753
413
I used to work in a government agency, which used filters on its emails to detect people sending sexual images.

Got an email one day saying I'd sent inappropriate material, which was going to be reviewed and possible disciplinary action taken.

The picture? Something like this:

[attached image: 1628314040919.png]


The high yellow content triggered the filter; it thought the picture was lots of skin.

Now - from the above information... that's not what Apple is doing at all. They are scanning for specific, confirmed pictures of abuse. That part at least is "sound". It's not as if you're going to get a knock on the door from the police because you sent a picture of Australia.

Of course, this is bigger than the application that Apple is currently applying. Considering the platform of privacy they've built, it seems a bit strange...
 

giggles

macrumors 6502a
Dec 15, 2012
835
694
Apple: here’s a system that allows us to do pedo-pics house cleaning on our servers by actually looking at law abiding citizen pics LESS, not MORE, because they receive a pre-SCORE locally based on a super accurate matching system

Reactions: shame on you! not a privacy company anymore!
 

ian87w

macrumors 601
Feb 22, 2020
4,685
6,715
Indonesia
Apple: here’s a system that allows us to do pedo-pics house cleaning on our servers by actually looking at law abiding citizen pics LESS, not MORE, because they receive a pre-SCORE locally based on a super accurate matching system

Reactions: shame on you! not a privacy company anymore!
Who is defining what the "pedo-pics" are? Who's determining your score?
Also remember, in the US, nudity is frequently considered porn. And often we're not even talking about real humans. Compare what's acceptable in the US vs Japan.

Also, in other countries, like mine, kissing is considered indecent. In some other countries, same-sex relationships can lead to the death penalty. Apple is advertising this feature to the world, saying it can tailor/customize the system on a per-country basis. Do you see the implication of your simplistic mindset?
 

giggles

macrumors 6502a
Dec 15, 2012
835
694
Who is defining what the "pedo-pics" are?

A well-established repository of known CSAM?

Interesting blog post that argues that this move by Apple may also work to shut up the “think of the children” argument from people asking for actual backdoors:


So this would work exactly the opposite of a slippery slope.

When all is said and done, what actually keeps these companies from doing evil or overreaching is backlash in free countries and subsequent legislative pressure. So I don’t fully buy the slippery slope argument. It’s like asking “why is Google search censoring CP results, what’s next, will they censor confederate flags and same-sex kissing?”. Yeah, if they try that, the backlash will be glorious. Even if they do it abroad. Of course there are foreign laws, and the sad reality is these companies can’t change the world and overthrow dictators on their own. They live in a constant equilibrium between backlash and pressure. Even what’s happening today in terms of backlash is somehow “healthy” for the system, I suppose.
 

Macaholic868

macrumors 6502a
Feb 2, 2017
587
760
CSAM? Must we have an acronym for everything? Enough. If Apple wants to spy on my photos, I’m out. The last thing innocent iPhone users need is to run into an inevitable bug that has the cops knocking on their door like they’re a lowlife, with a warrant to search everything they own. iCloud Photos is now shut off. Try something like this again and hello, Android. Sorry, but bugs are inevitable, and all it takes is one to do serious damage. I appreciate the intent, but this is way too much Big Brother for me. I don’t need Uncle Apple going through my photos. I’ve beta tested iOS. If they think this won’t backfire and there won’t be bugs, then I’ve got a bridge in NYC to sell them. Seriously Apple, WTF????
 