But it won't. You're taking the risk because it makes you feel better to hear grandstanding and worthless rhetoric.

It would, however, save one child if we stopped using automobiles (4,200 per year). What is your stand on that?

It would, however, save one child if we stopped vaccinating children. What is your stand on that? I know, more would die if they were not vaccinated. But you don't know that, because the CDC does not keep accurate records. Read the CDC documents and see what contortions they go through to justify the vaccine deaths.

It would, however, save one child if we stopped letting children swim (1,200 per year). What is your stand on that?
His stand probably is that a random consumer electronics company should do the police’s work, without being elected or answering to anyone but the dollar, as we’re seeing here.
 
Why would Apple be so eager to search for child porn and not for terrorists or enemies of the state?

Maybe because it makes for better marketing?

Oops!
It's precisely because they might scan for "enemies of the state" in the future that this is a concern!

Remember, you may not be an enemy of the state today, but the state can change, and you might end up as such tomorrow.
 
Of course, I forgot the /s

It shocks me that in so many people’s minds every state is this perfect, all loving, divinity-like entity that knows better than anybody and should control everything you do. Really, really shocking. It’s just people with power, ffs
 
So millions of people can now safely purchase an iPhone 13 without surveillance software pre-installed at the factory. That's a relief. Guess I don't have to completely bail out of the Apple ecosystem just yet.
 
Just don't scan on my device, Apple. I don't mind if you scan for such material on iCloud; they're your servers so you have the right to keep such content off them. But my device is sacrosanct; don't scan there.

What's on my iPhone stays on my iPhone, right? Apple made that promise!
Don't use iCloud Photos and it won't scan your device. If you do use iCloud Photos, why does it matter where it gets scanned?

Apple did promise that what happens on your iPhone stays on your iPhone, which is why they are doing the scan on device, and only if you enable and agree to use iCloud Photos (which is you consenting to said photos being sent off your iPhone).

It's all about consent, and everyone who agrees to the terms and conditions and enables/uses iCloud has given it.
 
Never claimed that pedos were all that bright, since they do all their thinking with their dangly bits.
They don't have to be bright in the slightest to avoid these measures. Just marginally less stupid than a sack of carrots. As it happens, most of them are probably not stupid in the slightest. They are devious.
 
If Apple wants to reduce the spread of CSAM, then the approach of scanning users' iCloud photos really isn't the way to go, since it only works on existing, known images. That being said, I do support the other features they were looking to add, namely allowing Messages to alert parents to nudity and adult content being sent or received on child accounts. Apple could take it a step further and prevent the camera on child accounts from capturing images that include nudity at all. This would have a far bigger impact on preventing predators from soliciting explicit images from unwitting minors, as well as cutting down on the epidemic of teen sexting and subsequent revenge porn, all while leaving privacy intact for adult users.
 
If I were Craig, I would just implement the local scanning mechanism for anything going to Apple's servers, and when a kiddie-porn match is found, the iPhone would isolate it into a folder/album. Then it would let the user know that, due to regulations, Apple is not able to host this content, and reject that specific upload to iCloud. The match should neither be logged on the device nor reported to Apple, so no outside party would be notified.

This implementation saves the fire on Apple's butt without snitching on users.

I think most reasonable people, including Snowden, would agree that this is not a backdoor. This is just like, for example, scanning for viruses before uploading an attachment, or scanning for unsupported image formats, or checking if the file is too big to be sent over an email, etc.
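The pre-upload check being proposed here could be sketched in a few lines. This is a toy illustration only: real photo-matching systems use perceptual hashes (such as Apple's NeuralHash) rather than cryptographic digests, and the hash database, function name, and return convention below are all invented for the example.

```python
# Hypothetical sketch of a purely local, pre-upload check that silently
# rejects a matching file: nothing is logged and no one is notified,
# the file is simply never sent to the server.
import hashlib
from pathlib import Path

# Invented stand-in for a database of known-bad digests. A real system
# would use perceptual hashes, not SHA-256, so near-duplicates also match.
KNOWN_HASHES: set[str] = set()

def should_upload(path: Path) -> bool:
    """Return False (reject the upload) if the file matches a known hash."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest not in KNOWN_HASHES
```

The upload client would call `should_upload()` before transmitting each file and skip any that fail, without recording the result anywhere.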
Except end-to-end encryption should prevent anything on Apple's servers from being identified as CSAM, because that's the point of end-to-end encryption. Researchers figured out a way to circumvent this, and then immediately advised companies NOT to do it.


Companies are required to report CSAM material to the authorities. If Apple were to just shift it into a folder/album, then Apple would be hosting child-exploitation material themselves. The mere transfer of that material to their own storage would make them, probably overnight, the largest host of CSAM on the internet.

I don't really know of a system that scans attachments on upload, aside from a file server itself, and in that case you're talking about a corporate file server. When you're sending work email through a work server, you're sending email on behalf of the employer, and they can do whatever they want.

But even in that case, the whole of the device isn't scanned for materials, and in that case encrypting the data prevents the email server from being able to scan it for viruses.

Looking at a file's size doesn't involve looking inside the file's content. Checking for a supported file format doesn't require the email to be read by anyone. It's simply rejected.

In this case, when the data reaches a threshold, it's passed to a human to look at the content. What happens when, instead of CSAM, they read something that's anti-government in a country where that's illegal? Nothing stops a government from requiring a company to scan every piece of email or every attachment for anti-government sentiment. It already happens in China and Hong Kong.
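The threshold mechanism described above can be sketched simply: individual matches reveal nothing on their own, and only once the count crosses a threshold is the case escalated to a human reviewer. The class and default value below are illustrative assumptions, though Apple's announced design did use a threshold of roughly 30 matches before human review.

```python
# Hedged sketch of threshold-gated escalation: matches accumulate
# silently until the threshold is reached, then a reviewer is involved.
class MatchCounter:
    def __init__(self, threshold: int = 30):  # ~30 was Apple's announced figure
        self.threshold = threshold
        self.matches = 0

    def record_match(self) -> bool:
        """Record one match; return True once human review is triggered."""
        self.matches += 1
        return self.matches >= self.threshold
```

The privacy concern in the post is exactly that nothing in this structure constrains *what* the matches are: swap the hash database and the same counter escalates anti-government content to a reviewer just as readily as CSAM.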
 
Will not be upgrading any of my software until it’s repealed completely. If that doesn’t happen then I have bought my last Apple product.
I mean, okay? It's not implemented; couldn't you just throw out your devices if it ever is? Let's say Apple just tries to avoid attention over dropping these plans and pushes it off forever. Will you never upgrade your OS?

For this functionality to be employed it has to be activated. Just bookmark this site, you'll know if/when it's on, and go trashcan your stuff then.
 
Wrong.

Craig acknowledged ONLY the confusion amongst media and Apple iOS users.

He was STEADFAST in upholding CSAM scanning - watch the interview with Bloomberg.
A steadfast man, faced with the consequences of lawsuits in dozens of countries and users being willing to switch platforms, can become less steadfast pretty quickly.
 
I understand why this is a slippery slope but I don’t like the idea of child predators breathing a sigh of relief.
Child predators probably are dumb but likely not this dumb. The rate of criminal prosecution resulting from this technique is likely zero.
 
Maybe a happy compromise would be for Apple to enable this only in countries enjoying a certain threshold of freedom/privacy, and not in countries that regularly abuse it. The latter could have end-to-end encrypted cloud storage enabled instead, meaning their users could store all the Winnie-the-Pooh photos or anti-government content they could dream of without fear of retribution.
The thing is, the bulk of child porn is consumed in the US and produced in Japan and the Philippines. China culturally does not sexualize children. Neither did Japan, actually, until anime and rorikon (the lolita obsession) kicked in; this is a relatively recent social phenomenon. US soldiers stationed in the Philippines are the biggest consumers, not of child porn, but of child prostitution.

I researched this extensively when I vacationed in the Philippines and Thailand. I was appalled by the treatment of women and the condition of children there. Thailand was much better in comparison, but they have the issue of the "transgender freak show": boys from poor families are sold off to be neutered and end up in circuses as ladyboys. Those ladyboys don't just show you their bits and entertain you; they do things that are truly damaging to their bodies, which is why they live very short lives. For example, there are the things they inject to keep themselves looking a certain way, and in the show itself, think of stainless steel ping-pong balls coming out of places that are not supposed to host eight ping-pong balls. I will leave you to imagine the rest.

Edit:

To clarify, "lolita" generally refers to girls around 14 years old. Rorikon refers to much younger, preteen girls.
 
Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It's unbelievable!
First of all: I don't buy products to satisfy the latest "humanitarian" or "social justice" goals of a company's executive management or employees. I buy products to satisfy my needs. Secondly: those "self-centered advocacy groups" consist of the entirety of every security and privacy expert and group on the planet.

The lack of backbone by Apple on some of these controversial situations they find themselves in is just astonishing to me.
I understand your point, but should they have allowed pride and hubris to overrule good sense?

But damn... game that out on the front end before you put yourself out there...
That would be wise. They should have floated this concept, then decided whether to proceed with it or change course. Instead what they've accomplished is to have spent a bunch of time and money on development and damage control, damaged their reputation among a segment of their previously-loyal customer base, and given their competitors ammunition.

Had they floated the idea first, I probably wouldn't now be in the process of backing us further and further out of the Apple ecosystem. I might have suspended certain planned purchases to await further developments, rather than cancelling them outright.

Plus, the toothpaste of broken trust can't simply be put back into the tube: they insisted we didn't understand it, that we were looking at it wrong, etc., etc.
This ^^^^^, precisely.

I've already backed my family partially out of the Apple ecosystem and cancelled pending purchase plans. Those things are permanent. We will not be going back into the Apple ecosystem any deeper than we are now, and will probably continue backing further out.

The trust I had in Apple is gone :(

Just in case folks are not aware (although I'm sure many of you are) https://www.apple.com/feedback/iphone/
Did that by the Monday following the original announcement.

People really don't understand this system...
Yeah, ex-software developer plus over twenty-five years in IT admin, and I "didn't understand the system." Every last security and privacy expert and group on the planet "didn't understand the system." Yeah, that was the problem. :rolleyes:


A little sacrifice of "freedom" isn't worth it for a big reward?
"Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety." -- Ben Franklin

If end-to-end cloud encryption is a possible goal, then this implementation defeats that purpose.
Apple has given us no indication whatsoever that E2EE was going to happen and, in fact, has recently backed off plans to implement it. Certain three-letter agencies didn't like it.
 