Don’t know what Snowden is upset about. Apple is in their right to keep child abuse off their servers. This feature acts as a deterrent so that nonces don’t even try it. It protects the PRIVACY of children.
If that's the case, why don't they scan pictures on their servers instead of offloading that entire task to each individual's iPhone? I have no problem with them scanning their own servers; they own them, and we understand that we are merely renting their services. But I have a huge problem with them using space and processing power on my iPhone that I never asked for and cannot disable. Why should the majority of us suffer the consequences because of a very small handful of pedos out there?
 
Don’t know what Snowden is upset about. Apple is in their right to keep child abuse off their servers. This feature acts as a deterrent so that nonces don’t even try it. It protects the PRIVACY of children.
Apple must care a lot about CP, because they file a whopping 200–300 reports about it per year, while Facebook is in the 20 million range.

And why didn't they do something about it until now? They had the technology for it for years and did nothing. They could and still can do cloud based scans. But nothing.
 
Don’t know what Snowden is upset about. Apple is in their right to keep child abuse off their servers. This feature acts as a deterrent so that nonces don’t even try it. It protects the PRIVACY of children.

Except the child pornographers can just turn off iCloud Photo Library and won't have any issues, while the rest of us innocent users are constantly surveilled.

I would also love to see Apple cite actual data about child exploitation instead of just issuing broad proclamations about how this is such a huge problem.
 
Is Tim Cook about to retire, like a disgruntled employee walking away from a dumpster fire they started, leaving this mess for someone else to inherit?

This isn’t the same Apple I’ve bought from for almost three decades.
As said above, they are up to something, or their hand is being forced. Caring for children? Then why do album covers with near nudity get prominence on the Music app's browse page?

Great points.
Also: why the hyper-narrow focus on this one type of crime?

Why not scan for domestic abuse?
Or for people taking pics of drinking and driving?

The list could go on for days...

Not acceptable Apple.
Get out of this business before you even start.

This is turning you into an arm of law enforcement.
 
If that's the case, why don't they scan pictures on their servers instead of offloading that entire task to each individual's iPhone? I have no problem with them scanning their own servers; they own them, and we understand that we are merely renting their services. But I have a huge problem with them using space and processing power on my iPhone that I never asked for and cannot disable. Why should the majority of us suffer the consequences because of a very small handful of pedos out there?

Because it is illegal for Apple to have child abuse or any such content on their servers. By having the scan happen just prior to an upload, it protects Apple from being complicit. It should stop the image upload or the image from being sent to a recipient. Apple’s method right now is that the image will become blurred and a parent will be notified. It is better if the image simply doesn’t transmit.

You are not suffering any consequences if you’re not a nonce. It simply doesn’t affect you.
 
Don’t know what Snowden is upset about. Apple is in their right to keep child abuse off their servers. This feature acts as a deterrent so that nonces don’t even try it. It protects the PRIVACY of children.
It only applies to iCloud photos? I thought the phone scans your device photos too?
 
Sadly, this was bound to happen, as nearly every parent is giving kids iPhones they have no idea how to monitor or filter. Lots of parents have no idea how to properly parent today, so I wouldn't expect them to know how to set up parental controls either. What's even more scary is all the lower-income folks who are giving their kids Android phones that are many years out of date.
 
Except the child pornographers can just turn off iCloud Photo Library and won't have any issues, while the rest of us innocent users are constantly surveilled.

There are three choices:

- The device scans the image and should prevent it from being transmitted. The user has control and Apple doesn't have illegal content on their servers.

- The cloud scans the image and that means no user control. That means Apple gets the illegal content and the user is surveilled.

- No protections at all for children.
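
For anyone curious what the on-device option above means mechanically, here is a minimal, illustrative Python sketch of hash-based matching before upload. Everything here is an assumption for illustration: the function names and the hash set are made up, and Apple's real system (NeuralHash) uses perceptual hashes plus cryptographic threshold schemes, which a plain SHA-256 lookup does not model.

```python
import hashlib

# Hypothetical database of known-bad image hashes (illustrative value only).
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Hash the image and check it against the known-hash set.

    A real system would use a perceptual hash so that resized or
    re-encoded copies still match; a cryptographic hash like SHA-256
    only matches byte-identical files.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

def should_upload(image_bytes: bytes) -> bool:
    # On-device check before upload: only unflagged images are transmitted.
    return not matches_known_hash(image_bytes)
```

The point of the first option in the list is that this check runs before anything leaves the phone, so Apple's servers never receive a flagged file.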
 
"We get to keep our users' phones private, or we no longer make -- or sell -- iPhones in China."

I don't know about that. At the risk of shutting down this thread (sorry in advance), how are they going to resist the pressure to hand this tech to China? All the CCP has to do is say: give us this software, or you no longer have access to our market.

This will be reverse engineered by somebody for nefarious purposes.
This makes zero logical sense.

- The data available for on-device scanning is exactly the same data that goes to iCloud, so even if the CCP does ask, it is exactly the same as asking for users' iCloud data.

Once you realize that, ask yourself: if it's the same as the iCloud data, then what exactly is the slippery slope when no NEW data becomes available to scan?
 
I'm afraid I have to agree with those here who think this is about something else altogether. Unless I am sorely mistaken, this only applies to images uploaded/synced to iCloud, correct? In which case, by stating that publicly, the nonces they claim to be going after have just been given clear warning about how to avoid detection. Meanwhile, those of us for whom noncery is not a pastime will keep uploading our images to iCloud, where, with a little pressure from the right people, Apple will be able to compare those images to hashes for any number of things including, as others have stated, "political" targets and issues (those who wrongthink about party politics, gender issues, race issues, you name it).

Even with the most charitable interpretation, the fact that this timing coincides with the seemingly inevitable arrival of "health passport" apps should concern us all. Or should we believe that these massive, concurrent intrusions against physical and digital privacy and freedom worldwide are mere coincidence?
 
They care about the kids, yet no matter how parental controls are set, in the Music app on Apple TV my 7-year-old gets to scroll past three naked guys with their junk blurred out (Lil Nas X) to get to her Kidz Bop songs.
Exactly this. Either way, just about every kid over 7 now has a porn-browsing device in their pocket, with parents who have little knowledge of, or desire to attempt, restricting the device.
 
Those who are complaining obviously did not read how the technology works.

You have a higher chance of winning the lottery than of Apple erroneously looking through your photos.
That's a bad comparison: there are lottery winners every week. Counting all prize tiers and not only the jackpot, that rate would translate into millions of errors each year. And some countries have multiple lotteries every week.
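
To put rough numbers on this argument (both figures below are assumptions for illustration, not Apple's published rates), even a one-in-a-billion per-image error rate produces a steady stream of false flags at cloud-photo scale, because expected errors scale linearly with volume:

```python
# Back-of-envelope expected false flags per year (illustrative numbers only).
photos_per_year = 1_500_000_000_000   # assumed yearly uploads to a large cloud service
images_per_false_flag = 1_000_000_000 # assumed: one wrong flag per billion images

expected_false_flags = photos_per_year // images_per_false_flag
print(expected_false_flags)  # prints 1500
```

So "rarer than winning the lottery" per photo can still mean hundreds or thousands of innocent people flagged per year once you multiply by the number of photos scanned.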
 
Why are you defending Apple so desperately? Have some dignity; how the technology works is not even the point.
I'm personally not defending either side, but I will say a member has a right to defend Apple if they want, and others here don't have just cause to question them for it. It's a discussion forum, not us against the evil giant called Apple. 🙄
 
Because it is illegal for Apple to have child abuse or any such content on their servers. By having the scan happen just prior to an upload, it protects Apple from being complicit.
We get it, you don't have a technical background, let alone a PhD, but you have to understand this couldn't be simpler. The user-to-Apple connection is secured; no one else is involved or can see the data. Upload the data to Apple (it can go to a temporary buffer), then scan it there. If the result is OK, Apple proceeds as normal and puts the file in the user's iCloud account. If there is a hit, the file is not put in the user's iCloud account; instead it is deleted or stored as evidence, and a report is submitted.

This isn't rocket science. One of my former colleagues, a security and network expert, is regularly consulting on CP cases and he's often tasked with securing forensic evidence.
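
The server-side flow described above can be sketched like this. All names here are hypothetical, and real forensic handling has legal chain-of-custody requirements this toy version ignores:

```python
import hashlib

# Hypothetical database of known-bad content hashes (empty by default).
KNOWN_HASHES = set()

def handle_upload(user_id, data, storage, quarantine):
    """Scan an upload in a temporary buffer before committing it.

    Returns True if the file was stored to the user's account,
    False if it was quarantined for review instead.
    """
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_HASHES:
        # Hit: do not place on the user's account; preserve as evidence
        # so a report can be filed.
        quarantine[digest] = (user_id, data)
        return False
    # Clean: proceed as normal and store on the user's account.
    storage.setdefault(user_id, []).append(data)
    return True
```

The design point being argued is simply that the scan sits between the temporary buffer and the user's account, so Apple never "hosts" a flagged file on the account itself.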
 