Hard to believe that there's truly that level of actual child p# on something like IG / FB. Makes me wonder just how much the definition of p# has shifted. Basically, it's whatever the gov't says it is, because they control the database.

As in, you post a picture of your kid on social media. Someone uploads that picture to the CSAM database, now you're guilty. It's really that easy.
 
The discussion on this subject is so divorced from reality that it has left said reality far behind. So few read with an open mind and a willingness to understand. I truly despair of any meaningful discussion on an important subject.

If you think there is a better model currently available to address the issue, then that would be worth discussing. If you seriously believe Apple is blatantly lying in their explanation, then what possible resolution is there to this situation?
Who decides what is p# and what is just a simple pic of a child uploaded by a parent? By definition, the database is all p# so they won't open it up to scrutiny / independent review by some sort of watchdog agency.

What's to stop the gov't from scraping social media and indiscriminately uploading pics to CSAM, to then by definition get the pics defined as CSAM, and hence all sorts of new "criminals" exist to prosecute?
 
Given all the negative feedback and all these “clarification” discussions, I have a feeling they won’t backpedal on this.
Clearly a deal has been cut between Apple and the Feds over this subject. This is Apple's way of throwing them a bone instead of handing over the keys to decrypt iOS.
 
What's to stop the gov't from scraping social media and indiscriminately uploading pics to CSAM, to then by definition get the pics defined as CSAM, and hence all sorts of new "criminals" exist to prosecute?

I can think of a number of things.

1) First, the government would have to be able to have its way with the CSAM database. The person in question also needs to be using an iPhone and storing these photos in iCloud.

2) Even if this triggers a ton of false positives, it would still have to be vetted by Apple employees first. When they see that the images flagged are not child pornography, the case will not be further escalated to law enforcement. It would also be a red flag to Apple that the CSAM database has been compromised.

If the government really wanted to “fix” someone, there are probably easier ways of doing so, that don’t require so many steps or an intermediary like Apple.
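The safeguards described above can be sketched roughly in code. This is a minimal sketch, not Apple's actual system: the hash function here is a stand-in for Apple's perceptual NeuralHash, and the threshold of 30 comes from Apple's public description of the scheme.

```python
# Rough sketch of the reported flagging flow: hash matching against a
# known database, a match threshold, and human review before anything
# is escalated. SHA-256 is an illustrative stand-in for a perceptual
# hash; a real perceptual hash tolerates resizing and re-encoding.
import hashlib

MATCH_THRESHOLD = 30  # Apple's reported threshold


def perceptual_hash(image_bytes: bytes) -> str:
    # Stand-in for NeuralHash (assumption for illustration only).
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(photos: list, database: set) -> int:
    # Count how many photos hash-match entries in the database.
    return sum(1 for p in photos if perceptual_hash(p) in database)


def should_escalate(photos, database, human_reviewer) -> bool:
    # Nothing is even flagged until the match threshold is crossed...
    if count_matches(photos, database) < MATCH_THRESHOLD:
        return False
    # ...and even then a human reviewer must confirm the matches
    # before anything is reported to law enforcement.
    return human_reviewer(photos)
```

The point of the sketch is the two gates: a polluted database alone produces matches, but matches alone produce neither a flag (below the threshold) nor a report (without the reviewer).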
 
If you think there is a better model currently available to address the issue, then that would be worth discussing.

What, exactly, is "the issue"? That 0.000000001% of iCloud users might try to upload CSAM to their iCloud account? And the solution to that is to dragnet everyone?

The reason the response to this is so outlandish is because what's been explained so far accomplishes next to nothing. All this effort for near zero return makes one think that there's something more going on.
 
“Naughty” and “nice” isn’t even close to what’s going on here. I have plenty of the “naughty” variety, and you know why I won’t get reported to the cops? Because they aren’t illegal pictures.
They aren't illegal? I'll be the judge of that. Send them to me if you're so confident, and I'll forward them to the FBI.
 
I would have been impressed by a straight statement of this kind:

"We at Apple have been forced by law to perform image scans, which we are supposed to justify with the protection of children. Since we are too weak to enforce consumer interests against the institutions by pushing for a general ban on image analysis, we want to be a little better than Google and the like, and have created an instrument that is difficult to explain."

This would have put the ball back in the court of those responsible; Apple would not have lost any trust and would not have to give crude and comical interviews…
You’d be impressed by a statement that Apple would never even think of saying?
 
The way I understood it from the WSJ interview, the database will be universal worldwide, probably something like global antivirus databases?

The thought that China, Russia, et al. would allow for only one database controlled by and maintained in the US is beyond imagination. The aforementioned countries, and certainly others, will demand databases they build and control.

I am not a techie, but it seems to me the "security" is based on hashes. So it would seem that any national authority of the country in question could simply create its own database with the requisite hashes, and then force Apple to search for those hashes....

Or am I missing something.
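The poster's technical premise can be illustrated in a few lines. This is a sketch under the assumption that matching is plain hash lookup (the real system uses a perceptual hash and blinded matching, which this does not model): to the matching code, a hash database is an opaque set of digests, and the same lookup runs identically regardless of who supplied the hashes or what images they came from.

```python
# Illustration: a hash database is just an opaque set of digests, so
# the matching code cannot tell what content a database represents.
# SHA-256 stands in for a perceptual hash; the image bytes and the
# "political_db" are hypothetical examples, not real data.
import hashlib


def digest(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()


def matches(image_bytes: bytes, hash_db: set) -> bool:
    # Identical logic no matter which authority built hash_db.
    return digest(image_bytes) in hash_db


csam_db = {digest(b"known-abuse-image")}       # hypothetical
political_db = {digest(b"tank-man-photo")}     # hypothetical
# matches(img, csam_db) and matches(img, political_db) are the same code
# path; only the database contents differ.
```

Whether that premise settles the argument is a separate question; Apple's stated mitigations (sourcing hashes from multiple child-safety organizations and human review of matches) sit outside the lookup itself.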

And anyone who believes Craig's claim that they would push back and NOT accede to the request is kidding themselves. Look at Apple and China:

  1. Apple caved to China: now stores customer data on Chinese government servers
  2. Apple caved to China: now shares customer data with Chinese government. In fact, twisted itself into a pretzel to create a legally defensible way to circumvent US law
  3. Apple removes apps at the demand of the CCP: access to international news apps, anything that might mention Tiananmen, Tibet, etc.
  4. ...Etc. Think Saudi Arabia and dropping FaceTime because it is encrypted.
  5. And finally to be balanced, don't forget those gosh darn sneaky National Security Letters . . .
 
LOL. Federighi's explanation of "what is happening with your pics" starting at 2:20 is a textbook example of a terrible answer that 100% validates everyone's concerns.

Keep doubling down, Apple.
Craig must have loads of money. Why continue on and embarrass himself, letting his reputation (if it is decent) go down with Apple's? Retire with some dignity and then do something else or enjoy yourself.
 
So what I'm hearing is that once I have 30 pictures of my baby, Apple will start looking at them taking baths? No thanks.
Theoretically, you'd probably have to upload them to FB or IG first, so that they can be in turn uploaded to the NCMEC database. Now the hash checks will match.
 
The objective is not for Apple to act as law enforcement. Rather it is to satisfy the legal requirement that Apple has to ensure that their servers do not host illegal materials. This development will likely pave the way for future E2EE for iCloud Photo. There is no way for Apple to achieve proper E2EE for iCloud Photo if the check is done server side.
You both miss the point and obfuscate the main issue.

First, by scanning your DEVICE, it is no longer YOUR device. Apple will see and judge what’s on it. The only limitation is Apple’s choice. (And don’t compare to other companies. Wasn’t Apple just telling us how bad those other companies were?) Would the execs at Apple be willing to give up the same choice to others?

Second, people keep bringing up this possible E2EE iCloud encryption. Where is this coming from? I have seen enough vaporware in my time to know that speculative software is meaningless.

I am an Apple fan, but am not willing to sacrifice privacy and/or rights out of blind loyalty. This is difficult for me and means a huge amount of work and $ investment to change. However, I WILL change over this, as will family and friends. Yesterday, I talked to a Best Buy salesperson who already knew all about this and seemed to share my concerns. None of this can be good for Apple, but my guess is that they will force it through, banking on too many people being too invested to switch. We’ll see. 😞
 
There is no confusion: Apple is no longer interested in privacy and security. Any further hype about privacy and security in their ads and/or events is simply lying.
Or, they are simply more interested in the revenue streams associated with the demands of totalitarian markets than they are in your privacy. In other words, privacy has become an impediment to greater revenue growth.
 
I have been an Apple user for 20 years. My company runs on Macs. Every employee has an iPhone.
I am switching all company computers to Arch Linux. Luckily, we have not invested in Apple products with the T1 chip or M1.
Make no mistake: this is an intrusion on user space of gigantic proportion. An apologetic or fanboy position hidden behind a pseudo-technical understanding of the problem makes you look stupid. We have all invested tons of money in Apple products, some of us in the stock too. All of that is over now.
Making money is good. But making decisions on facts and on the professional opinions of people smarter than me is more important. Apple is a company with deep pockets and an interest in all aspects of our lives. All of them. Soon they will make a car.
The company philosophy of unification of platforms and a walled garden is the worst-case scenario for consumers ever.
When they hide taxes and use child labor, governments close their eyes. Suddenly your data is of utmost importance, and this software implementation gives a Door (not a Backdoor) for every government or high-level business interest to go in.
Think again before you give Apple your money next time.
My next phone will be carefully chosen, and maybe I am willing to abandon the smartphone UX altogether.

Example of smarter than me people:

Oliver Knill from Harvard is sharing his thoughts here.


P.S. Created this account just to share my point of view. I hope we all realize the seriousness of this Apple overreach.
Have a Good Day.
 
As an Apple/Macintosh user since 2007, I have totally lost confidence in the Mac platform, so I started migrating to Linux last year. macOS was once a great operating system, but since Apple is no longer primarily a computer company, it does not get the attention it deserves. Bugs aren't fixed for years, and the new features are not interesting to me at all.

The newest development in iOS/iPadOS is very concerning. I no longer have trust in this software and am currently looking for alternatives. Luckily, I do not use any Apple service other than the mandatory App Store. Apple probably won't get any more money from me...
 
The idea that Apple will be in the business of continuously updating what is essentially a global child-pornography database with every possible known image in that category, in every possible country based on age of consent, correctly identifying the exact age of any real person present in the photo, and determining whether the photo is even real, simulated, or photoshopped, even if outsourced, is as patently impossible as it is ridiculous… and it ignores the far greater threat of newly created imagery appearing on an hourly basis all over the world that will never make it into a database. Also, it doesn’t work on video.

So by all means, let us toss user privacy into the trash in exchange for a system that couldn’t possibly work and makes absolutely no sense practically.
 