> If you don't possess such materials, what are you so afraid of?

We're coming over tomorrow to search your bedroom. If you don't possess such materials, you should be OK with it. Please post your address and phone number.
> We're coming over tomorrow to search your bedroom. If you don't possess such materials, you should be OK with it. Please post your address and phone number.

Also, we don't care if you're asleep or not.
> There is no confusion: Apple is no longer interested in privacy and security. Any time they hype privacy and security in their ads and/or events, they are simply lying.

To be fair, it’s always been about marketing… none of these companies care.
> For a normal hash, yes. The hash they are using is something different:

Thanks for the link - interesting read.
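For anyone following along, the distinction the quoted poster is gesturing at is a cryptographic hash versus a perceptual hash. Here is a minimal sketch of the idea, using a toy average hash over made-up pixel values; Apple's NeuralHash is an ML-based perceptual hash, far more sophisticated than this:

```python
# Illustrative only: why a cryptographic hash can't match "visually similar" images,
# while a perceptual hash is designed to. The 16-pixel "images" below are made up.
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is above the mean."""
    mean = sum(pixels) / len(pixels)
    return ''.join('1' if p > mean else '0' for p in pixels)

original = [10, 200, 30, 220, 15, 210, 25, 205, 12, 198, 28, 215, 11, 202, 27, 208]
# The same picture slightly brightened/re-encoded -- every byte differs.
tweaked  = [12, 202, 32, 222, 17, 212, 27, 207, 14, 200, 30, 217, 13, 204, 29, 210]

# Cryptographic hashes: a tiny change flips roughly half the output bits.
print(hashlib.sha256(bytes(original)).hexdigest())
print(hashlib.sha256(bytes(tweaked)).hexdigest())

# Toy perceptual hashes: the small brightness change leaves the hash identical.
print(average_hash(original))
print(average_hash(tweaked))
```

A cryptographic hash only catches exact copies; a perceptual hash is built to stay stable across re-encodes, crops, and brightness tweaks, which is what makes matching against a database of known images possible at all.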
> Facebook/Instagram reported over 20 million instances of flagged CSAM images just last year alone.

Hard to believe that there's truly that level of actual child p# on something like IG / FB. Makes me wonder just how much the definition of p# has shifted. Basically, it's whatever the gov't says it is, because they control the database.
> Protest with your wallet, folks. All the moaning here is futile.

And all the moaners will still buy the new iPhones and new Macs.
> The discussion on this subject is so divorced from reality it has left said reality far behind. So few read with an open mind and a willingness to understand. I truly despair for any meaningful discussion on an important subject.

Who decides what is p# and what is just a simple pic of a child uploaded by a parent? By definition, the database is all p#, so they won't open it up to scrutiny / independent review by some sort of watchdog agency.
If you think there is a better model currently to address the issue, then that would be worth discussing. If you seriously believe Apple is blatantly lying in their explanation, then what possible resolution is there to this situation?
> Given all the negative feedback and all these “clarification” discussions, I have a feeling they won’t backpedal on this.

Clearly a deal has been cut between Apple and the Feds over this subject. This is Apple's way of throwing them a bone instead of handing over the keys to decrypt iOS.
What's to stop the gov't from scraping social media and indiscriminately uploading pics to the CSAM database, so that by definition the pics get defined as CSAM, and hence all sorts of new "criminals" exist to prosecute?
You can't secretly AirDrop pictures to other people's phones.
If you think there is a better model currently to address the issue, then that would be worth discussing.
> “Naughty” and “nice” isn’t even close to what’s going on here. I have plenty of the “naughty” variety, and you know why I won’t get reported to the cops? Because they aren’t illegal pictures.

They aren't illegal? I'll be the judge of that. Send them to me if you're so confident, and I'll forward them to the FBI.
> I would have been impressed by this kind of straight statement:
>
> "We at Apple have been forced by law to perform image scans, which we are supposed to justify with the protection of children. Since we are too weak to enforce consumer interests against the institutions by pushing for a general ban on image analysis, we want to be a little better than Google and the like, and have created an instrument that is difficult to explain."
>
> This would put the ball back in the court of those responsible, and Apple would not have lost any trust and would not have to give crude and funny interviews…

You’d be impressed by a statement that Apple would never even think of saying?
The way I understood it from the WSJ interview, the database will be universal worldwide, probably something like global antivirus databases?
> LOL. Federighi's explanation of "what is happening with your pics" starting at 2:20 is a textbook example of a terrible answer that 100% validates everyone's concerns.

Craig must have loads of money. Why continue on and embarrass himself and let his reputation (if it is decent) go down with Apple's? Retire with some dignity and then do something else or enjoy yourself.
Keep doubling down, Apple.
> So what I'm hearing is that once I have 30 pictures of my baby, Apple will start looking at them taking baths? No thanks.

Theoretically, you'd probably have to upload them to FB or IG first, so that they can in turn be uploaded to the NCMEC database. Then the hash checks will match.
> The objective is not for Apple to act as law enforcement. Rather, it is to satisfy the legal requirement that Apple has to ensure that their servers do not host illegal materials. This development will likely pave the way for future E2EE for iCloud Photos. There is no way for Apple to achieve proper E2EE for iCloud Photos if the check is done server side.

You both miss the point and obfuscate the main issue.

First, by scanning your DEVICE, it is no longer YOUR device. Apple will see and judge what's on it. The only limitation is Apple's choice. (And don't compare to other companies. Wasn't Apple just telling us how bad those other companies were?) Would the execs at Apple be willing to give up the same choice to others?

Second, people keep bringing up this possible E2EE iCloud encryption. Where is this coming from? I have seen enough vaporware in my time to know that speculative software is meaningless.

I am an Apple fan, but I am not willing to sacrifice privacy and/or rights out of blind loyalty. This is difficult for me and means a huge amount of work and $ investment to change. However, I WILL change over this, as will family and friends. Yesterday, I talked to a Best Buy salesperson who already knew all about this and seemed to share my concerns. None of this can be good for Apple, but my guess is that they force it through, banking on too many people being too invested to switch. We'll see. 😞
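To make the quoted architectural claim concrete: the argument is that the hash match has to happen on the device, before upload, because a server cannot scan photos it only ever sees in encrypted form. A very rough toy sketch of that flow, assuming hypothetical helpers, and NOT Apple's real protocol (which uses NeuralHash, private set intersection, and threshold secret sharing rather than plain SHA-256 and a local counter):

```python
# Toy sketch only -- NOT Apple's actual protocol. In the real design the server
# learns nothing about individual matches until the threshold is crossed; here
# everything is simplified to plain SHA-256 and a local counter, purely to show
# *where* the check runs.
from hashlib import sha256

# Hypothetical stand-in for the blinded database of known CSAM hashes.
KNOWN_BAD_HASHES = {sha256(b"known-bad-image-%d" % i).hexdigest() for i in range(3)}
MATCH_THRESHOLD = 30  # the figure cited in Apple's clarifications

def send_encrypted(photo, voucher):
    """Hypothetical upload helper: encrypt the photo and attach its safety voucher."""
    pass

def upload_library(photo_library):
    """Runs ON THE DEVICE, before anything leaves it -- that is the whole point."""
    matches = 0
    for photo in photo_library:
        digest = sha256(photo).hexdigest()      # stand-in for a perceptual hash
        matched = digest in KNOWN_BAD_HASHES    # stand-in for the blinded DB lookup
        matches += matched
        send_encrypted(photo, voucher=matched)  # server stores ciphertext + voucher
    # Only past the threshold would the vouchers become readable for human review.
    if matches >= MATCH_THRESHOLD:
        print("threshold crossed: account flagged for human review")
```

If the check ran server side instead, the server would need plaintext access to every photo, which is exactly what E2EE is supposed to rule out.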
> There is no confusion: Apple is no longer interested in privacy and security. Any time they hype privacy and security in their ads and/or events, they are simply lying.

Or, they are simply more interested in the revenue streams associated with the demands of totalitarian markets than they are in your privacy. In other words, privacy has become an impediment to greater revenue growth.