Sounds like these are dumb employees who got through the equity quota system. I would fire every employee who objects to this implementation. Also, 0.01% of employees do not get to decide policy. They are irrelevant.
> Always how it happens with big tech. That's what this issue fundamentally comes down to: the consumer's power has diminished to the size of plankton, and now the corporations have all the say. Apple can literally do whatever they want and they will not suffer any consequences, especially when emotionally charged propaganda is behind them. If this were a much smaller company (startup-sized) and there were many other alternative options, they would be dead within a week, as most people do not accept this surveillance technology.

Spot on with the propaganda part. The social-justice world-savior complex is a tough argument to fight because it's a baited one.
> Slow boiling the frog.
> They're starting with something that's universally reviled, then they'll add other stuff gradually.
> Transphobia in the UK, insufficient reverence for black people in the US, homosexuality in Saudi Arabia, etc.
> And at every step of the way, people like you will say:
> "Well, as long as you're not a pedo / a racist / a homosexual / don't infringe on copyright / don't have impure thoughts / don't criticize the government, you have nothing to fear!"

Exactly. People are too lazy or "busy" (lol) to bother with standing up for themselves and just roll their eyes. Then, when it's too late, they say "oh ****" like a complete moron, when they could've done something to stop it.
In what way is "we're going to extend our current scanning of your photos for CSAM material on our iCloud servers to scanning your entire local computer for copyrighted material" an easier sell than just announcing it directly?

If this were Apple's end game, do you honestly think this would be the best possible way they'd come up with to approach it?
> If you don't see a problem with this technology, Apple is definitely playing you. Pretty sure you don't want Apple looking at/scanning/analyzing/identifying your wife's pictures. This CSAM stuff needs to be shut down. Apple needs to respect our privacy, period. Apple needs to figure out another way if they are really interested in catching pedophiles... God knows for what reason.

You clearly don't know how this works. You need to educate yourself on how CSAM detection works, and on how other companies have been doing this for years already. Apple is only just catching up. Use the Google Photos app? Use Gmail, Hotmail, Instagram, Facebook... they all scan the images you upload for CSAM by hashing them and matching the hashes against a database of known hashes. They don't scan the content of your photos, so your wife's and kids' pictures are safe.
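For what it's worth, the hash matching described above boils down to something like the sketch below. This is simplified and illustrative only: the hash value is made up, and real systems (PhotoDNA, Apple's NeuralHash) use perceptual hashes that survive resizing and re-encoding, not a cryptographic hash like SHA-256.

```python
import hashlib

# Hypothetical list of known-bad fingerprints. In reality this is a
# database of perceptual hashes maintained by NCMEC, not SHA-256 digests.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Fingerprint the raw image bytes. A cryptographic hash only matches
    byte-identical files; perceptual hashes tolerate minor alterations."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    """True if the image's fingerprint appears in the known-hash list.
    Note: the image content itself is never "looked at", only compared."""
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES
```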
I agree with your comments. But in other threads you are such a cheerleader for spending lots of cash on the new iPhones etc. How do you reconcile your double personalities?
CSAM detection itself is not the problem... it is the mechanism behind it, which is ripe for potential abuse.
Enough is enough, Apple! It is incredibly distasteful for you to SPY on us as consumers. Apple, it will be in your best interest to give us an opt-out option for this CSAM feature, please.
Where are you, Craig? You said this yourself at WWDC 2021. What happened?
“At Apple, we believe privacy is a fundamental human right,” said Craig Federighi, Apple’s senior VP of software engineering. “We don’t think you should have to make a tradeoff between great features and privacy. We believe you deserve both.”
This is a punishment and a big slap in the face for anyone who owns an iPhone. Whatever you are trying to accomplish with this, Apple, leave it in the hands of government and law enforcement.
1. How about the CSAM scanning runs on Apple executives' iPhones first? No one wants their privacy exposed. Please stop this nonsense and RESPECT our PRIVACY. It is our fundamental human right.
2. How come this CSAM stuff was not mentioned by Apple during WWDC 2021? Apple is up to something. Why now, when we are almost at the release date of iOS 15?
(Linked: "Apple adds powerful new privacy features to Mail and more", www.cultofmac.com)
Also, this guy needs to be FIRED from Apple. He is the mastermind behind the CSAM feature. What a joke!
Does anyone here have a game plan on how we can stop this crappy CSAM feature?
> Random Physicist: "I invented the A-bomb. But it should not be used."

That wasn't a random physicist, that was Einstein.
> "Well, as long as you're not a pedo / a racist / a homosexual / don't infringe on copyright / don't have impure thoughts / don't criticize the government, you have nothing to fear!"

Yeah, but a lot of people are, and they use social media/messaging apps like WhatsApp. So the moment one of them sends something of questionable acceptability into a group you happen to be in, an app like WhatsApp is going to get it scanned by "the system" for you if you aren't careful.
> People are really taking this seriously and planning on leaving Apple. They don't care about iOS 15 anymore... They don't care about Apple's fall lineup.

Haha, yeah, and where are they going to go? Android? Google already does this and scans for CSAM. No difference.
> Some employees are worried that Apple is damaging its industry-leading privacy reputation.

Fact is, there is the perception, rightly or wrongly, that Apple is walking back some of its privacy marketing with this move, and that's not good for the brand.
> "We will refuse any such demands... except when the FBI asks us not to encrypt iCloud backups. Oh, and when the CCP asks us to hand over iCloud decryption keys."

How do you know that the CCP doesn't already have the keys that belong to Chinese accounts? For the most part, iCloud isn't end-to-end encrypted; the cloud operator has the decryption keys.
> I can't get over the feeling Apple is doing this because of some pending porn legislation somewhere (GB or EU), some government coercion somewhere else regarding market access (China), or some broad decryption lawsuits or threats of monopoly breakup somewhere else (US: DOJ/FBI, or the new sideloading bill in Congress), and is either trying to comply, get ahead of it, or appease.

At least Apple has been up front about this, unlike our own (US) government and their alphabet spy agencies. If they work, we at least have ways to turn off Apple's scan. The governments don't provide that option.
Fact is, by doing this, Apple is demonstrating a proof of concept.
Fact is that Apple won't be able to refuse when some government somewhere enacts a law, built on expanding this proof of concept, that mandates scanning for other images, symbols, words, or text strings, without Apple's screening and with results delivered directly to that government, or else risk indictments, civil suits, market closures, test suits, breakup, or regulatory threats and actions.
Fact is that Apple could be made to comply without being allowed to publicize its objections. For reference, just recall the National Security Letters that forbade, and still forbid, companies from discussing the mere existence of being served such surveillance orders (this is in the USA, which is theoretically more transparent than repressive countries, not to mention has a written Bill of Rights that many countries lack).
Fact is that Apple has encouraged us to think of, and use, our Apple devices as secure extensions of our brains, presumably subject to the same protections as our brains under their "Privacy As A Human Right" (BS). Further, as a US company, its principles are presumably informed by the traditions and protections of the Bill of Rights, especially the 4th and 5th Amendments pertaining to search and seizure and prohibiting self-incrimination, not forgetting that a person is presumed innocent and that no search shall ensue without a judicial warrant justified by a reasonable suspicion of guilt.
Fact is that Apple has just (voluntarily?) become a willing extra-judicial adjunct of state security and law enforcement, with its plan to willfully perform warrantless searches, while thumbing its nose at the protections enshrined in the Bill of Rights.
Fact is that Apple has already announced willingness (actually intention) to take further steps down the slippery slope by expanding to other countries and to 3rd party apps after starting with its own Photos app and iCloud services.
Fact is, millennia of human existence have tried to teach us a) the state will try to overrun the rights of the individual on a pretext, b) mission creep is a real thing, c) moral zealotry is a dangerous thing, d) just because you can do it doesn't mean you should, e) appeasement doesn't appease, f) if you have stated principles, they must be inviolate, else they are not principles, g) doing the wrong thing for the right reason is still doing the wrong thing, and h) the road to hell is paved with good intentions.
I think most people would agree that CSAM is a scourge, but Apple is now so on the wrong side of its rhetoric, its stated principles, the Bill of Rights, millennia of learning, and just good sense that one really wonders how Apple got itself tangled up in this issue, and how Apple could be at turns both so naïve and so arrogant as to think that a legal push won't now come to a statutory shove, one Apple won't be able to "vehemently refuse".
> Apple is not asking you to prove your innocence. Apple is required to ensure that nothing illegal is stored on their servers. Apple has already been scanning all photos uploaded to iCloud Photos for CSAM since 2019.

The interesting thing to wonder is whether it is only detecting known CSAM pictures (already seen and catalogued in said database) or if it is scanning the content of pictures, determining whether nudity/infantile attributes exist, and raising alarm bells…
> Don't turn on iCloud Photo Library. But quite honestly, as all this detects is child pornography, it makes me wonder why you wouldn't want it turned on.

I don't understand why anyone would.
> That's not how it works. Have a read up; Apple have put out a technical white paper on it.

Did you read and understand the white paper? I have not read it, but I have read elsewhere that much remains unclear. Apple supposedly does not disclose everything.
> Slow boiling the frog.
> They're starting with something that's universally reviled, then they'll add other stuff gradually.
> Transphobia in the UK, insufficient reverence for black people in the US, homosexuality in Saudi Arabia, etc.
> And at every step of the way, people like you will say:
> "Well, as long as you're not a pedo / a racist / a homosexual / don't infringe on copyright / don't have impure thoughts / don't criticize the government, you have nothing to fear!"

Agreed!
> A lot here depends on the thresholds and specific features. Apple may very well have tuned it so strictly that images need to be close to identical to match. But semantic similarity means a similarity of content, e.g. are both images showing similar people doing similar things? Analysing this is hard, and Apple would not have to bother with it if they only cared about visual similarity, minor alterations, and the like.

I see what you mean. I suppose the fallback method is that Apple will manually compare the images if X number of images on someone's phone match these CSAM hashes, which would be a very unlikely scenario.
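The threshold idea being discussed reduces to something like the sketch below. The counter and cutoff here are illustrative assumptions; Apple's published design uses cryptographic "safety vouchers" that only become readable once enough matches accumulate, not a plaintext counter on a server.

```python
from collections import defaultdict

MATCH_THRESHOLD = 30  # hypothetical cutoff, set high to keep false alarms rare

# Per-account count of hash-database hits (illustrative only).
match_counts: defaultdict[str, int] = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Record one hash-database hit for an account. Return True once the
    account crosses the threshold, at which point the matched images
    would be escalated to human review."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= MATCH_THRESHOLD
```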
> At least Apple has been up front about this, unlike our own (US) government and their alphabet spy agencies. If they work, we at least have ways to turn off Apple's scan. The governments don't provide that option.

I'm not much comforted by the "upfrontedness" of Apple, as it feels more like due care to prevent a future civil suit from biting: "but we warned them."
> It's the technology itself that is terrifying people, not Apple's use of it, which I understand. You could essentially build a database of anything digital (pictures, movie files, PDFs), attribute a unique hash to each, and scan for them across people's devices.

Yep, how long until there is a UK court order to add copyrighted content to the database?
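And building that generalized version is trivially easy, which is exactly the worry. A minimal sketch, assuming plain cryptographic hashes and a hypothetical "banned_material" folder; a real deployment would use perceptual hashes so that re-encoded copies of media still match.

```python
import hashlib
from pathlib import Path

def digest(path: Path) -> str:
    """Hex SHA-256 of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# The mechanism doesn't care about file type: pictures, movie files,
# PDFs, anything digital can go into the list. Folder name is made up.
banned_hashes = {
    digest(p) for p in Path("banned_material").iterdir() if p.is_file()
}

def sweep(root: str) -> list[Path]:
    """Return every file under `root` whose hash appears in the list."""
    return [
        p for p in Path(root).rglob("*")
        if p.is_file() and digest(p) in banned_hashes
    ]
```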
> Don't turn on iCloud Photo Library. But quite honestly, as all this detects is child pornography, it makes me wonder why you wouldn't want it turned on.

If you haven't been able to read all the concerns expressed by so many people, then I hate to say it, but at this point you're not going to get it, which says a lot about your critical thinking skills. It's also pretty disgusting to imply that people are pedos just because they don't like what's going on. That just says a lot about you as a person. And it doesn't say anything good at all.