Stuck behind a paywall when I try to view!
If you search Google for it and open their result, then stop the page loading fast enough, you can get to it...
It's an interesting article. Apple really can't be trusted any more than any other provider, even if you don't count on device CSAM scanning.
From a new article this morning:
Apple fires Ashley Gjøvik, senior employee who alleged sexism at work
Apple has fired Ashley Gjøvik, a senior engineering program manager who's been outspoken about her experiences working for the tech giant. (www.engadget.com)
I know she's a private employee, and her privacy is a different matter, but Apple went way overboard as far as I'm concerned.
As I said earlier, the problem with Apple is systemic, not CSAM alone. When your leader is obsessed with profit margins and vertical integration, you don't have a product-centered culture; you let the people under you (managers) decide what the product is and let marketing create the sales pitch.
I'm going to go out on a limb and say this person might be a bit of a pot stirrer. That doesn't mean some of her complaints aren't legit, but based on how some of that reads, it sounds to me like she's sometimes looking for a fight more than a solution. Even if I'm wrong, it definitely sounds like there's more to the story here.
Nowadays Apple is a legitimate monopoly, with a portfolio of consumer products built around the iPhone and iOS.
Not only with closed source, but with dark-pattern design and UX everywhere. The Mac is something Apple hates, and the idea of people running software like Little Snitch to circumvent telemetry makes them screech.
[….]
Are there examples of successful companies other than Apple where profits rule over customers?
[…]
This comment didn’t age well. Although I find CSAM scanning creepy, I’m not going to change my purchase habits because of it.
It's not okay and I find it creepy. However, we can let Apple know through PR that CSAM scanning is not okay. But in the end Apple will do the right thing. I'm not throwing away Apple devices because I don't like this one thing. Hundreds of millions are in the same boat.
That is a bit sad: you are letting Apple know it is okay to build an OS that breaks privacy and surveils people.
I'm more worried about where the system goes when it spreads widely and people are hanged because they are different or think differently.
Let us hope so. Still, the damage might already have been done, because Apple published a blueprint for how to do local surveillance on a mobile device while ensuring 'privacy' (well, sort of).
Beg to differ, my friend. You are just a regular Apple user. There are tons of security breaches in your beloved Apple products, but I did not feel an obligation or responsibility to inform you.
"Tons"? Is there some underlying subtext where every other major product, such as Windows or Linux or Android, is bug-free? My Linux distros get updated pretty frequently; I guess that's because they are bug- and breach-free?
Actually, all Apple users are in the mercifully deceiving hands of a company that has no red team, and no adequate communication with or rewards for security researchers.
You've shared an opinion, which I appreciate. However, that is all you shared.
Your purchase habits are the last concern of mine.
I have shared logical, technical and factually correct information.
Good luck with half-baked solutions, rolling your own backups, etc. It's hard to believe a mass of people moved to Linux-based phones and gave up the polish of iPhones.
We (me, my family, friends and colleagues) have moved on.
Towards proven FOSS solutions, because we know technical and business things that will never see the light of day or be advertised, promoted, or freely shared.
There is no safe phone. Period.
You're entitled to your opinion.
Just a little hint: CSAM scanning is not what is marketed or explained publicly. It is much more dangerous and disgusting.
And Apple is not removing it; they have only postponed the official implementation.
In the background things are working as planned.
Some points:
New study reveals iPhones aren't as private as you think
Android phones collect more data by volume, but iPhones collect more types of data, a study finds. (www.tomsguide.com)
China Used a Tiny Chip in a Hack That Infiltrated U.S. Companies
The attack by Chinese spies reached almost 30 U.S. companies by compromising America's technology supply chain. (www.bloomberg.com)
Apple ‘Still Investigating’ Unpatched and Public iOS Vulnerabilities
https://webrtchacks.com/apples-not-so-private-relay-fails-with-webrtc/
https://www.theverge.com/22700898/apple-company-culture-change-secrecy-employee-unrest
https://threatpost.com/apple-airtag-zero-day-trackers/175143/
https://www.telegraph.co.uk/technol...aw-risks-letting-hackers-drain-money-iphones/
https://krebsonsecurity.com/2021/09/apple-airtag-bug-enables-good-samaritan-attack/
https://www.engadget.com/apple-ipad...iGcMTQw8kgsjx7IH7r9tiAthXtg2hVzMFcToZQfvI-WMd
To clarify in the simplest terms possible: we are in defining times. In the future, only those who can control their personal data habits and digital UX will have maximum control over their lives. The systems and products already created gather enough data to form a profile that will affect your life directly. This data is shared with banks, credit agencies, insurance agencies, police and government.
Giving "trust" to any corporation is plain stupid. This is a red-pill moment: either you learn how the real world functions and gain proper knowledge of how to protect yourself, or you go with the blue flow of mindless consumerism and advertised perceptions.
There is enough data to confirm the reality of the picture I am painting. There is ample historical evidence of dark-pattern UX, malpractice and manipulation from big tech (and other) corporations.
Easy and comfortable are the soma pills that corporations serve you as entertainment, politics, religion or "tech" advancements. Psychologically speaking, mass consumers are sick, addicted to dopamine hits and to the illusion of control and ownership.
Behind all this, cold machines will decide your future, supervised by corporate interests and <insert good cause, intentions> manufactured consent.
You have the internet, and all the books, science and lessons from history are available (for now). Do the work and free yourselves from Apple or any other monopoly overlords. The future will not be defined by people who follow trends or fashion.
Think different.
Good luck with getting off the grid.
EFF: "We are not going to present any alternative. Let the pedophiles hunt children and store their images. We don't agree with pedophiles, but they have the right to those images in the privacy of their devices. Apple should not go above and beyond today's normal scanning in order to catch predators. The cost of a slippery slope is higher than the rape of a minor."
And they're right. We need to think more than one move ahead. Right now we are weighing the benefits and costs to the child victims of CSAM versus the privacy of the vast majority of iPhone users who have nothing to do with CSAM. However, if the blueprint for this system ever gets abused, possibly even just by copying, the costs might be to the children and adults put at risk by violent authoritarian governments suppressing dissent. I understand Apple's motivation for wanting to do something, but there are alternatives that already exist: police work, search warrants, online honeypots for stings, online monitoring of communications, and server-side scanning. Don't pretend these alternatives do not exist. And by the time CSAM hashes have entered Apple's system, most of the damage has been done. Apple's system is far more likely to catch CSAM consumers (no bad thing IMO, but not enough to prevent the abuse) than the predators that create CSAM material.
How will the abuse occur with this when it hasn't occurred in the cloud? Even if implemented, Apple cannot access your E2E-encrypted data. What will Apple magically scan on-device that is not available in the cloud anyway? I'd love to know.
Apple openly published the blueprint for how their system works (local processing on the phone to extract features from data files, to be compared against a pre-existing database derived from an external source), even having the hubris to say that this form of blanket surveillance ensures privacy (which it might not, BTW). Now imagine an authoritarian regime requires a system modelled on Apple's to be installed on all mobile phones (at some point they are all likely to have AI-optimised processors). Then imagine that this system, modelled on Apple's, extracts features to detect the faces of gay people, the faces of Uyghurs, political posters present at a protest, or opposition party flags, and compares them against the government's database. Do you not see the danger? The government could claim that it is conducting surveillance to protect the public, but that its surveillance maintains privacy. Why? Because the clueless numpties at Apple said so.
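For what it's worth, the matching idea being described is simple enough to sketch. This is a toy illustration, not Apple's actual system: the real design uses a perceptual NeuralHash plus private set intersection and threshold secret sharing, whereas this sketch stands in a cryptographic hash and a plain set lookup just to show the on-device "fingerprint, compare to database, trigger past a threshold" shape — and why the dangerous part is that the database contents are whatever the database provider puts in them.

```python
# Toy sketch of on-device fingerprint matching (NOT Apple's actual code).
# The real system uses a perceptual NeuralHash, private set intersection,
# and threshold secret sharing; names and the threshold are illustrative.
import hashlib

def fingerprint(data: bytes) -> str:
    # Stand-in for a perceptual feature extractor. A cryptographic hash
    # is NOT perceptual (any pixel change alters it completely); it is
    # used here only so the example is self-contained and runnable.
    return hashlib.sha256(data).hexdigest()

def scan(photos: list[bytes], blocklist: set[str], threshold: int = 3) -> bool:
    # Count photos whose fingerprint appears in the externally supplied
    # database; only past the threshold would a report be triggered.
    matches = sum(1 for p in photos if fingerprint(p) in blocklist)
    return matches >= threshold
```

Note that nothing in the matching loop knows or cares what the blocklist *means*: swap the CSAM hash database for one of protest posters or opposition flags and the same code serves the authoritarian scenario above.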
They can do all this on iCloud or Azure or Google Cloud or Amazon's cloud, and governments can ask them to do all this on any cloud as well.
You're focusing on only the next move, Apple's implementation, just like Apple's engineers. We need to think several moves ahead, like how such a system could be abused by governments. I fear the damage has already been done by Apple's publication of technical papers describing the system.