Stuck behind a paywall when I try to view :(
It's an interesting article. Apple really can't be trusted any more than any other provider, even if you don't count on-device CSAM scanning.

From a new article this morning:

I know that's a private employee, and their privacy is a different matter, but Apple went way overboard as far as I'm concerned.
 
It's an interesting article. Apple really can't be trusted any more than any other provider, even if you don't count on-device CSAM scanning.

From a new article this morning:

I know that's a private employee, and their privacy is a different matter, but Apple went way overboard as far as I'm concerned.

Ouch!
Not knowing all the details, this doesn’t look good.
 
It's an interesting article. Apple really can't be trusted any more than any other provider, even if you don't count on-device CSAM scanning.

From a new article this morning:

I know that's a private employee, and their privacy is a different matter, but Apple went way overboard as far as I'm concerned.

I'm going to go out on a limb and say this person might be a bit of a pot stirrer. That doesn't mean some of her complaints aren't legit, but based on how some of that reads, it sounds to me like she's sometimes looking for a fight more than looking for a solution. Even if I'm wrong, it definitely sounds like there's more to the story here.
 
It's an interesting article. Apple really can't be trusted any more than any other provider, even if you don't count on-device CSAM scanning.

From a new article this morning:

I know that's a private employee, and their privacy is a different matter, but Apple went way overboard as far as I'm concerned.
As I said earlier, the problem with Apple is systemic, not just CSAM. When your leader is obsessed with profit margins and vertical integration, you don't have a product-centered culture; you let the people under you (managers) decide what the product is and let marketing create the sales pitch.
S. Jobs had his faults, but he was obsessed with the product creation process. Cook is equally obsessed, but with politics and with extracting maximum profit from the production pipeline.
The result in any big organization faced with this leadership philosophy is mediocrity and corruption.
There is no denying that Ashley Gjovik weaponized social media to spread her message, and everyone is now bashing her over it.
But the reality is that this overreaction clearly shows that Apple has big internal management problems and a culture of oppression.
This is no surprise to me or to plenty of other longtime Apple users; we remember clearly what Apple was like under John Sculley, and Jobs's reaction to it.


Apple has reached the stage where, if a user-centered culture is not restored immediately, it will turn into a monstrosity: an actual outlet for cultural control and data control. And we are seeing this transformation in real time.

The fundamental difference between Apple and the old IBM is that Jobs successfully created the perfect mix of design and marketing.
Over the years this mix has proven so effective that Apple can introduce "dark pattern" UX and software design decisions and users will take them as gospel. Combine this with commercial media coverage and "influencer" outreach and you have the perfect brainwashing machine.

There is no hope in sight for the regular user. To understand what's going on, you have to be a marketing or product design professional. People hate to think hard; they wait for the mass-approved reaction because it gives them a result with minimal effort.
 
I'm going to go out on a limb and say this person might be a bit of a pot stirrer. That doesn't mean some of her complaints aren't legit, but based on how some of that reads, it sounds to me like she's sometimes looking for a fight more than looking for a solution. Even if I'm wrong, it definitely sounds like there's more to the story here.
I have no doubt there's more to the story.
 
[….].

Nowadays Apple is a legitimate monopoly, with a portfolio of consumer products built around the iPhone and iOS.
Not only with closed source, but with dark-pattern design and UX everywhere. The Mac is something Apple hates, and the idea of running software like Little Snitch to circumvent telemetry makes them screech.
[….]
This comment didn’t age well. Although I find CSAM scanning creepy, I’m not going to change my purchase habits because of it.
 
As I said earlier, the problem with Apple is systemic, not just CSAM. When your leader is obsessed with profit margins and vertical integration, you don't have a product-centered culture; you let the people under you (managers) decide what the product is and let marketing create the sales pitch.
[…]
Are there examples of successful companies other than Apple where profits rule over customers?
 
This comment didn’t age well. Although I find CSAM scanning creepy, I’m not going to change my purchase habits because of it.

That is a bit sad, that you let Apple know it is okay to build an OS that lets them break privacy and surveil people.

I'm more worried about what happens when the system spreads widely and people are hanged because they are different or think differently.
 
That is a bit sad, that you let Apple know it is okay to build an OS that lets them break privacy and surveil people.

I'm more worried about what happens when the system spreads widely and people are hanged because they are different or think differently.
It's not okay and I find it creepy. However, we can let Apple know through PR that CSAM scanning is not okay. But in the end Apple will do the right thing. I'm not throwing away Apple devices because I don't like this one thing. Hundreds of millions are in the same boat.
 
It's not okay and I find it creepy. However, we can let Apple know through PR that CSAM scanning is not okay. But in the end Apple will do the right thing....
Let us hope so. Still, the damage might already have been done, because Apple published a blueprint for how to do local surveillance on a mobile device while ensuring 'privacy' (well, sort of).
 
This comment didn’t age well. Although I find CSAM scanning creepy, I’m not going to change my purchase habits because of it.
Beg to differ, my friend. You are just a regular Apple user. There are tons of security breaches in your beloved Apple products, but I did not feel an obligation or responsibility to inform you.
Actually, all Apple users are in the mercifully deceiving hands of a company that has no red team and no adequate communication with, or rewards for, security researchers.
Your purchase habits are the least of my concerns.
I have shared logical, technical, and factually correct information.
We (me, my family, friends, and colleagues) have moved on.
Towards proven FOSS solutions, because we know technical and business things that will never see the light of day or be advertised, promoted, or freely shared.
There is no safe phone. Period.

Just a little hint: CSAM scanning is not what is marketed or explained publicly. It is much more dangerous and disgusting.
And Apple is not removing it; they have only postponed the official implementation.
In the background, things are working as planned.
 
Beg to differ, my friend. You are just a regular Apple user. There are tons of security breaches in your beloved Apple products, but I did not feel an obligation or responsibility to inform you.
"Tons"? Is there some underlying subtext where every other major product, such as Windows or Linux or Android, is bug free? My Linux distros get updated pretty frequently; I guess that's because they are bug and breach free?
Actually, all Apple users are in the mercifully deceiving hands of a company that has no red team and no adequate communication with, or rewards for, security researchers.
Your purchase habits are the least of my concerns.
I have shared logical, technical, and factually correct information.
You've shared an opinion, which I appreciate. However, that is all you shared.
We (me, my family, friends, and colleagues) have moved on.
Towards proven FOSS solutions, because we know technical and business things that will never see the light of day or be advertised, promoted, or freely shared.
There is no safe phone. Period.
Good luck with the half-baked solutions, roll-your-own backups, etc. Hard to believe a mass of people moved to Linux-based phones and gave up the polish of iPhones.
Just a little hint: CSAM scanning is not what is marketed or explained publicly. It is much more dangerous and disgusting.
And Apple is not removing it; they have only postponed the official implementation.
In the background, things are working as planned.
You're entitled to your opinion.
 
EFF: “We are not going to present any alternative. Let the pedophiles hunt children and store their images. We don’t agree with pedophiles, but they have the right to those images in the privacy of their devices. Apple should not go above and beyond today’s normal scanning in order to catch predators. The cost of a slippery slope is higher than the rape of a minor.”
 
Some points:
1: Nothing wrong with data collection IMO; at least post an article that details iOS 15.
2: Everything else is a rehash and mishmash of other things that have been posted.
3: Android is not the poster child of a big, free operating system.
 
To clarify in the simplest terms possible: we are living in defining times. In the future, only those who can control their personal data habits and digital UX will have maximum control over their lives. The systems and products already created gather enough data to form a profile that will affect your life directly. This data is shared with banks, credit agencies, insurance agencies, police, and governments.
To give "trust" to any corporation is plain stupid. This is a red-pill moment: you either choose to learn how the real world functions and gain proper knowledge of how to protect yourself, or you go with the blue flow of mindless consumerism and advertised perceptions.
There is enough data to confirm the reality of the picture I am painting. There is plenty of historical evidence of dark-pattern UX, malpractice, and manipulation from big tech (and other) corporations.
Easy and comfortable are the soma pills that corporations serve you as entertainment, politics, religion, or "tech" advancements. Psychologically speaking, mass consumers are sick, addicted to dopamine hits and the illusion of control and ownership.
Behind all this, cold machines will decide your future, supervised by corporate interests and <insert good cause, intentions> manufactured consent.
You have the internet, and all the books, science, and lessons from history are still available (for now). Do your work and free yourselves from Apple or any other monopoly overlords. The future will not be defined by people who follow trends or fashion.
Think different.
 
To clarify in the simplest terms possible: we are living in defining times. In the future, only those who can control their personal data habits and digital UX will have maximum control over their lives. The systems and products already created gather enough data to form a profile that will affect your life directly. This data is shared with banks, credit agencies, insurance agencies, police, and governments.
To give "trust" to any corporation is plain stupid. This is a red-pill moment: you either choose to learn how the real world functions and gain proper knowledge of how to protect yourself, or you go with the blue flow of mindless consumerism and advertised perceptions.
There is enough data to confirm the reality of the picture I am painting. There is plenty of historical evidence of dark-pattern UX, malpractice, and manipulation from big tech (and other) corporations.
Easy and comfortable are the soma pills that corporations serve you as entertainment, politics, religion, or "tech" advancements. Psychologically speaking, mass consumers are sick, addicted to dopamine hits and the illusion of control and ownership.
Behind all this, cold machines will decide your future, supervised by corporate interests and <insert good cause, intentions> manufactured consent.
You have the internet, and all the books, science, and lessons from history are still available (for now). Do your work and free yourselves from Apple or any other monopoly overlords. The future will not be defined by people who follow trends or fashion.
Think different.

I suspect we will also see a mid-level group that will monitor/control their data yet give up some of that control to better and more easily access the electronic landscape. Either way, it will be an interesting next decade.
 
To clarify in the simplest terms possible: we are living in defining times. In the future, only those who can control their personal data habits and digital UX will have maximum control over their lives. The systems and products already created gather enough data to form a profile that will affect your life directly. This data is shared with banks, credit agencies, insurance agencies, police, and governments.
To give "trust" to any corporation is plain stupid. This is a red-pill moment: you either choose to learn how the real world functions and gain proper knowledge of how to protect yourself, or you go with the blue flow of mindless consumerism and advertised perceptions.
There is enough data to confirm the reality of the picture I am painting. There is plenty of historical evidence of dark-pattern UX, malpractice, and manipulation from big tech (and other) corporations.
Easy and comfortable are the soma pills that corporations serve you as entertainment, politics, religion, or "tech" advancements. Psychologically speaking, mass consumers are sick, addicted to dopamine hits and the illusion of control and ownership.
Behind all this, cold machines will decide your future, supervised by corporate interests and <insert good cause, intentions> manufactured consent.
You have the internet, and all the books, science, and lessons from history are still available (for now). Do your work and free yourselves from Apple or any other monopoly overlords. The future will not be defined by people who follow trends or fashion.
Think different.
Good luck with getting off the grid.
 
EFF: “We are not going to present any alternative. Let the pedophiles hunt children and store their images. We don’t agree with pedophiles, but they have the right to those images in the privacy of their devices. Apple should not go above and beyond today’s normal scanning in order to catch predators. The cost of a slippery slope is higher than the rape of a minor.”
And they're right. We need to think more than one move ahead. Right now we are weighing the benefits and costs to the child victims of CSAM versus the privacy of the vast majority of iPhone users who have nothing to do with CSAM. However, if the blueprint for this system ever gets abused, possibly even just by copying, the costs might be to the children and adults put at risk by violent authoritarian governments suppressing dissent. I understand Apple's motivation for wanting to do something, but there are alternatives that already exist: police work, search warrants, online honeypots for stings, online monitoring of communications, and server-side scanning. Don't pretend these alternatives do not exist. And by the time CSAM hashes have entered Apple's system, most of the damage has been done. Apple's system is far more likely to catch CSAM consumers (no bad thing IMO, but not enough to prevent the abuse) than the predators that create CSAM material.
 
And they're right. We need to think more than one move ahead. Right now we are weighing the benefits and costs to the child victims of CSAM versus the privacy of the vast majority of iPhone users who have nothing to do with CSAM. However, if the blueprint for this system ever gets abused, possibly even just by copying, the costs might be to the children and adults put at risk by violent authoritarian governments suppressing dissent. I understand Apple's motivation for wanting to do something, but there are alternatives that already exist: police work, search warrants, online honeypots for stings, online monitoring of communications, and server-side scanning. Don't pretend these alternatives do not exist. And by the time CSAM hashes have entered Apple's system, most of the damage has been done. Apple's system is far more likely to catch CSAM consumers (no bad thing IMO, but not enough to prevent the abuse) than the predators that create CSAM material.

Hats off to the EFF for having the “guts” to call this out.

That is one important aspect to keep in mind: what is listed by the NCMEC and ICMEC (and others) as CSAM are not new files. These items have been around for a while.

While I can emotionally agree with Apple's goal of removing CSAM, I still feel Apple has really missed the boat, as this proposal of theirs does nothing to clean up existing items in iCloud; it only limits the amount of new material going forward.
 
And they're right. We need to think more than one move ahead. Right now we are weighing the benefits and costs to the child victims of CSAM versus the privacy of the vast majority of iPhone users who have nothing to do with CSAM. However, if the blueprint for this system ever gets abused, possibly even just by copying, the costs might be to the children and adults put at risk by violent authoritarian governments suppressing dissent. I understand Apple's motivation for wanting to do something, but there are alternatives that already exist: police work, search warrants, online honeypots for stings, online monitoring of communications, and server-side scanning. Don't pretend these alternatives do not exist. And by the time CSAM hashes have entered Apple's system, most of the damage has been done. Apple's system is far more likely to catch CSAM consumers (no bad thing IMO, but not enough to prevent the abuse) than the predators that create CSAM material.
How will the abuse occur with this when it hasn't in the cloud? Even if implemented, Apple cannot access your end-to-end encrypted data. What will Apple magically scan on device that is not available in the cloud anyway? Would love to know.
 
How will the abuse occur with this when it hasn't in the cloud? Even if implemented, Apple cannot access your end-to-end encrypted data. What will Apple magically scan on device that is not available in the cloud anyway? Would love to know.
Apple openly published the blueprint for how their system works (local processing on the phone to extract features from data files to be compared to a pre-existing database derived from an external source), even having the hubris to say that this form of blanket surveillance ensures privacy (which it might not, BTW). Now imagine an authoritarian regime requires a system modelled on Apple's to be installed on all mobile phones (at some point they are all likely to have AI-optimised processors). Then imagine that this system, modelled on Apple's, extracts features to detect the faces of gay people, the faces of Uyghurs, political posters present at a protest, or opposition party flags, and compares them to the government's database. Do you not see the danger? The government could claim that it is conducting surveillance to protect the public, but that its surveillance maintains privacy. Why? Because the clueless numpties at Apple said so.
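
To make the pattern concrete, here is a rough Python sketch of the structure described above: features computed locally on the device, then compared against a database the user never sees. Everything in it is illustrative rather than Apple's actual code - the folder name and fingerprint set are made up, a plain SHA-256 digest stands in for Apple's perceptual NeuralHash, and the real design adds blinded hashes, safety vouchers, and a server-side match threshold that are not modelled here.

Code:
import hashlib
from pathlib import Path

# Hypothetical database of "known" fingerprints supplied by an external
# authority. In Apple's actual design these would be blinded perceptual
# (NeuralHash) values, not plain SHA-256 digests of whole files.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def fingerprint(path: Path) -> str:
    """Stand-in feature extractor: a plain cryptographic hash of the file.
    A real perceptual hash would survive resizing or re-encoding; this does not."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_before_upload(photo_dir: Path) -> list[Path]:
    """Flag local files whose fingerprint matches the supplied database.
    The structural point critics object to: the comparison runs on the
    user's own device, against a list the user cannot inspect."""
    return [p for p in photo_dir.glob("*.jpg")
            if fingerprint(p) in KNOWN_FINGERPRINTS]


if __name__ == "__main__":
    matches = scan_before_upload(Path("Photos"))  # hypothetical local folder
    print(f"{len(matches)} file(s) matched the externally supplied database")

Swap the fingerprint function for a perceptual model and the database for one supplied by a government, and you have exactly the generalisation I am worried about.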

You're focusing on only the next move - Apple's implementation - just like Apple's engineers. We need to think several moves ahead, like how such a system could be abused by governments. I fear the damage has already been done by Apple's publication of technical papers describing the system.
 
Apple openly published the blueprint for how their system works (local processing on the phone to extract features from data files to be compared to a pre-existing database derived from an external source), even having the hubris to say that this form of blanket surveillance ensures privacy (which it might not, BTW). Now imagine an authoritarian regime requires a system modelled on Apple's to be installed on all mobile phones (at some point they are all likely to have AI-optimised processors). Then imagine that this system, modelled on Apple's, extracts features to detect the faces of gay people, the faces of Uyghurs, political posters present at a protest, or opposition party flags, and compares them to the government's database. Do you not see the danger? The government could claim that it is conducting surveillance to protect the public, but that its surveillance maintains privacy. Why? Because the clueless numpties at Apple said so.

You're focusing on only the next move - Apple's implementation - just like Apple's engineers. We need to think several moves ahead, like how such a system could be abused by governments. I fear the damage has already been done by Apple's publication of technical papers describing the system.
They can do all this on iCloud, or Azure, or Google Cloud, or Amazon's cloud, and governments can ask them to do all this on any cloud as well.

I simply asked what non-cloud information they can scan.

You simply answered how expansive the scanning of available data could become. The same data is already in the cloud.

The issue is device scanning, right? So tell me what non-cloud data on the device is available for Apple to scan that is not in the cloud.

Simple question.
 