"Differential privacy" is just an obtuse and hopefully anonymous method of data mining. For Apple's benefit, not ours.

Why not call it what it is? "Hopefully Anonymous User Unwanted Spying"
But I guess that wouldn't go over too well at the keynote.

Actually, knowing something about statistics, I can see advantages to this, even though I normally turn off every form of data collection about me if at all possible (and block, insofar as possible, the rest). The real question is the degree to which this truly protects anonymity and privacy from determined attacks. Also, I think both Apple and developers need to consider how much energy is being used to collect and collate this information; across millions of customers the energy cost must be staggering (perhaps this explains Apple's emphasis on solar power).
 
"Differential privacy" is just an obtuse and hopefully anonymous method of data mining. For Apple's benefit, not ours.

Why not call it what it is? "Hopefully Anonymous User Unwanted Spying"
But I guess that wouldn't go over too well at the keynote.

Differential privacy is, to put it mildly, an extremely strong notion of data de-individualization. It comes with severe costs that ultimately reduce the statistical utility of the data, in exchange for mathematically provable guarantees of protection against arbitrary attacks on that data.

You and your children and your children's children can use all the technology of the future, and cross-compare data from other databases, to attack the de-identified data all you want. The best you can ever hope to accomplish is a ridiculously small *chance* that you guessed the right value of a variable for some user. But you can never be sure, thanks to the strong probabilistic guarantees afforded by differential privacy.

Good luck with that!
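The guarantee described above is easiest to see with randomized response, the classic local technique often cited as a building block of this kind of system. A toy sketch in Python — the 30% true rate, the p = 0.75 truth probability, and all function names are illustrative assumptions, not Apple's actual parameters:

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true answer with probability p; otherwise flip a fair coin.
    Any single report is deniable, yet the aggregate is recoverable."""
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_rate(reports, p: float = 0.75) -> float:
    """Invert the noise: E[reported] = p * true_rate + (1 - p) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p

# Simulate 100,000 users, 30% of whom have the sensitive attribute.
random.seed(0)
truths = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(round(estimate_rate(reports), 3))  # close to 0.30
```

An attacker seeing any individual report can never tell whether it was the truth or the coin flip, yet the population-level estimate comes out accurate.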
 
... So frankly, even if it doesn't work at all, it's better than the alternatives - at least they are trying!

I think even the most privacy-conscious users will concede there are useful types of data that can improve services and software when the developers can access such data. ...

Bingo — at least trying. At least they are taking this seriously and putting effort into it: allowing companies to improve their services while trying to improve privacy for individuals.

(and if they want to make it a selling point, that's ok with me).
 
Many argue that this concept, although an interesting mathematical tool, is too strong for use in practice, in that it cannot be implemented in any real-world scenario without removing all useful signal from the data. I can't name any companies or even government agencies that claim their data are algorithmically protected with differentially private guarantees. What Apple has done here is truly revolutionary, and I sincerely doubt any of its competitors are close to doing what they're doing today. Maybe in a decade or two?
What you are quoting are techniques that have been proposed to preserve the integrity of elections and prevent voter fraud. Those who have opposed these improvements to the current election system really show their true colors.
 
What you are quoting are techniques that have been proposed to preserve the integrity of elections and prevent voter fraud. Those who have opposed these improvements to the current election system really show their true colors.

Wait, could you rephrase what you wrote? I'm not sure I understand. I was referring to statistical agencies worldwide (including various census-taking agencies which I've been in contact with).
 
Wait, could you rephrase what you wrote? I'm not sure I understand. I was referring to statistical agencies worldwide (including various census-taking agencies which I've been in contact with).
There have been many published papers in which these techniques are used to find voter-fraud patterns. Search for yourself.
 
So basically Apple is selling people's personal info.

Aha - there IS a dark side to this! :)

But honestly, there are so many companies that already know everything about you that even if Apple *IS* selling your data, it's quite an improvement. There's currently an entire ecosystem of companies (I won't name names) that "undo" the anonymity of your data so that everything about you IS precisely identified. They already know everything about you. (The core technique is combining multiple anonymous pieces of information until everything is revealed, just as a detective might, but with the power of big data collection behind it.) So if Apple can drive a wedge into this and still profit from it, that's awesome.
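The detective-style recombination described above is essentially a linkage attack: joining "anonymous" records on shared quasi-identifiers until individuals fall out. A toy sketch — every name, ZIP code, and field below is made up for illustration:

```python
# Two "anonymized" datasets that share quasi-identifiers (ZIP, birth year).
medical = [
    {"zip": "02139", "birth_year": 1970, "diagnosis": "flu"},
    {"zip": "02139", "birth_year": 1985, "diagnosis": "asthma"},
]
voter_roll = [
    {"name": "Alice", "zip": "02139", "birth_year": 1970},
    {"name": "Bob", "zip": "94110", "birth_year": 1985},
]

def link(records, roll):
    """Join on quasi-identifiers; any unique match is re-identified."""
    out = []
    for r in records:
        matches = [v for v in roll
                   if v["zip"] == r["zip"] and v["birth_year"] == r["birth_year"]]
        if len(matches) == 1:  # exactly one candidate -> identity revealed
            out.append((matches[0]["name"], r["diagnosis"]))
    return out

print(link(medical, voter_roll))  # [('Alice', 'flu')]
```

No single dataset names anyone with a diagnosis, yet the join does — which is exactly the class of attack differential privacy is designed to resist.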

Regarding the comment about the necessity of negative comments: while I agree in general (and it's great for Apple to get real feedback), that doesn't mean you shouldn't highlight things that are genuinely positive, or that every point needs a negative angle with none of the positives alongside it. (Although, to be fair, Apple will supply plenty of the positive aspects for free.)
 
You and your children and your children's children can use all the technology of the future, and cross-compare data from other databases, to attack the de-identified data all you want. The best you can ever hope to accomplish is a ridiculously small *chance* that you guessed the right value of a variable for some user. But you can never be sure, thanks to the strong probabilistic guarantees afforded by differential privacy.

So, the big question on this for me is: where is the real, raw data kept and confined? My Mac or iPhone? If so, the data will, one hopes, die with the Mac or iPhone. But if there is raw data somewhere in the cloud, maybe it is private today, but a change of policy makes it not-so-private tomorrow, without my knowledge or permission. This has already happened: a company gathered data with privacy promises, but was sold or went bankrupt, and the inheritor of the data made no such guarantees. So before I can consider the data truly private, I need to know that the raw data isn't accessible in a server farm somewhere.
 
So, the big question on this for me is: where is the real, raw data kept and confined? My Mac or iPhone? If so, the data will, one hopes, die with the Mac or iPhone. But if there is raw data somewhere in the cloud, maybe it is private today, but a change of policy makes it not-so-private tomorrow, without my knowledge or permission. This has already happened: a company gathered data with privacy promises, but was sold or went bankrupt, and the inheritor of the data made no such guarantees. So before I can consider the data truly private, I need to know that the raw data isn't accessible in a server farm somewhere.

As has been mentioned by Apple, the raw data are all stored locally only.
 
Interesting article by a professor who works in cryptography:

http://blog.cryptographyengineering.com/2016/06/what-is-differential-privacy.html

Two points stood out to me, if I read it correctly:

1. The more that privacy is emphasized, the less useful the data is for a particular person.
2. The more queries, the less private the data can become.

His main conclusion was that Apple needs to be more transparent about exactly what kind of algorithms they're using, so that third parties can ascertain just how private the info actually is. Otherwise, we'll never know for sure.
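Point 2 above — that repeated queries erode privacy — is usually formalized as a privacy budget: a total ε split across queries, with each query's noise scaled to its share. A toy sketch using the standard Laplace mechanism; the budget of ε = 1.0 and the function names are illustrative assumptions, not anything Apple has published:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) by inverse-CDF."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def private_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Laplace mechanism: a counting query (sensitivity 1) plus
    Laplace(sensitivity / epsilon) noise is epsilon-differentially private."""
    return true_count + laplace_noise(sensitivity / epsilon)

# A fixed total budget split across k queries leaves epsilon/k per query,
# so the per-query noise scale grows linearly with the number of queries.
total_epsilon = 1.0
for k in (1, 10, 100):
    per_query_scale = 1.0 / (total_epsilon / k)
    print(k, round(per_query_scale, 1))
```

This is exactly the privacy/utility trade-off from point 1 as well: a smaller ε per query means stronger privacy but a noisier, less useful answer.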
 
"Differential privacy" is just an obtuse and hopefully anonymous method of data mining. For Apple's benefit, not ours.

Why not call it what it is? "Hopefully Anonymous User Unwanted Spying"
But I guess that wouldn't go over too well at the keynote.

Since "unwanted" is redundant, they could call it HAUS. It's a cool-sounding acronym, and it's "house" in German, so it's doubly delicious. :cool:
 
As has been mentioned by Apple, the raw data are all stored locally only.
Really, the only downside to this is that when you get a new device, it has to relearn everything. And if you have multiple devices, they won't necessarily share the same experience.
 
Interesting article by a professor who works in cryptography:

http://blog.cryptographyengineering.com/2016/06/what-is-differential-privacy.html

Two points stood out to me, if I read it correctly:

1. The more that privacy is emphasized, the less useful the data is for a particular person.
2. The more queries, the less private the data can become.

His main conclusion was that Apple needs to be more transparent about exactly what kind of algorithms they're using, so that third parties can ascertain just how private the info actually is. Otherwise, we'll never know for sure.

This article about sums up my expectations for what I expect is happening behind the scenes. http://simplystatistics.org/2016/06/14/ultimate-ai-battle/
 