Apple Touts 'Differential Privacy' Data Gathering Technique in iOS 10

Discussion in 'Politics, Religion, Social Issues' started by MacRumors, Jun 14, 2016.

  1. MacRumors macrumors bot

    MacRumors

    Joined:
    Apr 12, 2001
    #1


    With the announcement of iOS 10 at WWDC on Monday, Apple mentioned its adoption of "Differential Privacy" - a mathematical technique that allows the company to collect user information that helps it enhance its apps and services while keeping the data of individual users private.


    During the company's keynote address, Senior VP of software engineering Craig Federighi - a vocal advocate of personal privacy - summarized the concept in the following way:
    Wired has now published an article on the subject that lays out in clearer detail some of the practical implications and potential pitfalls of Apple's latest statistical data gathering technique.
    Wired notes that the technique is claimed to offer a mathematically "provable guarantee" that its generated data sets are impervious to outside attempts to de-anonymize the information. It does, however, caution that such complicated techniques rely on the rigor of their implementation to retain any guarantee of privacy during transmission.

    You can read the full article on the subject of differential privacy here.

    Note: Due to the political nature of the discussion regarding this topic, the discussion thread is located in our Politics, Religion, Social Issues forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

    Article Link: Apple Touts 'Differential Privacy' Data Gathering Technique in iOS 10
     
  2. MH01 macrumors G4

    MH01

    Joined:
    Feb 11, 2008
    #2
    Good that they are taking privacy seriously. Looking forward to some expert reviews of this approach.
     
  3. SSD-GUY macrumors 6502a

    Joined:
    Sep 20, 2012
    Location:
    London, UK
    #3
    Never thought I'd say this, but they've finally made all my years of learning stats for my Econ degree sound interesting!

    Quite intrigued to see how this actually works out. My guess is that they take this individual-level data but perhaps apply it on a macro scale? But I can't see it being completely unbreakable.
     
  4. nt5672 macrumors 65816

    Joined:
    Jun 30, 2007
    #4
    Me too, but what I have heard is that as long as you are doing the same as everyone else your privacy is protected, but if you stand out in any way then you can be identified.
     
  5. CFreymarc Suspended

    Joined:
    Sep 4, 2009
    #5
    While I don't know the exact technique they are using, it is common to use a "double blind" addressing technique to keep anonymity, making it impossible to trace the data back and identify someone. There are descriptions of this technique a search away.
     
  6. Amacfa macrumors 65816

    Amacfa

    Joined:
    May 22, 2009
    Location:
    D.C.
  7. Mac Fly (film) macrumors 65816

    Mac Fly (film)

    Joined:
    Feb 12, 2006
    Location:
    Ireland
    #7
    That's very good and all, but this is MacRumors (macRumors ;-)); I'm sure we can find a negative way to spin this.
     
  8. MH01 macrumors G4

    MH01

    Joined:
    Feb 11, 2008
    #8
    Are you fishing for these comments?? ;)

    Welcome to the Internet. I've yet to find a forum where it's just positive news....
     
  9. nt5672 macrumors 65816

    Joined:
    Jun 30, 2007
    #9
    Nothing ever progresses if all you have are positive comments. Have you ever heard the expression, "Tell me what I need to hear, not what I want to hear"? The question is whether the negative comments have merit; if so, and most do, then someone at Apple should be listening. We cannot count on the media to say what everyone is thinking, because they need access to Apple.
     
  10. omgitscro macrumors 6502a

    omgitscro

    Joined:
    Jul 12, 2008
    #10
    Background: my PhD advisor is a main contributor to the differential privacy literature, and my department overall has a few professors working on differential privacy. Although my own research doesn't deal with differential privacy, some of my past work has been in statistical privacy.

    Response to quoted text: while Apple is, without a doubt, anonymizing all identifiers in the data (i.e. your name, address, and other contact info is 100% certain to have been stripped), this does not describe what differential privacy does (rather, anonymizing data is a prerequisite for all practical data privacy methodology). Differential privacy provides a probabilistic guarantee on the data-masking algorithm: in layman's terms, if you have two datasets that differ only in one user, the outputs of the algorithm on the two datasets are indistinguishable in some precise sense. There are various ways to construct such an algorithm so that it is differentially private.
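
    To make that precise: a mechanism M is ε-differentially private if, for any two datasets D and D′ that differ in a single user, and for any set S of possible outputs, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S]. The classic textbook mechanism is randomized response. A minimal sketch in Swift, illustrating the general technique rather than Apple's actual (unpublished) implementation:

    Code:
    import Foundation

    // Randomized response: each user flips a coin before answering a
    // sensitive yes/no question. Any single report is deniable, yet the
    // true population proportion can still be estimated from the noise.
    // (Illustrative sketch only, not Apple's actual implementation.)

    /// Locally privatize one user's true answer.
    /// Heads: report the truth. Tails: report a second, independent flip.
    func randomizedResponse(_ truth: Bool) -> Bool {
        return Bool.random() ? truth : Bool.random()
    }

    /// Estimate the true proportion p from the noisy reports.
    /// E[observed] = 0.5 * p + 0.25, so p ≈ 2 * (observed - 0.25).
    func estimateTrueProportion(_ reports: [Bool]) -> Double {
        let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
        return 2 * (observed - 0.25)
    }

    // 100,000 simulated users, 30% of whom have the sensitive attribute.
    let truths = (0..<100_000).map { _ in Double.random(in: 0..<1) < 0.3 }
    print(estimateTrueProportion(truths.map(randomizedResponse))) // ≈ 0.3

    This scheme satisfies ε = ln 3: the probability of any given report is at most three times higher under one true answer than under the other.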

    The take-away is (and I'm addressing the other commenter): no, even if you are absolutely unique in the dataset, differential privacy guarantees you will be entirely indistinguishable. In other words, it is a guarantee that no attacker will ever be able to verify or determine the true value of any entry in the protected data (e.g. the value of any variable for any particular individual).

    Many argue that this concept, although an interesting mathematical tool, is too strong for use in practice, in that it cannot be practically implemented in any real-world scenario without removing all useful signal from the data. I can't name any companies or even government agencies that claim their data are algorithmically protected with differential privacy guarantees. What Apple has done here is truly revolutionary, and I sincerely doubt any of its competitors are close to being able to do what they're doing today. Maybe in a decade or two?
    --- Post Merged, Jun 14, 2016 ---
    See my other reply for a more detailed response. In particular, differential privacy is a guarantee that no matter how any attacker aggregates the data, there is no way to pick out individual values for any of the variables collected, for any user.
     
  11. H2SO4 macrumors 68040

    Joined:
    Nov 4, 2008
    #11
    This. Also sooner or later Apple will start looking more closely at people to see what each demographic does and wants. The line they are on now will either move or become blurred.
    In order to know what your customers want you have to know them. Period. Or you have to ask someone that does (which means setting up a phantom corporation to do the dirty work, or buying the info from someone that has done it already).
    In order to know what makes people spend money you have to study them. Period.
    Also, and this is a simplistic example: you have the choice not to collect data about someone's age in the first place, surely? You don't have to collect it and then find a way to obfuscate it.
     
  12. 2010mini macrumors 68040

    Joined:
    Jun 19, 2013
    #12
    So this is what Apple has been hard at work creating. I'm impressed.
     
  13. vmachiel macrumors 68000

    Joined:
    Feb 15, 2011
    Location:
    Holland
    #13
    I'm also happy that they are doing this. I'd rather they be a little bit behind, but with a much better implementation.
     
  14. macfacts macrumors 68000

    macfacts

    Joined:
    Oct 7, 2012
    Location:
    Cybertron
    #14
    So basically Apple is selling people's personal info.
     
  15. thermodynamic Suspended

    thermodynamic

    Joined:
    May 3, 2009
    Location:
    USA
    #15
    Only since the Great Lawsuit of 2010, or was it the other one from earlier?

    Apple cares first and foremost about profit for Apple. All this privacy stuff is manure until Apple is forced to act. Otherwise there would have been no need for a lawsuit, surely?
     
  16. MH01 macrumors G4

    MH01

    Joined:
    Feb 11, 2008
    #16
    Apple is using privacy as a selling point, I understand that, but if they make their hardware secure/private, we are winners. They are doing it because it = profit.
     
  17. Porco, Jun 14, 2016
    Last edited: Jun 14, 2016

    Porco macrumors 68030

    Porco

    Joined:
    Mar 28, 2005
    #17
    I think this is a very positive thing. I think it's the very antithesis of the likes of Facebook and Google's approach to using user data (and increasingly Microsoft's too). So frankly, even if it doesn't work at all, it's better than the alternatives - at least they are trying!

    I think even the most privacy-conscious users will concede there are useful types of data that can improve services and software when developers can access such data. Users' privacy is unfortunately sometimes the 'collateral damage' in that process, so it's great if Apple are finding ways to have the best of both worlds. I would guess it's only really possible if your target is improving the product for the user, rather than identifying the user in order to sell ads to them specifically, so it could become a real unique selling point for paid software development on iOS.

    Still, they need to be careful and very sure that it works. There have been lots of instances in the past where claims of 'anonymised data' have been proven to be trivially easy to de-anonymise.
     
  18. SSD-GUY macrumors 6502a

    Joined:
    Sep 20, 2012
    Location:
    London, UK
    #18
    But how will Apple transform the individual-level data into an aggregate form? Surely this transition point (from micro to macro data) is a weak point?
     
  19. omgitscro macrumors 6502a

    omgitscro

    Joined:
    Jul 12, 2008
    #19
    Read my reply to the second quoted comment, which I think answers your question.
    --- Post Merged, Jun 14, 2016 ---
    No, they are not.
     
  20. 69Mustang macrumors 601

    69Mustang

    Joined:
    Jan 7, 2014
    Location:
    In between a rock and a hard place
    #20
    I am cautiously optimistic this will function as Apple hopes it does. If so, it will allow Apple and the surrounding ecosystem to utilize and monetize the collected data while maintaining user privacy. Like you, I think they need to tread slowly and carefully. 9to5Mac has a more cautious take on the subject: http://9to5mac.com/2016/06/14/differential-privacy-security-questioned/
    Relevant portion: "...Matthew Green, a cryptography professor at Johns Hopkins University, was tweeting skeptically about it, describing the approach as untested."

    When queried, Green said that existing implementations of differential privacy had needed to compromise privacy to obtain accurate data.

    “The question is, what kind of data, and what kind of measurements are they applying it to, and what are they doing with it,” Green told Gizmodo. “It’s a really neat idea, but I’ve never really seen it deployed. It ends up being a tradeoff between accuracy of the data you are collecting and privacy.”

    “The accuracy goes down as the privacy goes up, and the tradeoffs I’ve seen have never been all that great,” Green continued. “[Again] I’ve never really heard of anyone deploying it in a real product before. So if Apple is doing this they’ve got a custom implementation, and they made all the decisions themselves.”
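
    To illustrate the tradeoff Green describes, here is a sketch in Swift of the Laplace mechanism, the standard construction from the differential privacy literature. Apple's custom implementation is not public, so this is only the textbook version, with arbitrary example epsilon values:

    Code:
    import Foundation

    // Laplace mechanism: add noise drawn from Laplace(sensitivity / epsilon)
    // to a query answer. Smaller epsilon means stronger privacy but noisier
    // answers, which is exactly the tradeoff Green describes.
    // (Textbook sketch, not Apple's implementation.)

    /// Draw one sample from a zero-centered Laplace distribution.
    func laplaceSample(scale: Double) -> Double {
        let u = Double.random(in: -0.5..<0.5)
        let sign = u < 0 ? -1.0 : 1.0
        return -scale * sign * log(1 - 2 * abs(u)) // inverse-CDF sampling
    }

    /// Privatize a count query (sensitivity 1) under epsilon-DP.
    func privateCount(_ trueCount: Int, epsilon: Double) -> Double {
        return Double(trueCount) + laplaceSample(scale: 1.0 / epsilon)
    }

    print(privateCount(1000, epsilon: 1.0))  // ~1000, typically off by a few
    print(privateCount(1000, epsilon: 0.01)) // ~1000, off by around a hundred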

    It's a step. Hopefully a good one. This WWDC was weird. They concentrated on letting everyone know about a lot of fluff, but the relevant changes went nearly unmentioned. I understand the gen pop digs emoji, but this, deleting stock apps, removing the Siri Remote requirement from ATV gaming, and others will be far more important to Apple keeping its success.
     
  21. macs4nw macrumors 601

    macs4nw

    #21
    The article says Apple collects/aggregates the data employing 'Differential Privacy'. Nowhere did I see that Apple, beyond using this collected data to improve its own products and services, is selling this info to third parties.

    Please correct me if I'm wrong.
     
  22. JohnnyGo macrumors 6502a

    Joined:
    Sep 9, 2009
    #22
    No they're not!

    Instead they seem to be able to use the COLLECTIVE data from all of their users to help/aid/assist each one individually. Potential examples:

    - you search for a restaurant in a particular area or cuisine, and you receive the list ordered by how many other users searched for, drove to, or marked said restaurants

    - you are typing the name of a recently newsworthy politician, and the predictive keyboard in iOS will suggest said individual (trending)

    That's what I understood is being done: AI powered by collective/anonymous data gathering.
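
    If that reading is right, the server side could be as simple as ranking suggestions by noisy aggregate counts. A hypothetical sketch in Swift (all names and numbers are invented for illustration):

    Code:
    // Rank suggestions by counts that were already privatized on-device,
    // so the ordering reflects the crowd while no single user's activity
    // is recoverable. All values below are invented.
    let noisyVisitCounts: [String: Double] = [
        "Restaurant A": 812.4,
        "Restaurant B": 1203.9,
        "Restaurant C": 455.1,
    ]
    let suggestions = noisyVisitCounts
        .sorted { $0.value > $1.value }
        .map { $0.key }
    print(suggestions) // ["Restaurant B", "Restaurant A", "Restaurant C"]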
     
  23. JohnnyGo macrumors 6502a

    Joined:
    Sep 9, 2009
    #23
    Correct. Using such data is definitely not selling the data.
     
  24. Tubamajuba macrumors 68000

    Joined:
    Jun 8, 2011
    #24
    You might have missed it, but there was an article to read that actually explains it. It's in the first post.
     
  25. now i see it macrumors 65816

    Joined:
    Jan 2, 2002
    #25
    "Differential privacy" is just an obtuse and hopefully anonymous method of data mining. For Apple's benefit, not ours.

    Why not call it what it is? "Hopefully Anonymous User Unwanted Spying"
    But I guess that wouldn't go over too well at the keynote.
     
