Apple Opens Cryptographic Libraries to Third-Party Developers to Encourage Security

Discussion in 'Mac Blog Discussion' started by MacRumors, Oct 30, 2015.

  1. MacRumors macrumors bot

    MacRumors

    Joined:
    Apr 12, 2001
    #1


    Apple announced yesterday that it has opened up its cryptographic libraries so that third-party developers can build more "advanced security features" into their apps (via VentureBeat). The cryptographic libraries being opened to developers are the same ones Apple uses to protect iOS and OS X, as Apple notes on its updated site.


    Developers will have access to two of the company's advanced security features: Security Framework and Common Crypto. Security Framework gives developers tools for managing certificates, public and private keys, and trust policies, and ensures that sensitive information is stored in a "secure repository for sensitive user data." The Common Crypto library provides additional support for symmetric encryption, hash-based message authentication codes, and digests.
    Check out Apple's official website for reference sheets, service guides, and links to the open source releases for Security Framework and Common Crypto libraries.
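    For readers who haven't used these services: the primitives Common Crypto covers (digests and HMACs, alongside symmetric encryption) behave like their counterparts in any crypto toolkit. A minimal sketch using Python's standard library as a cross-platform stand-in, not Apple's actual C API:

```python
import hashlib
import hmac

# A message digest: a fixed-size fingerprint of the input.
message = b"attack at dawn"
digest = hashlib.sha256(message).hexdigest()
print(digest)  # 64 hex characters = 256 bits

# A hash-based message authentication code (HMAC): a keyed digest, so only
# holders of the shared key can produce or verify the tag.
key = b"shared-secret"
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Verify with a constant-time comparison to avoid timing side channels.
ok = hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).hexdigest())
print(ok)  # True
```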

    Article Link: Apple Opens Cryptographic Libraries to Third-Party Developers to Encourage Security
     
  2. ArtOfWarfare macrumors G3

    ArtOfWarfare

    Joined:
    Nov 26, 2007
    #2
    Hopefully we have more whitehats than blackhats looking at this stuff...
     
  3. jayducharme macrumors 68040

    jayducharme

    Joined:
    Jun 22, 2006
    Location:
    The thick of it
    #3
    That was my thought as well. Plus, could this knowledge enable a small "back door" that the government has been pestering Apple about?
     
  4. konqerror macrumors 6502

    Joined:
    Dec 31, 2013
    #4
    I don't think these are "advanced" capabilities; they're really intended for applications that don't use Cocoa for whatever reason. It looks like Apple wants to replace OpenSSL by providing a low-level C interface to native crypto.
     
  5. SpinThis!, Oct 30, 2015
    Last edited: Oct 30, 2015

    SpinThis! macrumors 6502

    Joined:
    Jan 30, 2007
    Location:
    Inside the Machine (Green Bay, WI)
    #5
    No. It doesn't matter. Good security isn't based on obscurity. The security we have today is based on our collective knowledge of mathematics. It's good that Apple is opening this up. If developers need to do secure hashing or whatnot, it's better to use a tried and tested crypto algorithm than to try to roll your own.
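    For example, secure password hashing is exactly the kind of thing to take off the shelf. A minimal sketch using Python's standard library (not one of Apple's APIs) with PBKDF2, a standard, well-studied construction:

```python
import hashlib
import os

# Derive a password hash with PBKDF2-HMAC-SHA256 rather than inventing
# a homemade scheme.
password = b"correct horse battery staple"
salt = os.urandom(16)          # a fresh random salt per password
iterations = 200_000           # deliberately slow, to resist brute force

derived = hashlib.pbkdf2_hmac("sha256", password, salt, iterations)
print(derived.hex())

# Verifying later: re-derive with the stored salt and compare.
check = hashlib.pbkdf2_hmac("sha256", password, salt, iterations)
assert check == derived
```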
     
  6. KALLT macrumors 601

    Joined:
    Sep 23, 2008
    #6
    El Capitan now uses LibreSSL for its secure shell tooling. They are at least looking at different options.
     
  7. ArtOfWarfare macrumors G3

    ArtOfWarfare

    Joined:
    Nov 26, 2007
    #7
    No, as others said, security comes from having good practices, not by keeping your practices secret.

    Linux is generally regarded as the most secure platform, even though it's completely open source. Lots of people have read through the code looking for weaknesses, and lots of people have contributed fixes for any weaknesses they find.
     
  8. RabidMacFan macrumors regular

    Joined:
    Jun 19, 2012
    Location:
    California
    #8
    This seems misreported. The only thing new here is the source code for corecrypto, and it does not appear to be meant to let third-party developers implement new security APIs. From the page itself:

    Both Security Framework and Common Crypto rely on the corecrypto library to provide implementations of low level cryptographic primitives. This is also the library submitted for validation of compliance with U.S. Federal Information Processing Standards (FIPS) 140-2 Level 1. Although corecrypto does not directly provide programming interfaces for developers and should not be used by iOS or OS X apps, the source code is available to allow for verification of its security characteristics and correct functioning.

    (emphasis is mine)

    From what I can understand, there are no new usable libraries or APIs here. OS X and iOS developers can use Apple's existing Security Framework and Common Crypto services as they always have.

    What's new is that the source code for the corecrypto library that Security Framework and Common Crypto use is now available. This lets developers and security professionals better understand what the existing frameworks have been doing in the background.

    One of the benefits of this release is to allow auditing of the source code, and to give assurance to developers with a "trust no one" attitude that the built-in libraries are secure and safe to use.
     
  9. Sasparilla macrumors 6502a

    Joined:
    Jul 6, 2012
    #10
    Generally, more visibility leads to flaws being found and a more secure system, not less. Also, Apple seems to say no to the Feds for that kind of stuff, so I doubt it (that being said, you can always roll your own security if you don't trust it). As for Apple working with the government (it's possible), you don't see articles like the following about Microsoft (which is known to have worked hand in glove with the NSA):

    https://theintercept.com/2015/03/10/ispy-cia-campaign-steal-apples-secrets/

    Seems like Apple really does say no to the govt. I think the questions are whether they've been compromised despite saying no (the article above), or at what point the government secretly forces them to do stuff they don't want to and can't talk about. If I were the govt, I'd go after the compiler and/or the firmware images.
     
  10. .max macrumors member

    .max

    Joined:
    Feb 24, 2009
    #11
    I agree. The misinterpretation is in this phrase in the article: "Developers will have access to two of the company's advanced security features". Developers have had access to these features for years.

    What's different is that now, for example, they can use the open source code and check for themselves that there are no backdoors. If Apple has backdoors in the system (or were forced to add them in the future), apps compiled from the open source code would be at less risk.
     
  11. nt5672 macrumors 68000

    Joined:
    Jun 30, 2007
    #12
    No, releasing the source code is no guarantee that Apple is actually using said source code. If a three-letter agency tells Apple to add something, it can still be added and we won't know. We can't rely on checksums of the library, because different compiler versions and options produce different binaries.

    If Apple wanted to be really secure, they would publish the expected checksums (MD5 or, better, SHA-256) of the production build for each version of the library. That way it could be independently verified both in the OS and against the open source repository. While the code does include an Xcode project file, it remains to be seen whether Apple actually uses this project file to build the library.
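    The verification step described above is easy to sketch. A Python illustration with a stand-in file and a pretend published digest (in practice the path would be the installed library and the digest would come from Apple; both are hypothetical here):

```python
import hashlib

def file_digest(path, algo="sha256", chunk=1 << 16):
    """Hash a file in chunks so large binaries aren't loaded into memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Demo with a stand-in file instead of a real system library.
path = "demo_library.bin"
with open(path, "wb") as f:
    f.write(b"\x00" * 1024)

published = file_digest(path)           # pretend this digest came from Apple
assert file_digest(path) == published   # the installed copy matches
```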
     
  12. Bob Zimmerman macrumors member

    Joined:
    Aug 31, 2015
    #13
    Linux's security reputation is actually pretty bad. The BSDs and other UNIX variants have a good reputation. OpenBSD in particular is highly regarded, in part because it audits changes stringently. Windows is getting better, and Linux is getting significantly worse.

    OpenSSL is an extremely strong counterexample to the idea that open source is inherently more secure. It had tremendous numbers of very serious flaws that sat in it for years. Just because people can look at the code doesn't mean they do. That's why the OpenBSD foundation forked OpenSSL, removed tons of options, and started developing it under their audit model as LibreSSL. It's also why the OpenBSD developers recently replaced sudo with a new tool called doas that has far fewer options and, as a result, far less that can go wrong.

    In my experience, much software developed for Linux is built on the platform because it's free and lets developers work quickly. Tons of open-source code runs on Linux, so you just download a bunch of libraries and write glue code to make them do what you want. Unfortunately, many developers don't take the time to set up their application to run properly under a non-root user account. For that matter, the recommended installation method for a lot of software now is to run curl to fetch a URL and pipe the output into a root shell. That is literally telling your system to do whatever some web server (or anything claiming to be that server) tells it to do.
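    A safer alternative to piping curl into a root shell is to save the download, compare its digest against a value obtained over a separate trusted channel, and only then run it. A minimal Python sketch, using a locally written stand-in for the downloaded installer (the file and function names are hypothetical):

```python
import hashlib
import subprocess
import sys

def run_if_verified(script_path, expected_sha256):
    """Execute a downloaded script only if its digest matches a value
    obtained over a separate, trusted channel."""
    with open(script_path, "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()
    if actual != expected_sha256:
        raise RuntimeError(f"digest mismatch: {actual}")
    return subprocess.run([sys.executable, script_path], check=True)

# Demo: a stand-in "installer" written locally instead of fetched with curl.
with open("installer.py", "w") as f:
    f.write("print('installing...')\n")

# In practice the expected digest would be published by the author.
expected = hashlib.sha256(open("installer.py", "rb").read()).hexdigest()
run_if_verified("installer.py", expected)  # runs only because the digest matches
```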
     
  13. ArtOfWarfare macrumors G3

    ArtOfWarfare

    Joined:
    Nov 26, 2007
    #14
    Teehee, that's how I tell people to install a mod for StarCraft 2 on OS X.

    Just run
    Code:
    python <(curl -s url)
    And it automatically runs the latest version of my Python script (hosted on my server), which searches your entire file system for your StarCraft 2 install, then installs all the latest files for the mod (not stored on my server; I didn't make the mod, just the OS X installer).
     
  14. usarioclave macrumors 65816

    Joined:
    Sep 26, 2003
    #15
    Well, I can tell you that obscurity is one major part of security.

    The whole "security by obscurity is bad" idea was probably started by the NSA decades ago. Half of the security problem should be figuring out what the target is doing; ask a lockpicker how much easier their job is if they know the mechanism and model of the lock they're going to be working on. Ask anyone who cracks software for a living whether knowing specific information about the target is useful.

    That's not to say auditing is bad; it isn't. But the idea of just putting your encryption and security regime out there is stupid. Make the attacker do some work, for god's sake.
     
  15. CreatorCode macrumors regular

    CreatorCode

    Joined:
    Apr 15, 2015
    Location:
    US
    #16
    Apple is using standard algorithms (AES, SHA-2, and such) that everybody already knows about. They're as good as it gets; Apple can't improve on them. It just has to implement them without screwing up.

    The only risk comes if Apple has, in fact, screwed up. And the best way to guard against that is for Apple to show its work and let researchers and programmers examine the implementations inside and out.
     
  16. BigPrince macrumors 68020

    Joined:
    Dec 27, 2006
    #17
  17. Parasprite macrumors 68000

    Parasprite

    Joined:
    Mar 5, 2013
    #18
    The actual quote in the citation:

    "System security should not depend on the secrecy of the implementation or its components."

    In other words, hiding the code doesn't replace good security practices.
     
  18. ChrisA macrumors G4

    Joined:
    Jan 5, 2006
    Location:
    Redondo Beach, California
    #19
    No. The best locks are the ones where everyone knows how they work and can take them apart and see the insides.

    Think about what would happen if this were not the case: if a lock depended on some secret design trick, then you'd never really know whether the secret had leaked out or some smart person had guessed it. But if there is NO SECRET, then there can never be a leak.

    So Apple and others place all the design details out in the open for all to see.
     
  19. sudo1996, Nov 1, 2015
    Last edited: Nov 1, 2015

    sudo1996 Suspended

    sudo1996

    Joined:
    Aug 21, 2015
    Location:
    Berkeley, CA, USA
    #20
    Am I missing something, or is there nothing reassuring about them making this open source? Just because Apple shows us some source code and says it's the source to their crypto libraries doesn't mean we know that code was used to compile the shipping libraries.

    What OS X and iOS actually run could be anything, unless I'm missing something. I remember there was an "open-source" Mac Trojan horse a while back: the source code was innocent, but a malicious binary was distributed with it, and people trusted and ran it.
     
  20. sudo1996, Nov 1, 2015
    Last edited: Nov 1, 2015

    sudo1996 Suspended

    sudo1996

    Joined:
    Aug 21, 2015
    Location:
    Berkeley, CA, USA
    #21
    I think that in the short term, security through obscurity actually works best. It's a huge PITA to find flaws in closed-source code, which discourages hackers or at least slows their progress. That being said, I would NOT vouch for it, because it's a horrible long-term solution for a secure system and does not contribute to the computer science community (partially a "greater good" argument, I know).
     
  21. xbjllb macrumors 65816

    xbjllb

    Joined:
    Jan 4, 2008
  22. ArtOfWarfare macrumors G3

    ArtOfWarfare

    Joined:
    Nov 26, 2007
    #23
    I think you misunderstand me. A whitehat person looking at this will discover the vulnerability, tell Apple about it, give Apple 90 days to patch it, and then publicly talk about it.

    A blackhat person looking at it will discover the vulnerability and simply exploit it.

    A whitehat person is actually useful for patching the vulnerability. A blackhat person would expose the vulnerability only as a side effect of their criminal actions, and perhaps only if the breach is even detected.

    I know that open is better than closed. I'm just saying, hopefully any vulnerabilities found by this code moving from closed to open get found by whitehats before blackhats.
     
  23. ChrisA macrumors G4

    Joined:
    Jan 5, 2006
    Location:
    Redondo Beach, California
    #24
    In theory, you could verify that the published source code is in fact what's running: compile the source to a binary and compare it with the shipped binaries. It would take some effort, but it certainly could be done.

    You could also disassemble (convert the binary to assembly) the crypto library and look at what the code is doing and verify that it is doing about the same job as the C source code.

    Both methods are labor-intensive, because you don't get a byte-for-byte match even when everything is as it should be. Even so, I bet someone has already looked, and we'd hear quickly if we were being tricked.
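    That comparison could start with something as simple as hashing both artifacts and, when they differ, measuring how far apart they are. A rough Python sketch with made-up file names; real verification would diff the disassembly, since embedded timestamps and link order defeat byte-for-byte matching:

```python
import hashlib

def summarize_diff(path_a, path_b):
    """Hash two binaries; if the hashes differ, count mismatched bytes."""
    with open(path_a, "rb") as f:
        a = f.read()
    with open(path_b, "rb") as f:
        b = f.read()
    if hashlib.sha256(a).digest() == hashlib.sha256(b).digest():
        return 0
    # Mismatches over the common prefix, plus any difference in length.
    return sum(x != y for x, y in zip(a, b)) + abs(len(a) - len(b))

# Demo with stand-in "builds" that differ only in an embedded timestamp,
# the sort of benign difference that breaks a naive byte-for-byte check.
with open("build_shipped.bin", "wb") as f:
    f.write(b"CODE" * 100 + b"2015-10-30")
with open("build_local.bin", "wb") as f:
    f.write(b"CODE" * 100 + b"2015-11-01")

print(summarize_diff("build_shipped.bin", "build_local.bin"))  # only a few bytes differ
```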
     
  24. ChrisA macrumors G4

    Joined:
    Jan 5, 2006
    Location:
    Redondo Beach, California
    #25
    Not much. Even if the code were kept closed, people can look at the running binary code; it's not that hard to read. In the old days I used to write COBOL for IBM mainframes, and it was common to get a "core dump": a printed hexadecimal dump of the computer's RAM. We'd plow through it with a pencil and figure out what went wrong. We did not have debuggers. Anyone can still do this if they take the time. The Intel instruction set is more complex than the IBM 360's was, but not by much.

    Certainly the average user can't read a hex dump of a closed source crypto library but many people can. So closing it just makes it harder to read, not impossible.

    That said, there are systems where the executable code itself is encrypted. Those are completely unreadable, and I'd worry there are backdoors and whatever in there.
     
