
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,547
30,863



Apple announced yesterday that it has opened up its cryptographic libraries so that third-party developers can build more "advanced security features" into their apps (via VentureBeat). The cryptographic libraries being opened to developers are the same ones Apple uses to protect iOS and OS X, as the company notes on its updated site.


Developers will have access to two of the company's advanced security features: Security Framework and Common Crypto. Security Framework gives developers tools for managing certificates, public and private keys, and trust policies, and ensures that sensitive information is stored in a "secure repository for sensitive user data." The Common Crypto library provides additional support for symmetric encryption, hash-based message authentication codes (HMACs), and digests.
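For a sense of what the Common Crypto interface looks like in practice, here is a minimal C sketch that computes a SHA-256 digest with a single call (the sample message is made up for illustration):

Code:
/* Minimal sketch: one-shot SHA-256 digest via Common Crypto.
   Compile on OS X with: clang sha_demo.c -o sha_demo */
#include <CommonCrypto/CommonDigest.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *message = "hello, corecrypto";   /* arbitrary sample input */
    unsigned char digest[CC_SHA256_DIGEST_LENGTH];

    /* One call hashes the whole buffer and fills digest[]. */
    CC_SHA256(message, (CC_LONG)strlen(message), digest);

    for (int i = 0; i < CC_SHA256_DIGEST_LENGTH; i++)
        printf("%02x", digest[i]);
    printf("\n");
    return 0;
}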
Both Security Framework and Common Crypto rely on the corecrypto library to provide implementations of low level cryptographic primitives. This is also the library submitted for validation of compliance with U.S. Federal Information Processing Standards (FIPS) 140-2 Level 1. Although corecrypto does not directly provide programming interfaces for developers and should not be used by iOS or OS X apps, the source code is available to allow for verification of its security characteristics and correct functioning.
Check out Apple's official website for reference sheets, service guides, and links to the open source releases of the Security Framework and Common Crypto libraries.

Article Link: Apple Opens Cryptographic Libraries to Third-Party Developers to Encourage Security
 

konqerror

macrumors 68020
Dec 31, 2013
2,298
3,700
I don't think these are "advanced" capabilities; they're really intended for applications that don't use Cocoa for whatever reason. It looks like Apple wants to replace OpenSSL by providing a low-level C interface to native crypto.
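To illustrate: the calls map pretty directly onto what people reach for OpenSSL's HMAC() to do today. A minimal sketch of an HMAC-SHA256 through the C interface (the key and message here are placeholders, not real values):

Code:
/* Sketch: HMAC-SHA256 via Common Crypto's low-level C interface,
   roughly the niche OpenSSL's HMAC() fills today.
   Key and message are placeholders for illustration. */
#include <CommonCrypto/CommonHMAC.h>
#include <CommonCrypto/CommonDigest.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *key = "not-a-real-key";           /* placeholder key */
    const char *msg = "message to authenticate";  /* placeholder data */
    unsigned char mac[CC_SHA256_DIGEST_LENGTH];

    /* One call computes the MAC over the whole buffer. */
    CCHmac(kCCHmacAlgSHA256, key, strlen(key), msg, strlen(msg), mac);

    for (int i = 0; i < CC_SHA256_DIGEST_LENGTH; i++)
        printf("%02x", mac[i]);
    printf("\n");
    return 0;
}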
 

SpinThis!

macrumors 6502
Jan 30, 2007
480
135
Inside the Machine (Green Bay, WI)
That was my thought as well. Plus, could this knowledge enable a small "back door" that the government has been pestering Apple about?
No. It doesn't matter. Good security isn't based on obscurity. The security we rely on today is based on our collective knowledge of mathematics. It's good that Apple is opening this up. If developers need to do secure hashing or whatnot, it's better to use a tried and tested crypto algorithm than to roll your own.
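For example, instead of hand-rolling AES, a single CCCrypt() call does the job. A rough sketch follows; the hard-coded key and zero IV are placeholders only, since real code would derive the key properly and use a random IV:

Code:
/* Sketch: AES-128-CBC encryption via Common Crypto's CCCrypt().
   The hard-coded key and zero IV are for illustration only. */
#include <CommonCrypto/CommonCryptor.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *plaintext = "attack at dawn";
    unsigned char key[kCCKeySizeAES128] = "0123456789abcde"; /* 15 chars + NUL = 16 bytes */
    unsigned char iv[kCCBlockSizeAES128] = {0};  /* demo only: never reuse a fixed IV */
    unsigned char ciphertext[64];
    size_t outLen = 0;

    CCCryptorStatus status = CCCrypt(kCCEncrypt, kCCAlgorithmAES128,
                                     kCCOptionPKCS7Padding,
                                     key, kCCKeySizeAES128, iv,
                                     plaintext, strlen(plaintext),
                                     ciphertext, sizeof(ciphertext), &outLen);
    if (status != kCCSuccess) {
        fprintf(stderr, "CCCrypt failed: %d\n", (int)status);
        return 1;
    }
    printf("%zu bytes of ciphertext\n", outLen);
    return 0;
}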
 
Last edited:

ArtOfWarfare

macrumors G3
Nov 26, 2007
9,561
6,059
That was my thought as well. Plus, could this knowledge enable a small "back door" that the government has been pestering Apple about?

No, as others said, security comes from having good practices, not by keeping your practices secret.

Linux is generally regarded as the most secure platform, even though it's completely open source. Lots of people have read through the code looking for weaknesses, and lots of people have contributed fixes for any weaknesses they find.
 

RabidMacFan

macrumors 6502
Jun 19, 2012
356
170
California
This seems misreported. The only thing new here is the source code for corecrypto. This does not appear to be intended to let third-party developers implement new security APIs. From the page itself:

Both Security Framework and Common Crypto rely on the corecrypto library to provide implementations of low level cryptographic primitives. This is also the library submitted for validation of compliance with U.S. Federal Information Processing Standards (FIPS) 140-2 Level 1. Although corecrypto does not directly provide programming interfaces for developers and should not be used by iOS or OS X apps, the source code is available to allow for verification of its security characteristics and correct functioning.

(emphasis is mine)

From what I can understand, there are no new usable libraries or APIs here. OS X and iOS developers can use Apple's existing Security Framework and Common Crypto services just as they always have.

What's new is that the source code for the corecrypto library that Security Framework and Common Crypto use is now available. This lets developers and security professionals better understand what the existing frameworks were already doing in the background.

One of the benefits of this release is to allow auditing of the source code, and to give assurance to developers with a "trust no one" attitude that the built-in libraries are secure and safe to use.
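And to be clear, the interfaces themselves are unchanged. Pulling secure random bytes out of the Security framework, for instance, still looks the way it always has (a quick sketch, nothing new here):

Code:
/* Sketch: an existing Security framework API, unchanged by this
   release -- fill a buffer with cryptographically secure random bytes.
   Compile with: clang rand_demo.c -framework Security -o rand_demo */
#include <Security/SecRandom.h>
#include <stdio.h>

int main(void) {
    unsigned char buf[16];

    /* Returns 0 on success. */
    if (SecRandomCopyBytes(kSecRandomDefault, sizeof(buf), buf) != 0) {
        fprintf(stderr, "SecRandomCopyBytes failed\n");
        return 1;
    }
    for (size_t i = 0; i < sizeof(buf); i++)
        printf("%02x", buf[i]);
    printf("\n");
    return 0;
}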
 

Sasparilla

macrumors 68000
Jul 6, 2012
1,962
3,378
That was my thought as well. Plus, could this knowledge enable a small "back door" that the government has been pestering Apple about?

Generally, more visibility leads to flaws being found and a more secure system, not less. Also, Apple seems to say no to the Feds for that kind of stuff, so I doubt it (that being said, you can always roll your own security if you don't trust it). As for Apple working with the government, it's possible, but you don't see articles like the following about Microsoft (which is known to have worked hand in glove with the NSA):

https://theintercept.com/2015/03/10/ispy-cia-campaign-steal-apples-secrets/

Seems like Apple really does say no to the govt. I think the questions are whether they've been compromised despite saying no (the article above), and at what point the government secretly forces them to do stuff they don't want to do and can't talk about. If I were the govt, I'd go after the compiler and/or the firmware images.
 

.max

macrumors member
Feb 24, 2009
57
78
This seems misreported.

I agree. The misinterpretation is in this phrase in the article: "Developers will have access to two of the company's advanced security features". Developers have had access to these features for years.

What's different is that now, for example, they can use the open source code and be sure that there are no backdoors. If Apple has (or were forced in the future to add) backdoors in the system, apps compiled with the open source code would be at less risk.
 
  • Like
Reactions: CreatorCode

nt5672

macrumors 68040
Jun 30, 2007
3,334
7,014
Midwest USA
. . . .

What's different is that now, for example, they can use the open source code and be sure that there are no backdoors. . . .
No, releasing the source code is no guarantee that Apple is actually using said source code. If a three-letter agency tells Apple to add something, it can still be added and we won't know. We can't rely on checksums of the library, because compiler options alone could produce differences.

If Apple wanted to make this truly verifiable, it would publish the expected checksums (MD5 or better) for each production build of the library. That way the binary could be independently verified against both the OS and the open source repository. While the release does include an Xcode project file, it remains to be seen whether Apple actually uses that project file to build the shipping library.
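Anyone could then check the shipping binary along these lines (a sketch; the path below is hypothetical, not a claim about where corecrypto actually lives on disk, and the output would be compared against the published value):

Code:
/* Sketch: compute a SHA-256 checksum of a library binary for
   comparison against a (hypothetical) published value. */
#include <CommonCrypto/CommonDigest.h>
#include <stdio.h>

int main(void) {
    const char *path = "/usr/lib/libexample.dylib";  /* hypothetical path */
    FILE *f = fopen(path, "rb");
    if (!f) { perror(path); return 1; }

    /* Stream the file through the digest in chunks. */
    CC_SHA256_CTX ctx;
    CC_SHA256_Init(&ctx);

    unsigned char buf[4096];
    size_t n;
    while ((n = fread(buf, 1, sizeof(buf), f)) > 0)
        CC_SHA256_Update(&ctx, buf, (CC_LONG)n);
    fclose(f);

    unsigned char digest[CC_SHA256_DIGEST_LENGTH];
    CC_SHA256_Final(digest, &ctx);

    for (int i = 0; i < CC_SHA256_DIGEST_LENGTH; i++)
        printf("%02x", digest[i]);
    printf("  %s\n", path);
    return 0;
}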
 

Bob Zimmerman

macrumors member
Aug 31, 2015
64
86
No, as others said, security comes from having good practices, not by keeping your practices secret.

Linux is generally regarded as the most secure platform, even though it's completely open source. Lots of people have read through the code looking for weaknesses, and lots of people have contributed fixes for any weaknesses they find.
Linux's security reputation is actually pretty bad. The BSDs and other UNIX variants have a good reputation. OpenBSD in particular is highly regarded, in part because they audit changes rather stringently. Windows is getting better, and Linux is getting significantly worse.

OpenSSL is an extremely strong counterexample to the idea that open source is inherently more secure. It contained large numbers of very serious flaws that had been in it for years. Just because people can look at the code doesn't mean they do. That's why the OpenBSD foundation forked OpenSSL, removed tons of options, and started developing it under their audit model as LibreSSL. It's also why the OpenBSD guys recently replaced sudo with a new tool called doas that has far fewer options and, as a result, far less that can go wrong.

In my experience, much software developed for Linux is built on the platform because it's free and it lets the developers work very quickly. Tons of open-source code runs on Linux, so you just have to download a bunch of libraries and write glue code to get them to do what you want. Unfortunately, many don't take the time to set up their application to run properly under a non-root user account. For that matter, the recommended installation method for a lot of software now is to run curl to fetch a URL, then pipe the output to a root-level bash shell. That is literally telling your system to do whatever some web server or anything claiming to be that server tells it to do.
 

ArtOfWarfare

macrumors G3
Nov 26, 2007
9,561
6,059
For that matter, the recommended installation method for a lot of software now is to run curl to fetch a URL, then pipe the output to a root-level bash shell. That is literally telling your system to do whatever some web server or anything claiming to be that server tells it to do.
Teehee, that's how I tell people to install a mod for StarCraft 2 on OS X.

Just run
Code:
python <(curl url)

And it automatically runs the latest version of my Python script (hosted on my server), which searches your entire file system for your StarCraft 2 install, then installs all the latest files for the mod (not stored on my server; I didn't make the mod, just the OS X installer).
 
  • Like
Reactions: sudo1996

usarioclave

macrumors 65816
Sep 26, 2003
1,447
1,506
No. It doesn't matter. Good security isn't based on obscurity.

Well, I can tell you that obscurity is one major part of security.

The whole "security by obscurity is bad" idea was probably started by the NSA decades ago. Half of an attacker's work should be figuring out what the target is doing; ask a lockpicker how much easier their job is if they know the make and model of the lock they're going to be working on. Ask anyone who cracks software for a living whether knowing specific information about the target is useful.

That's not to say auditing is bad; it isn't. But the idea of just putting your encryption and security regime out there is stupid. Make the attacker do some work, for god's sake.
 

CreatorCode

macrumors regular
Apr 15, 2015
159
279
US
Well, I can tell you that obscurity is one major part of security.

The whole "security by obscurity is bad" idea was probably started by the NSA decades ago. Half of an attacker's work should be figuring out what the target is doing; ask a lockpicker how much easier their job is if they know the make and model of the lock they're going to be working on. Ask anyone who cracks software for a living whether knowing specific information about the target is useful.

That's not to say auditing is bad; it isn't. But the idea of just putting your encryption and security regime out there is stupid. Make the attacker do some work, for god's sake.

Apple is using standard algorithms (AES, SHA-2, and such) that everybody already knows about. They're as good as it gets; Apple can't improve on them. It just has to implement them without screwing them up.

The only risk comes if Apple has, in fact, screwed up. And the best way to avoid that is for Apple to show its work and let researchers and programmers examine the implementations inside and out.
 

ChrisA

macrumors G5
Jan 5, 2006
12,581
1,695
Redondo Beach, California
Hopefully we have more whitehats than blackhats looking at this stuff...

No. The best locks are the ones where everyone knows how they work and can take them apart and see the insides.

Think about what would happen if this were not the case: if a lock depended on some secret design trick, you'd never really know whether the secret had leaked out or some smart person had guessed it. But if there is NO SECRET, then there can never be a leak.

So Apple and others place all the design details out in the open for all to see.
 

sudo1996

Suspended
Aug 21, 2015
1,496
1,182
Berkeley, CA, USA
Am I missing something, or is there nothing reassuring about them making this open source? Just because Apple shows us some source code and says it's the source of their crypto libraries doesn't mean we know that code is what was actually compiled into the shipping libraries.

What OS X and iOS actually run could be anything, unless I'm missing something. I remember there was an "open-source" Mac Trojan horse a while back. The source code was innocent, but they distributed a malicious binary with it that people trusted and ran.
 
Last edited:

sudo1996

Suspended
Aug 21, 2015
1,496
1,182
Berkeley, CA, USA
No. The best locks are the ones where everyone knows how they work and can take them apart and see the insides.

Think about what would happen if this were not the case: if a lock depended on some secret design trick, you'd never really know whether the secret had leaked out or some smart person had guessed it. But if there is NO SECRET, then there can never be a leak.

So Apple and others place all the design details out in the open for all to see.
I think that in the short term, "security through obscurity" actually works best. It's a huge PITA to find flaws in closed-source code, so it discourages hackers and/or slows their progress. That said, I would NOT vouch for it, because it's a horrible long-term approach to a secure system and doesn't contribute well to the computer science community (partially a "greater good" argument, I know).
 
Last edited:

ArtOfWarfare

macrumors G3
Nov 26, 2007
9,561
6,059
No. The best locks are the ones where everyone knows how they work and can take them apart and see the insides.

Think about what would happen if this were not the case: if a lock depended on some secret design trick, you'd never really know whether the secret had leaked out or some smart person had guessed it. But if there is NO SECRET, then there can never be a leak.

So Apple and others place all the design details out in the open for all to see.

I think you misunderstand me. A whitehat looking at this will discover a vulnerability, tell Apple about it, give Apple 90 days to patch it, and then talk about it publicly.

A blackhat looking at it will discover the vulnerability and simply exploit it.

A whitehat is actually useful for getting the vulnerability patched. A blackhat's discovery only comes to light, perhaps, as a side effect of their criminal actions, and only if the breach is even detected.

I know that open is better than closed. I'm just saying: hopefully any vulnerabilities surfaced by this code moving from closed to open get found by whitehats before blackhats.
 

ChrisA

macrumors G5
Jan 5, 2006
12,581
1,695
Redondo Beach, California
Am I missing something, or is there nothing reassuring about them making this open source? Just because Apple shows us some source code and says it's the source of their crypto libraries doesn't mean we know that code is what was actually compiled into the shipping libraries.

What OS X and iOS actually run could be anything, unless I'm missing something. I remember there was an "open-source" Mac Trojan horse a while back. The source code was innocent, but they distributed a malicious binary with it that people trusted and ran.

In theory you could verify that the published source code is in fact what's running: compile the source to a binary and compare it with the shipping binaries. It would take some effort, but it certainly could be done.

You could also disassemble the shipping crypto library (convert the binary back to assembly), look at what the code is doing, and verify that it does about the same job as the C source.

Both methods are labor intensive, because you don't get a byte-for-byte match even when everything is as it should be. Even so, I bet someone has already looked, and we'd hear quickly if we were being tricked.
 

ChrisA

macrumors G5
Jan 5, 2006
12,581
1,695
Redondo Beach, California
What could go wrong?

:apple:

Not much. Even if the code were kept closed, people can look at the running binary code. It is not that hard to read. In the old days I used to write COBOL for IBM mainframes, and it was common to get a "core dump": a printed hexadecimal dump of the computer's RAM. We'd plow through it with a pencil and figure out what went wrong; we didn't have debuggers. Anyone can still do this if they take the time. The Intel instruction set is more complex than the IBM 360's was, but not by much.

Certainly the average user can't read a hex dump of a closed-source crypto library, but many people can. So closing the source just makes it harder to read, not impossible.

That said, there are systems where the executable code itself is encrypted. Those are completely unreadable, and I'd worry about backdoors and whatever else in there.
 