Mac OS X uses encrypted files


MacBytes
Nov 10, 2006, 07:28 PM

Category: Mac OS X
Link: Mac OS X uses encrypted files (http://www.macbytes.com/link.php?sid=20061110202816)
Description: none

Posted on MacBytes.com (http://www.macbytes.com)
Approved by Mudbug

mkrishnan
Nov 10, 2006, 07:29 PM
Interesting!

SMM
Nov 10, 2006, 07:34 PM
So what? Apple is trying to protect itself? There WOULD be a story if it were the opposite.

bluebomberman
Nov 10, 2006, 07:40 PM
But it's a decidedly different approach from Windows' antipiracy measures. I think that deserves to be noted.

ChrisA
Nov 10, 2006, 07:52 PM
The author of this article failed to look up "TPM" (Trusted Platform Module) on Wikipedia.
Had he done so, he would not have made the technical errors he did. Intel Macs have a TPM chip on the motherboard; the chip is used to decrypt the binary. This has nothing to do with stopping people from giving away copies of the software. Read the wiki.

bousozoku
Nov 10, 2006, 08:39 PM
The author of this article failed to look up "TPM" (Trusted Platform Module) on Wikipedia.
Had he done so, he would not have made the technical errors he did. Intel Macs have a TPM chip on the motherboard; the chip is used to decrypt the binary. This has nothing to do with stopping people from giving away copies of the software. Read the wiki.

Everywhere else I've read, the TPM is not being used. It is available though.

The technique of encrypting operating system modules is also not that unusual, but in this case it assures that the Intel version of Mac OS X is running on a machine for which it was meant.

Don't put absolute faith in Wikipedia documents--they're written by people.

matticus008
Nov 10, 2006, 09:04 PM
The technique of encrypting operating system modules is also not that unusual, but in this case it assures that the Intel version of Mac OS X is running on a machine for which it was meant.
To take that one step further for the obvious-impaired, this is a system that protects against unauthorized modification of system files and creates an added layer of security.

It does not preclude anyone from replacing them if they choose to do so, however. You could disable the encryption check and replace modules with unencrypted ones if you were so inclined, or if you needed to run a customized version of an encrypted system file.

In other words, it has a legitimate purpose.

Nermal
Nov 10, 2006, 09:24 PM
The author of this article failed to look up "TPM" (Trusted Platform Module) on Wikipedia.
Had he done so, he would not have made the technical errors he did. Intel Macs have a TPM chip on the motherboard; the chip is used to decrypt the binary. This has nothing to do with stopping people from giving away copies of the software. Read the wiki.

TPM is not used; the new MBPs don't even have a TPM chip. In fact, the original article (which I've lost the link to) mentions this.

Analog Kid
Nov 10, 2006, 09:39 PM
TPM is not used; the new MBPs don't even have a TPM chip. In fact, the original article (which I've lost the link to) mentions this.
These links will do:
MacDev (http://www.oreillynet.com/mac/blog/2006/11/apple_and_trusted_computing_or_1.html?CMP=OTC-13IV03560550&ATT=Apple+and+trusted+computing+or+not)
OS X Book, Link 1 (http://www.osxbook.com/book/bonus/chapter10/tpm/)
OS X Book, Link 2 (http://www.osxbook.com/book/bonus/chapter7/binaryprotection/)

I tend to put some faith in this source considering he wrote a driver to access the TPM. Apple isn't using the TPM and at least some Intel machines don't even have one.

It's amazing how widespread the myth is that Apple is locking their OS with the TPM-- probably because so many people say "you shouldn't trust Wikipedia" but do anyway...

It does raise the question, though, of where Apple keeps their keys... Is that what the limerick is used for? He refers to that as integrity data, though... I can't think of anywhere they could hide a key that couldn't be easily extracted, so you may as well use a copyrighted limerick as the key.

Nermal
Nov 10, 2006, 09:45 PM
OS X Book, Link 2 (http://www.osxbook.com/book/bonus/chapter7/binaryprotection/)

That's the one I was thinking of :)

shadowfax
Nov 11, 2006, 03:59 AM
That article (OS X Book, Link 2) makes me curious as to what (if any) performance hit the executables take, since that module that acts as a vnode for the "encrypted" segments seems to "decrypt" those segments from RAM every time the processor requests them. I suppose this would depend on the size of the segments.

I also wonder if those segments are ever stored unencrypted in L1/L2 cache, and if there's any way to look at that and extrapolate decrypted versions of the segments, replace them, and remove that flag from the executable.

The other bit I don't understand is whether this has anything to do with keeping OS X off my Dell (no, I do not actually own one). I mean, it all sounds like it isn't looking at any specific hardware in the machine. It really sounds like it's more to make these binaries a little harder to reverse engineer, which I really don't think anybody cares too much about (although figuring out how to break such protection would be interesting, no doubt).
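
Out of curiosity I sketched what checking for those flagged segments might look like. This is only my illustration, not anything from the article: it keys off the SG_PROTECTED_VERSION_1 segment flag from <mach-o/loader.h> that Singh's write-up describes, and it only handles thin 32-bit Mach-O files (no fat binaries):

#include <stdio.h>
#include <stdint.h>
#include <mach-o/loader.h>

/* Scan a thin 32-bit Mach-O file for segments carrying the
   SG_PROTECTED_VERSION_1 flag (the "protected" segments). */
int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s /path/to/binary\n", argv[0]);
        return 1;
    }

    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    struct mach_header mh;
    if (fread(&mh, sizeof(mh), 1, f) != 1 || mh.magic != MH_MAGIC) {
        fprintf(stderr, "not a thin 32-bit Mach-O file\n");
        return 1;
    }

    /* Walk the load commands that follow the header. */
    for (uint32_t i = 0; i < mh.ncmds; i++) {
        long pos = ftell(f);
        struct load_command lc;
        if (fread(&lc, sizeof(lc), 1, f) != 1) break;

        if (lc.cmd == LC_SEGMENT) {
            struct segment_command sc;
            fseek(f, pos, SEEK_SET);
            if (fread(&sc, sizeof(sc), 1, f) != 1) break;
            if (sc.flags & SG_PROTECTED_VERSION_1)
                printf("protected segment: %.16s\n", sc.segname);
        }
        fseek(f, pos + (long)lc.cmdsize, SEEK_SET); /* next load command */
    }

    fclose(f);
    return 0;
}

Per Singh's list, Dock and Finder are supposedly among the protected binaries, so pointing this at one of them on an Intel machine should turn up a flagged segment.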

matticus008
Nov 11, 2006, 04:52 AM
That article (OS X Book, Link 2) makes me curious as to what (if any) performance hit the executables take, since that module that acts as a vnode for the "encrypted" segments seems to "decrypt" those segments from RAM every time the processor requests them. I suppose this would depend on the size of the segments.
It's my understanding that the files are decrypted at boot time, and once decrypted, remain in RAM until the next boot cycle--there's no performance hit beyond the slight additional time needed to decrypt the files during the boot process. They're not stored in RAM in their encrypted form, in other words. Granted, it has been quite some time since I paid close attention to this issue.

bousozoku
Nov 11, 2006, 05:49 AM
It's my understanding that the files are decrypted at boot time, and once decrypted, remain in RAM until the next boot cycle--there's no performance hit beyond the slight additional time needed to decrypt the files during the boot process. They're not stored in RAM in their encrypted form, in other words. Granted, it has been quite some time since I paid close attention to this issue.

That's not as stringent as OpenBSD, where RAM is checked to make sure that the code hasn't been modified.

Analog Kid
Nov 11, 2006, 06:59 AM
The other bit I don't understand is whether this has anything to do with keeping OS X off my Dell (no, I do not actually own one). I mean, it all sounds like it isn't looking at any specific hardware in the machine. It really sounds like it's more to make these binaries a little harder to reverse engineer, which I really don't think anybody cares too much about (although figuring out how to break such protection would be interesting, no doubt).
That's the part I'm not getting either, and the articles seem to skip over that part. Where does Apple keep the keys? It seems like, without a TPM type interface, the keys can always be found. From what I'm seeing so far, the encryption doesn't actually prevent someone from hacking it onto PC hardware, just makes it very clear that what the hacker is doing is illegal.

I guess that fits the evidence...
It's my understanding that the files are decrypted at boot time, and once decrypted, remain in RAM until the next boot cycle--there's no performance hit beyond the slight additional time needed to decrypt the files during the boot process. They're not stored in RAM in their encrypted form, in other words. Granted, it has been quite some time since I paid close attention to this issue.
The way I read it, it's decrypted into virtual memory and there's no method for paging it out. I think that means that if it were paged out, it would be thrown away and would need to be decrypted again-- but since these are high use modules, they're unlikely to be paged out at all.
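
The user-space version of "no method for paging it out" would be wiring the pages down. Here's a trivial sketch of the concept with mlock(2) -- my own illustration, not Apple's actual mechanism:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <sys/mman.h>

int main(void)
{
    size_t len = (size_t)getpagesize();
    void *buf = malloc(len);        /* pretend this holds decrypted code */
    if (!buf) return 1;
    memset(buf, 0x90, len);         /* stand-in for decrypted instructions */

    /* Wire the pages: the VM system won't write them to the swap file. */
    if (mlock(buf, len) != 0) {
        perror("mlock");
        free(buf);
        return 1;
    }
    printf("%lu bytes wired; they can't be paged out to disk\n",
           (unsigned long)len);

    munlock(buf, len);
    free(buf);
    return 0;
}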

59031
Nov 11, 2006, 01:05 PM
Don't put absolute faith in Wikipedia documents--they're written by people.

And so is the Encyclopedia Britannica....

From what I'm seeing so far, the encryption doesn't actually prevent someone from hacking it onto PC hardware, just makes it very clear that what the hacker is doing is illegal.

And very difficult....

bousozoku
Nov 11, 2006, 02:43 PM
And so is the Encyclopedia Britannica....
...

They have to correct mistakes, as well.

Have you heard of revisionist history? Simply because something comes from a certain source does not make it the truth, whether people believe it or not.

mkrishnan
Nov 11, 2006, 02:51 PM
That's not as stringent as OpenBSD, where RAM is checked to make sure that the code hasn't been modified.

Is that an anti-infection feature more than an anti-piracy one? It sounds like an excellent idea to use some kind of check codes for kernel elements in memory....

Is there any way Apple can adapt Plays For Sure to protect its OS? Just to spite MS, now that it's done with it? :D That way we'll be sure that at least the Zune is an MS device that will never run OS X. :D

bousozoku
Nov 11, 2006, 03:37 PM
Is that an anti-infection feature more than an anti-piracy one? It sounds like an excellent idea to use some kind of check codes for kernel elements in memory....

Is there any way Apple can adapt Plays For Sure to protect its OS? Just to spite MS, now that it's done with it? :D That way we'll be sure that at least the Zune is an MS device that will never run OS X. :D

Yes, they're making sure that the operating system is not compromised. They check and re-check the code before running it. I've not seen anything about the runtime latency. I would assume that the virtual memory would have to be very fast for the system to not appear sluggish.

I'm not sure that the Plays for Sure programme is done. It's just that Microsoft doesn't need to proclaim it for their player. Everyone else must comply. :D

I'm sure that the Zune will run Linux, just to see that it can be done, but of course, Linux probably already runs on the Toshiba Gigabeat, which the Zune is based on. Does that mean that MS has to go to greater lengths to make sure their device isn't compromised?

bobber205
Nov 11, 2006, 07:59 PM
I hope this messes up the dorks at my college who are trying to pirate OS X. They asked me for my copy so they could copy it.

Guess what I told them. ;)

mkrishnan
Nov 11, 2006, 08:03 PM
I hope this messes up the dorks at my college who are trying to pirate OS X. They asked me for my copy so they could copy it.

Guess what I told them. ;)

But somehow this logic does not apply to all those people running bootleg OSx86-type computers?

Analog Kid
Nov 11, 2006, 08:09 PM
And very difficult....
I guess that depends on your reference point for difficult.

shadowfax
Nov 11, 2006, 09:12 PM
I guess that depends on your reference point for difficult.

Yeah, I'll say. Some girl at Black Hat rootkitted Windows by malloc'ing memory until it paged the kernel out, then used some I/O magic to overwrite sections of the kernel. That was cool.

If you have root access on OS X, I don't think it'd be that hard to pull the decrypted segments of the executable out of RAM, piece them back together into the original executable, and then remove that LC flag or whatever it was called...
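
Something like this, maybe -- a rough sketch of reading a page out of a running process with the Mach APIs (task_for_pid() and vm_read() are real calls, but the dumper itself is hypothetical: it needs root, skips real error handling, and you'd have to find the address of the decrypted segment yourself):

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <mach/mach.h>
#include <mach/mach_traps.h>

int main(int argc, char *argv[])
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s <pid> <hex-address>\n", argv[0]);
        return 1;
    }
    pid_t pid = (pid_t)atoi(argv[1]);
    vm_address_t addr = (vm_address_t)strtoul(argv[2], NULL, 16);

    mach_port_t task;
    /* Getting another process's task port requires root. */
    if (task_for_pid(mach_task_self(), pid, &task) != KERN_SUCCESS) {
        fprintf(stderr, "task_for_pid failed (are you root?)\n");
        return 1;
    }

    vm_offset_t data;
    mach_msg_type_number_t count;
    /* Read one page of the target's address space -- by the time the
       code is mapped and running, it's already been decrypted. */
    if (vm_read(task, addr, vm_page_size, &data, &count) != KERN_SUCCESS) {
        fprintf(stderr, "vm_read failed\n");
        return 1;
    }

    fwrite((const void *)data, 1, count, stdout); /* dump the raw bytes */
    vm_deallocate(mach_task_self(), data, count);
    return 0;
}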

Seems that in cryptography the chain's only as strong as its weakest link--for example, if you're sending a message with 2048-bit encryption or something, and you decide to send your buddy the key to decrypt it over the same channel, unencrypted... you might as well not have bothered. This isn't exactly like that, of course, but it's still odd. I mean, storing it with encrypted sections on disk is all well and good, but it seems like somewhat of a waste of time--decompiling binaries is a challenging art to begin with, especially if the source code has been obfuscated. It seems like most 'hackers' who could decompile an executable into something they could modify or otherwise steal could get the bits out of RAM. I'm just rambling now, but yeah.

I really don't see why this is getting published. This really sounds like an obfuscation scheme. "Packing" executables has been around for years, as a way to save space as well as to keep nosy eyes out. The trade-off with packing is that you often get a (usually minor) performance hit. It seems like Apple has a sort-of solution to that by only encrypting parts of the code. Who cares? Everyone who's not doing open source obfuscates their stuff somehow... why not put that in the news? Seems like it might help whoever thought this was news....

Analog Kid
Nov 12, 2006, 03:45 AM
Yeah, I'll say. Some girl at Black Hat rootkitted Windows by malloc'ing memory until it paged the kernel out, then used some I/O magic to overwrite sections of the kernel. That was cool.

Maybe it's the engineer in me, but I agree that is pretty freakin' cool. And people wonder why it's so hard to secure a system...

Some poor MS engineer is seeing that and asking "some nut case gobbled up the entire system memory, pushed my kernel code to disk, tweaked it to root the system, and you're saying that's my fault?!".

Yeah, I know, proper permissions on the swap file probably would have prevented it, but you know that girl was awfully darned proud of herself when that one worked.

shadowfax
Nov 12, 2006, 04:04 AM
Maybe it's the engineer in me, but I agree that is pretty freakin' cool. And people wonder why it's so hard to secure a system...

Some poor MS engineer is seeing that and asking "some nut case gobbled up the entire system memory, pushed my kernel code to disk, tweaked it to root the system, and you're saying that's my fault?!".

Yeah, I know, proper permissions on the swap file probably would have prevented it, but you know that girl was awfully darned proud of herself when that one worked.
I think Microsoft has gone, or will go, the safer-than-that route and modify the OS to require that the kernel never be paged out, no matter how much any other program mallocs--this seems both more secure and better for performance. I can't remember whether Microsoft did this, but that's what the woman recommended when she demonstrated her exploit.

shamino
Nov 12, 2006, 01:07 PM
Everywhere else I've read, the TPM is not being used. It is available though.
Actually, a few articles I've read say that the newest Macs (the Core 2 Duo boxes) don't even have TPM chips.

Mac OS X does not use the TPM for anything. This has been demonstrated by several researchers.

shamino
Nov 12, 2006, 01:11 PM
That's the part I'm not getting either, and the articles seem to skip over that part. Where does Apple keep the keys? It seems like, without a TPM type interface, the keys can always be found. From what I'm seeing so far, the encryption doesn't actually prevent someone from hacking it onto PC hardware, just makes it very clear that what the hacker is doing is illegal.
This kind of encryption does nothing to prevent you from running the OS on third-party hardware. As you pointed out, the decryption keys are part of that same OS.

What it prevents is third parties from easily disassembling and reverse-engineering key non-open-source parts of the OS (which might include the code used to determine if the hardware is made by Apple or not.)

MikeTheC
Nov 12, 2006, 08:33 PM
From what I'm seeing so far, the encryption doesn't actually prevent someone from hacking it onto PC hardware, just makes it very clear that what the hacker is doing is illegal.

Ah, but is it illegal? That, to me, is the million-dollar question.

Illegal to me means the actions undertaken ultimately lead to a criminal trial, or more to the point, State vs. Doe.

However, if it's just simply that you've violated the terms of the Mac OS X license, then that's not really a criminal offense. It would really only lead to a civil action, i.e. Apple vs. Doe.

Of course, DMCA exists to make fools of us all, so who knows.

matticus008
Nov 12, 2006, 08:44 PM
However, if it's just simply that you've violated the terms of the Mac OS X license, then that's not really a criminal offense. It would really only lead to a civil action, i.e. Apple vs. Doe.
Breaking encryption to gain access to unauthorized data is a criminal offense, as is unauthorized distribution--so the hacker would only be able to get away with it if he kept it to himself and no one ever found out he'd done it. The sticking point is whether Apple, philosophically, should be able to protect data in this way. Legally, for the moment, the answer is "yes, they can."

Of course, DMCA exists to make fools of us all, so who knows.
True, but breaking encryption was illegal before the DMCA, so in this case the DMCA's just another nail in the coffin if someone goes to trial.

Analog Kid
Nov 13, 2006, 01:54 AM
Ah, but is it illegal? That, to me, is the million-dollar question.

Illegal to me means the actions undertaken ultimately lead to a criminal trial, or more to the point, State vs. Doe.

However, if it's just simply that you've violated the terms of the Mac OS X license, then that's not really a criminal offense. It would really only lead to a civil action, i.e. Apple vs. Doe.

Of course, DMCA exists to make fools of us all, so who knows.
I'll refrain from asking what kind of business you're in that this is worth a million dollars to you, but civil vs. criminal is moot. "Illegal" means "against the law", of which we have many, both civil and criminal-- thus the distinguishing feature of an illegal act isn't who is prosecuting, but simply whether it can lead to you standing in front of some guy in a black robe saying "guilty!".

I find it interesting that the "integrity data" is in the form of a limerick. That may just be cute, or it may be an attempt to pull in additional copyright protections.

ogee
Nov 13, 2006, 05:26 AM
The legality of this probably depends on your location as well. In Germany you are allowed to "crack" protection to make a backup for personal use. Distribution is illegal.

shamino
Nov 13, 2006, 10:00 AM
True, but breaking encryption was illegal before the DMCA, so in this case the DMCA's just another nail in the coffin if someone goes to trial.
Actually, it wasn't.

Breaking encryption for the purpose of gaining unauthorized access to information has been illegal. Just like gaining unauthorized access without breaking encryption is illegal.

But breaking encryption to gain access to something that you are authorized to access (like playing a DVD you paid for using an open source player program) was perfectly legal before the DMCA came along.

matticus008
Nov 13, 2006, 12:42 PM
But breaking encryption to gain access to something that you are authorized to access (like playing a DVD you paid for using an open source player program) was perfectly legal before the DMCA came along.
And breaking the encryption on these files would be breaking encryption to access unauthorized data. There is no situation in which you would have to break this encryption in order to make use of the OS. The software is not open source. To argue otherwise would be analogous to saying you have a right to break into Visa's secure database because you're a Visa customer.

DVD encryption and the encryption of certain system files are not the same, because here there is absolutely no condition in which the encryption interferes with usage. CSS interfered with valid playback under Linux (though the appropriate solution would have been to allow CSS-compatible players).

shamino
Nov 14, 2006, 08:13 AM
And breaking the encryption on these files would be breaking encryption to access unauthorized data. There is no situation in which you would have to break this encryption in order to make use of the OS. The software is not open source. To argue otherwise would be analogous to saying you have a right to break into Visa's secure database because you're a Visa customer.
No.

Of course you have a right to the software on your own computer. You bought the computer, you bought the software, you have the right to run the software, and you are running the software. Unless you plan on redistributing the software, the closed-source nature of the code is irrelevant.

Your Visa analogy would be to claim that you have the right to hack into Apple's corporate source code server. Nobody is claiming anything of the sort.

If an auto manufacturer welds the hood shut on a car, would you say that you (a customer and owner) have no right to open it up and look inside?

bousozoku
Nov 14, 2006, 01:27 PM
No.

Of course you have a right to the software on your own computer. You bought the computer, you bought the software, you have the right to run the software, and you are running the software. Unless you plan on redistributing the software, the closed-source nature of the code is irrelevant.

Your Visa analogy would be to claim that you have the right to hack into Apple's corporate source code server. Nobody is claiming anything of the sort.

If an auto manufacturer welds the hood shut on a car, would you say that you (a customer and owner) have no right to open it up and look inside?

Auto manufacturers do better than that. They use special tools so that you can open the hood but can't really do anything without a tool that fits the connection or space. That's a bit like encrypting code modules, isn't it?

shamino
Nov 14, 2006, 01:38 PM
Auto manufacturers do better than that. They use special tools so that you can open the hood but can't really do anything without a tool that fits the connection or space. That's a bit like encrypting code modules, isn't it?
You mean like the PCV valve in my 2002 Chevy Prizm? It easily screws in and out using a 1.75" socket. But you can't get a socket that large at any normal hardware store. So mere mortals have to use an adjustable wrench, which makes a 2-minute procedure end up taking 30 minutes. (But it's still worth it to avoid spending $60 to replace a $7 part.)

jhu
Nov 14, 2006, 02:04 PM
Auto manufacturers do better than that. They use special tools so that you can open the hood but can't really do anything without a tool that fits the connection or space. That's a bit like encrypting code modules, isn't it?

They also make it very difficult to access your car's on-board computer.

matticus008
Nov 14, 2006, 03:58 PM
No.

Of course you have a right to the software on your own computer. You bought the computer, you bought the software, you have the right to run the software, and you are running the software. Unless you plan on redistributing the software, the closed-source nature of the code is irrelevant.
Where, exactly, does the encryption play into any of this? It does absolutely nothing to impair your use of the software.

Your Visa analogy would be to claim that you have the right to hack into Apple's corporate source code server. Nobody is claiming anything of the sort.
There's no distinction with software. If you are shipped an encrypted binary, it's in the same form as the compiled version stored at Apple. The source code doesn't matter here, because breaking the encryption doesn't give you access to the source. We're only talking about compiled modules.

If an auto manufacturer welds the hood shut on a car, would you say that you (a customer and owner) have no right to open it up and look inside?
No, but they're not welding the hood shut. They're black-boxing certain components. Car manufacturers already do this--the computer, the ABS module, the ignition packs, and on my Audi, the transmission itself are all sealed units. Most of the OS is right there in the open, and none of what's encrypted is user-serviceable. If you want to build your own module from source available online, that's fine, and you can do that easily with this OS and replace the encrypted one.

This is all beside the point, however: data which is encrypted, regardless of the content, is protected by law. It could be a recipe for chocolate chip cookies for all it matters; the owner of the data, if s/he chooses to encrypt it, is permitted by law to do so. Buying some of the cookies doesn't give you a right to the recipe. Since you do not own the OS or the data contained in the binaries, you do not have authorization under the law to break the encryption. That's really as simple as it gets.

The encryption in no way impairs or alters your legal use of the software, and as such you have no plausible argument for being granted access to it.

yellow
Nov 14, 2006, 04:02 PM
Bad link... Here is an updated one:

http://www.macworld.co.uk/macsoftware/news/index.cfm?newsid=16400

And for those who might be interested, this article is a rip-off of Amit Singh's blog post from October 22nd:

http://www.osxbook.com/blog/2006/10/22/understanding-apples-binary-protection-in-mac-os-x/

shamino
Nov 15, 2006, 08:25 AM
No, but they're not welding the hood shut. They're black-boxing certain components. Car manufacturers already do this--the computer, the ABS module, the ignition packs, and on my Audi, the transmission itself are all sealed units.
And there is nothing illegal about cracking open the box to learn how it works.
This is all beside the point, however: data which is encrypted, regardless of the content, is protected by law.
*sigh*

You've lost the focus of the thread and are now just repeating your assertions with no new content, so this will be my final reply on the matter.

This activity is illegal only because of the DMCA. Nobody here is disagreeing with the fact that DMCA makes all acts of encryption-breaking illegal.

I am arguing with your assertion that it was illegal before the DMCA became law - which is simply untrue.

Unless you can point to a pre-DMCA law that makes it illegal to break decryption on a product/device that you have paid for, your argument is nothing more than your personal opinion.

matticus008
Nov 15, 2006, 02:42 PM
And there is nothing illegal about cracking open the box to learn how it works.
There is when it's not your box to open. The OS is not something you own outright; the idea that you do is your opinion--one which current law does not support. There is no current law that identifies a software purchase as granting ownership rights to the data itself. In fact, the rulings are just the opposite--software is the property of its publisher, and the purchaser has ownership rights to some subset of that (current law is ambiguous, depending on context, as to whether it's a copy or a license, as software does not fit perfectly into existing language).

This activity is illegal only because of the DMCA. Nobody here is disagreeing with the fact that DMCA makes all acts of encryption-breaking illegal.
No, the DMCA is redundant in this regard. Breaking encryption was illegal before the DMCA--just as cracking passwords and pirating (in the true sense) copyrighted music and film were. It can be (and has been) defended under trade secret law and copyright law, as well as communications law. Depending on the infraction, it can also be a matter of criminal law if the data is personal or financial in nature.

Unless you can point to a pre-DMCA law that makes it illegal to break decryption on a product/device that you have paid for
Payment != ownership. Ownership gives you the right to break encryption or security countermeasures on your own property. See Copyright Act of 1976 (as well as 17 USC 101-810) re: securing property. Also Electronic Communications Privacy Act and 18 USC 2707. I can keep going, but I really don't need to.

Cases upholding owner's right to encrypt:
DVD Copy Control Ass'n, Inc. v. Bunner
Blanchard v. DirecTV, Inc.
Freeman v. DirecTV
Bernstein v. US Dept. of Commerce
Schlafly v. PKP
...

Further, it has been upheld in criminal cases that breaking encryption is the electronic equivalent of breaking and entering. Given that the property owner, Apple, has not granted the customer/purchaser access to the secured data, the customer/purchaser has no legal right to bypass the encryption. Apple maintains that it is the sole owner of the content of the encrypted data, and until a court rules that a company may not protect its data through encryption, or that a purchaser owns the data, breaking it remains illegal.