
Telp

Original poster
Feb 6, 2007
What is the difference between gigabytes and gibibytes, and when is each used? Thanks.
 
Here is a pretty thorough article.

In short, gigabytes are base 10, so a gigabyte is 1,000,000,000 bytes.

Computers handle binary numbers better, so they use base 2. A gibibyte is a close equivalent to a gigabyte but is actually equal to 1,073,741,824 bytes.

Computer hard drive manufacturers measure their drives in gigabytes (base 10). Computers measure their drives in gibibytes but mistakenly call them gigabytes, which results in an apparently smaller number. (This is partly because the term "gibibyte" didn't really exist until fairly recently.)
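Here's a rough Python sketch of that mismatch (the 500 GB drive is just a made-up example): the same number of bytes looks smaller once it's divided by 2^30 instead of 10^9.

```python
GB = 10**9    # gigabyte: decimal, used by drive manufacturers
GiB = 2**30   # gibibyte: binary, used by most operating systems

advertised_bytes = 500 * GB           # a hypothetical "500 GB" drive
reported_gib = advertised_bytes / GiB

print(f"Advertised: 500 GB ({advertised_bytes:,} bytes)")
print(f"Reported by the OS: {reported_gib:.1f} GiB (often labeled 'GB')")
# Advertised: 500 GB (500,000,000,000 bytes)
# Reported by the OS: 465.7 GiB (often labeled 'GB')
```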
 

Hey thanks for that, that is very interesting. Why are they measured differently though?
 
Holy Cow

I have an engineering degree and had never heard the term gibibyte!

When did this term come about, exactly? I knew about the gigabyte difference between actual size and what manufacturers report. Nobody can get a hold of the 1024-versus-1000 size thing. It starts with bits, goes to kilobits, and then bytes and megabytes.

HD sizes have always been smaller than reported.
 
wow never knew!

Amazing what you learn on here. I always just thought they were both gigabytes but were calculated differently; didn't know the binary calculation had a different name! :eek:
 

Yeah, I just learned about it today in an earlier post.
 
If only I could refrain from giggling every time I try to say mebibyte or gibibyte out loud. :p The terms need to be adopted by the masses, I think, but they're too funny-sounding.
 
The dictionary definition of gigabyte is 2^30 as well as 10^9. It won't be the first word, or the last, to have more than one definition. "Gibibyte" will never catch on because it sounds really really incredibly stupid. Even stupider than "gigabyte". ;)

Edit: ha, just saw apfhex's post. Whatever dweeb felt the (completely unnecessary) need to come up with different terms for megabyte etc. would have done better without such a tin ear....

--Eric
 
Hey thanks for that, that is very interesting. Why are they measured differently though?

Speed of computation.

At the most fundamental level, your computer is a huge pile of switches that can either be on (1) or off (0). Thus, all numbers (whole and fractional) are represented in a Base 2 system.

As you should recall from grade school, multiplying and dividing in our normal Base 10 system is very easy to do when one of the operands is a power of 10. You just shift the decimal point and add zeros when needed. So, if you multiply a number by 100, you shift the decimal point two spaces to the right and add two zeros. If you divide a number by 1000, you just shift the decimal three spaces to the left.

This correlates directly to multiplying and dividing by powers of 2 in a binary system (Base 2). In a binary system, if you multiply a number by 8 (which is 2 raised to the power of 3), you shift the bits left by three places, adding zeros where needed. If you divide a number by 32 (which is 2 raised to the power of 5), you shift the bits right by five places.
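To make that concrete, here's a small Python sketch (the value 37 is arbitrary) showing that multiplying or dividing by a power of 2 is the same as shifting bits:

```python
x = 37

assert x * 8 == x << 3      # multiplying by 2**3 == shifting left 3 bits
assert x // 32 == x >> 5    # dividing by 2**5  == shifting right 5 bits

print(bin(x), bin(x << 3), bin(x >> 5))
# 0b100101 0b100101000 0b1
```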

Shifting bits around is very fast; you can do it in very few clock cycles compared to a multiply or divide. In the interest of performance, it's preferable to work in powers of 2 rather than powers of 10.

So... decades ago, operating systems, for the sake of speed, would represent 1000 bytes as the nearest power of 2: 1024 bytes. There were a few exceptions that would perform the calculation properly and divide by powers of ten. This was inefficient, but it was accurate and kept to SI-prefix standards. In fact, some very early versions of the Macintosh System Software and Finder would perform the calculation by dividing by powers of ten (i.e., 1 kilobyte was 1000 bytes, not 1024 bytes).
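A tiny sketch of the two conventions described above (the 3,000,000-byte file size is just an invented example):

```python
size_bytes = 3_000_000

print(size_bytes >> 10, "KiB")    # binary: shift right 10 bits (divide by 1024) -> 2929
print(size_bytes // 1000, "kB")   # decimal: divide by 1000 (SI prefix)          -> 3000
```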
 
Put me in the camp that considers "gibibyte" to be a silly term. No one seriously uses it, do they? Giga-, mega-, and so on have always had special meanings when referring to base-2 computer values. When I first learned about computers, a kilobyte was always 1024 bytes. A megabyte was always 1024 x 1024 bytes.

Hey thanks for that, that is very interesting. Why are they measured differently though?

I thought it was because the hard drive manufacturers realized they could make a 72 MB hard disk and put "80 MB" on the label to make it look better than it really was, and still avoid fraudulent advertising charges.
 