Discussion in 'Mac Basics and Help' started by Telp, Apr 7, 2007.
Can anyone tell me the difference between 1 gigabit and 1 megabyte?
1 gigabyte is 1,000 megabytes.
Is that what you wanted?
No, I meant a gigabit.
Check this out.
A gigabit works out to 125 megabytes, according to some quick calculating.
(1,000,000,000 / 8) / 1,000,000 = 125
1,000,000,000 bits in a gigabit. 8 bits in a byte. ~1,000,000 bytes in a megabyte.
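For anyone who wants to check the math themselves, here's a quick Python sketch of the conversion (using decimal/SI units, same as the calculation above):

```python
# Convert gigabits to megabytes using decimal (SI) units.
BITS_PER_BYTE = 8

def gigabits_to_megabytes(gigabits):
    bits = gigabits * 1_000_000_000   # 1 gigabit = 1,000,000,000 bits
    total_bytes = bits / BITS_PER_BYTE
    return total_bytes / 1_000_000    # 1 megabyte = 1,000,000 bytes

print(gigabits_to_megabytes(1))  # 125.0
```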
A bit is a 1 or 0 (on or off). Eight bits make up a byte. In ASCII, each character is represented by one byte, i.e. a pattern of eight 1s and 0s.
So the word "jello" would be made up of 40 bits.
EDIT: And here is "jello" in binary: 0110101001100101011011000110110001101111
So, sending "jello" over gigabit ethernet would take 0.00000004 seconds under ideal conditions.
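You can reproduce both the binary string and the transfer time in a couple of lines of Python (this ignores Ethernet framing and other overhead, hence "ideal conditions"):

```python
# Spell out "jello" as ASCII bits, then time its transfer at 1 Gb/s.
word = "jello"
bits = "".join(format(ord(c), "08b") for c in word)  # 8 bits per character
print(bits)       # 0110101001100101011011000110110001101111
print(len(bits))  # 40

# 40 bits over a 1,000,000,000 bit/s link, ignoring all overhead:
print(len(bits) / 1_000_000_000)  # 4e-08 seconds
```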
Oh, sorry, poor reading on my part.
There are 8 bits in one byte, so 1 gigabit is equal to 125 megabytes.
EDIT: ...And dpaanlka, I saw that.
Here is a fun table.
Thanks, does that mean that 1.5 gigabits/s is equal to 187.5MB/s?
Yes, it does.
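Same arithmetic as before, if you want to verify the 1.5 gigabit/s figure (again in decimal/SI units):

```python
# 1.5 gigabits per second expressed in megabytes per second.
rate_bits = 1.5 * 1_000_000_000       # bits/s
rate_megabytes = rate_bits / 8 / 1_000_000
print(rate_megabytes)  # 187.5
```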
Wow, that was the most technical/educational thing that I have read all weekend.