
mrgreen4242

macrumors 601
Original poster
Feb 10, 2004
OK, I know that on paper gigabit Ethernet is faster than either flavor of FireWire, especially FW400, but USB 2.0 outpaces FW400 on paper too, and we know that's not the reality.

So, if I were going to build an external RAID drive that had a hardware controller and presented itself as a single storage device to whatever computer connected to it, would I get better performance using a FW400 port or a gigabit Ethernet connection? This assumes the Ethernet connection runs directly from the RAID device to the computer, with no other machines on the "network".

Thanks for any input on this subject!
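
For reference, the on-paper numbers being compared, converted to MB/s (a quick Python sketch using nominal rates only; real-world overhead is exactly the question here):

```python
# Nominal "on paper" bit rates, converted to megabytes per second (8 bits = 1 byte).
for name, mbit in [("USB 2.0", 480), ("FireWire 400", 400), ("Gigabit Ethernet", 1000)]:
    print(name, mbit / 8, "MB/s theoretical")
# -> USB 2.0: 60, FW400: 50, gigabit Ethernet: 125 MB/s, before any overhead.
```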
 
So, if I were going to build an external RAID drive that had a hardware controller and presented itself as a single storage device to whatever computer connected to it, would I get better performance using a FW400 port or a gigabit Ethernet connection? This assumes the Ethernet connection runs directly from the RAID device to the computer, with no other machines on the "network".

FireWire, methinks. FireWire directly controls data reading and writing, and is optimized for serial transmission of bulk data with guaranteed delivery.
Any server on Ethernet would have to go through the mediation of SMB and another OS before anything gets written, and Ethernet doesn't have that delivery guarantee.

<wanders away to check references>

http://ask.slashdot.org/article.pl?sid=02/01/23/166210&mode=thread&tid=137


From Tom's Hardware
"FireWire Vs. Gigabit: Advantage For Home Use?

The new high-speed variation of IEEE1394 is popularly called FireWire-800 or FireWire-b. This is not quite correct as IEEE 1394b has not yet been accorded a trivial appellation. Officially, the standard is currently simply referred to as IEEE 1394b. Besides new connectors - from the front these look like a larger version of the i.Link plug (about three times as big) - new drivers ensure a performance leap. Maximum gross throughput rates in an ideal environment are around 100 MByte/s (800 MBit/s). In practice, average throughput in measurements of connections from the PC to external devices such as hard drives or DVD drives usually only reaches half of that (50 MByte/s) and direct connections between two computers only a quarter (25 MByte/s) of the maximum data rate of 800 MBit/s = 100 MByte/s. Despite Ethernet's collision management (CSMA/CD), Gigabit-Ethernet networks attain average throughput rates of 70 to 80 MByte/s (effective). Collisions occur when two data packets are sent simultaneously.

As a serial transmission technology, FireWire avoids collisions through intelligent time management - unlike Ethernet - so that jitters are in the pico-second time range, in other words over three decimal places less than with Ethernet. Hence FireWire is better suited to time-critical data transmission such as uncompressed audio and video data. Whether the industry will ever take it on, however, will reveal itself in the fullness of time."

This article suggests that a different protocol (one that presumably addresses the concerns about retries and the timeliness of packet arrival) running over gigabit Ethernet would be suitable:
http://www.prosilica.com/support/why_firewire.htm
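
If you want to see what the gigabit link by itself can actually deliver before SMB/AFP and the disks get involved, a raw TCP test between the two ends of the cable is a reasonable sanity check. A rough Python sketch; the address, port, and transfer size below are made-up assumptions, and this measures only the wire, not the file-sharing protocol or the RAID:

```python
# Rough raw-TCP throughput check over the direct link. Run one copy with the
# argument "recv" on one machine, then run the other copy plain on the second.
# Measures only the network path, so it's an upper bound on what a share could do.
import socket
import sys
import time

HOST = "192.168.1.2"   # hypothetical address of the machine at the other end
PORT = 50007           # arbitrary test port
CHUNK = 1024 * 1024    # send 1 MB at a time
TOTAL = 512 * CHUNK    # 512 MB test transfer

def receive():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("", PORT))
    srv.listen(1)
    conn, _ = srv.accept()
    got = 0
    while True:
        data = conn.recv(CHUNK)
        if not data:
            break
        got += len(data)
    conn.close()
    print("received %d bytes" % got)

def send():
    sock = socket.create_connection((HOST, PORT))
    payload = b"\0" * CHUNK
    start = time.time()
    sent = 0
    while sent < TOTAL:
        sock.sendall(payload)
        sent += CHUNK
    sock.close()
    elapsed = time.time() - start
    print("%.1f MB/s over the wire" % (sent / elapsed / 1e6))

if __name__ == "__main__":
    receive() if sys.argv[1:] == ["recv"] else send()
```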
 
It's possible my math is wrong, but I think FW400 is still faster than Gigabit ethernet (the main thing there being that it's GigaBIT, not GigaBYTE).

I'd post the math I used, but my fragile ego couldn't handle the bruising if I were way off :)
 
Well, technically, gigabit (1000 bits per second) is faster than FireWire 400 (400 megabits per second). Theoretically speaking. However, a networked array has to deal with router overhead, network traffic, etc., while FW400 deals only with its own data transfer. So FW400 may end up being faster in the end. Of course, FW800 would probably nuke them both.
 
Well, technically, gigabit (1000 bits per second) is faster than FireWire 400 (400 megabits per second). Theoretically speaking. However, a networked array has to deal with router overhead, network traffic, etc., while FW400 deals only with its own data transfer. So FW400 may end up being faster in the end. Of course, FW800 would probably nuke them both.

At the risk of looking like an idiot...

Gigabit isn't 1000 bits per second - it's 1000 megabits per second, which is 1,000,000 kilobits per second, which is 1,000,000,000 bits per second :)

1k * 1k * 1k = 1 billion bits per second; divide by 8 (8 bits = 1 byte) and you get 125,000,000 bytes per second, or 125 MBytes per second (minus the overhead you mentioned).

So....someone smarter than I am - is my math way way off?
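
For what it's worth, the same arithmetic in runnable form (plain Python, decimal units, nominal link rates) comes out the same way:

```python
# A gigabit link is 1000 * 1000 * 1000 bits per second, and 8 bits make a byte.
gigabit_bits_per_s = 1000 * 1000 * 1000
print(gigabit_bits_per_s // 8)   # 125000000 -> 125 MBytes per second, before overhead

# FireWire 400 by the same reasoning, at its nominal 400 Mbit/s:
print(400 * 1000 * 1000 // 8)    # 50000000 -> 50 MBytes per second theoretical
```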
 
It's possible my math is wrong, but I think FW400 is still faster than Gigabit ethernet (the main thing there being that it's GigaBIT, not GigaBYTE).

I'd post the math I used, but my fragile ego couldn't handle the bruising if I were way off :)
FW is in bits as well, so the comparison is valid... no fear, though, no one will mock you for trying to help. :D

Well, technically, gigabit (1000 bits per second) is faster than FireWire 400 (400 megabits per second). Theoretically speaking. However, a networked array has to deal with router overhead, network traffic, etc., while FW400 deals only with its own data transfer. So FW400 may end up being faster in the end. Of course, FW800 would probably nuke them both.

Agreed that FW800 is probably the fastest option of the three, but between the slower two I'm not sure. If the network is just the drive and the computer, there isn't a whole lot in the way of collisions or added slowdown from routing, etc. Even if network overhead ate 50% of the available bandwidth, gigabit would still be 25% faster than FW400...
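
Putting numbers on that 50% figure (a quick sketch; nominal rates, and the overhead percentage is just the assumption from the sentence above):

```python
gige = 1000            # Mbit/s, nominal gigabit Ethernet
fw400 = 400            # Mbit/s, nominal FireWire 400
usable = gige * 0.5    # assume half the gigabit bandwidth is lost to overhead
print(usable)                          # 500.0 Mbit/s left over
print((usable - fw400) / fw400 * 100)  # 25.0 -> still 25% above FW400's ceiling
```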
 
So....someone smarter than I am - is my math way way off?

Sorry, you're totally correct. I was obviously thinking megabit. 1 Gigabite is 1,073,741,824 bits. I think it would be 128 MBps. Theoretically.

Even so... I think it would depend on what the network topology looks like. But FW400 never reaches its theoretical speed; maybe 200 Mbit/s sustained, max. That's probably slower than a gigabit network (assuming the full network really WAS gigabit). But I don't know what kind of error correction goes on in either. Does the extreme speed of a gigabit network mean more packets arrive out of order, so the network stack has to wait for them to be sorted or resent? ****, I don't have a clue. :)
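
For what it's worth, the two readings of "gigabit" and that 200 Mbit/s guess work out like this (quick Python; the 200 Mbit/s sustained figure is just the estimate from the post above):

```python
print(10**9 / 8 / 10**6)        # -> 125 MB/s for a decimal gigabit (10^9 bits)
print(2**30 / 8 / 2**20)        # -> 128 MiB/s if "giga" is read as 2^30 bits
print(200 * 10**6 / 8 / 10**6)  # -> 25 MB/s if FW400 only sustains ~200 Mbit/s
```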
 
Sorry, you're totally correct. I was obviously thinking megabit. 1 Gigabite is 1,073,741,824 bits. I think it would be 128 MBps. Theoretically.

Even so... I think it would depend on what the network topology looks like. But FW400 never reaches its theoretical speed; maybe 200 Mbit/s sustained, max. That's probably slower than a gigabit network (assuming the full network really WAS gigabit). But I don't know what kind of error correction goes on in either. Does the extreme speed of a gigabit network mean more packets arrive out of order, so the network stack has to wait for them to be sorted or resent? ****, I don't have a clue. :)

Pedant time: there's no such thing as a gigabite. It's a gigabyte.
 
Bumping this as it seems the subject was never resolved.

I want to build a Mac Mini server and want to know which is faster: FireWire 400 or gigabit Ethernet.

Rumor has it that the revised Mac Mini was supposed to include FW800. Shame that it never happened.

Building a Mac OS X File Server is still so much of a compromise.

And, NO, I'm not forking out 4K for an Xserve or Mac Pro to serve my daughter's kindergarten photos and back up my e-mail.

:(
 