Sure, FireWire is better than USB 2.0, and FireWire 800 is probably better than the yet-to-be-announced (and yet-to-be-supported) USB 3.0. However, only professionals use FireWire 800, so it may be a good time to drop FireWire before the general public catches on. Let's face it: aside from miniDV (and iPods), FireWire 400 never really got off the ground when compared to USB 2.0.
In the long run, having two standards that do practically the same thing is bad for users (kinda like Beta vs VHS, ADC vs DVI, Blu-ray vs HD DVD, CompactFlash vs Memory Stick vs SD vs, well, too many flash card formats). It doesn't make for a "simple computing experience" if you have too many standards. Even FireWire 800 isn't connector-compatible with FireWire 400. Not a good move, if you ask me. I also have to guess that USB 3.0 will use the same connector as USB 2.0 (here's hoping).
Maybe Intel could, with Apple's help, add the good bits of FireWire (whatever keeps it from eating CPU cycles) to USB 3.0, so we get the best of both worlds.
So in the short term, it's a bad idea (because of all the miniDV camcorders out there), but in the long term it's the same as the ADC vs DVI debate: you have to go with the most widespread standard even if it's not the best one (as long as it's "almost as good").
In the long term, it's good for everyone because you don't have to fight for something "special" (ADC, PowerPC, FireWire, to name a few); you go with the flow and simplify the "computer" experience for regular (non-technical) users.
What makes a Mac isn't the hardware itself. It's the way it's put together, and the operating system it runs. I don't care if it has an Intel processor or no FireWire ports, as long as it has OS X and incredible software like the iLife suite.