There are some Mac "quirks" that actually are features and take getting used to.
The mouse acceleration thing is one of those. I find the "Mac way" of doing things superior, especially for fine work like in Photoshop. If you need to get across the screen quickly, move the mouse quickly and it will fly. If you want to do fine detail work, move the mouse slowly and you can position the cursor much more exactly than in Windows.
I bought a DLSD from a forum member here that had Ubuntu installed on it. I still have the Ubuntu HDD, as I know he put a lot of work into getting it running, but I lasted about 12 hours with it. One of the biggest annoyances to me was that movements on the trackpad were completely linear, i.e. moving your finger a certain distance moved the pointer an equal amount every time. IMO, this is terrible on a trackpad. I have an old Compaq Armada (133 MHz Pentium, to give you some idea) where the touchpad is about the size of two postage stamps, and it works the same way. Back when I used it regularly, I loved it because I was the only one who could actually use the computer. On the rare occasion I try to use it now, however, I end up plugging in a mouse.
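To illustrate why the purely linear mapping bothers me, here's a minimal sketch. This is not how macOS, Ubuntu, or any real driver actually implements it, and the numbers are made up; it just contrasts a linear pointer mapping with a velocity-dependent one, where slow movements keep a small gain for precision and fast flicks ramp the gain up so you can cross the screen in one motion.

# Minimal sketch, not Apple's or any driver's real algorithm: contrast a
# linear pointer mapping with a simple velocity-dependent (accelerated) one.

def linear_pointer(delta_mm, gain=10.0):
    # Linear: cursor travel is always proportional to finger/mouse travel.
    return delta_mm * gain

def accelerated_pointer(delta_mm, dt=0.01, base_gain=4.0, accel=0.03):
    # Accelerated: gain grows with speed, so slow movements stay precise
    # while fast flicks cover the whole screen. All constants are hypothetical.
    speed = delta_mm / dt            # mm per second over one sample interval
    gain = base_gain + accel * speed
    return delta_mm * gain

for delta in (0.5, 2.0, 10.0):       # same sample time, so larger delta = faster movement
    print(f"{delta:4.1f} mm  linear: {linear_pointer(delta):6.1f} px   "
          f"accelerated: {accelerated_pointer(delta):6.1f} px")

With the linear mapping, 10 mm of travel always gives the same cursor distance no matter how fast you move; with the accelerated one, the same 10 mm covers far more screen when moved quickly, which is exactly the behavior I miss on a linear trackpad.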
I guess there's no right or wrong way, but Apple did bring the mouse to mainstream personal computers, and I tend to think they usually get this stuff right. As I said, I much prefer the Mac way, especially after using it for several years.
Another somewhat contentious point is font rendering on Macs vs. PCs. Back when I used iMac G3s in the school computer lab (probably running 8.5 or 8.6, as they were mostly tray loaders and I doubt any had ever been upgraded), I had a hard time using the Macs because all the text looked "blurry" to me. I was used to the crisp rendering in Windows. Higher screen resolutions have made fonts look crisper on Macs (they even look great to me under OS 9 with a decent video card on a 100 ppi Cinema), but they maintain the fidelity to print that has always been the hallmark of Apple font rendering. Now that Macs are virtually the only things I use, I think text rendering looks like crap in Windows. I also think it's telling that Windows has increasingly been moving toward a more "Mac like" font rendering.
As for the rest of the video, he does have some decent points with regard to software availability, but you still can't get around the fact that a G5 is a beast when running software appropriate to it. Throw CS2 or a contemporary version of Final Cut on it and it will tear through things with ease, especially on a Quad. Heck, even a high-spec G4 is great, although generally not as fast as a G5.
All the stuff about upgradeability, though, is a pile of crap. He keeps going on about the FX5200, but most of us who use G5s regularly realize it's a terrible card. A lot of folks like to run 9600 XTs in their G5s, which are great cards (although I usually end up putting them in G4s). The 9600 Pro has dual DVI (if you'd prefer that), including one dual-link port. I absolutely love my 6800 Ultra, which has two dual-link ports. There are plenty of Mac cards that don't require flashing, although flashing generally isn't a big deal (I flashed my FireGL, which flashes beautifully into an X800 XT, in a G5 running OS X). Plus, even if a card has an ADC port, ADC ports are DVI ports and can be converted with a passive adapter.
BTW, with regard to the pronunciation of OS X, I go back and forth between saying "X" and "ten." I think Apple consistently (in keynotes and such) calls it OS 10. The "X" is, of course, the Roman numeral 10 and also designates it as the successor to OS 9. To me, though, "X" also has merit because it was chosen to indicate the Unix underpinnings. I find it awkward to say "OS 10 10.5.8" (for example), so I will often instead say "OS X 10.5.8." Of course, I guess I could just say "OS 10.5.8," which would be unambiguous and also not start a war. Personally, I usually just say Leopard.