Did you take a look at the OpenGL demonstration video from WWDC 06? It was done by ATI, and it was amazing. It had rain that looked real, buildings that looked real, items inside buildings that looked real... well, I think you get the picture.
Perhaps it isn't the API, DirectX vs. OpenGL, but instead the game developers?
I can't find the video you're talking about, unfortunately; Google tells me you have to be a member of the Apple Developer Connection to access it. So I'll take your word for it that it looks good, but I refuse to believe it looks anywhere near as good as what Crysis is doing on DX10.
Ultimately you are right, it is all about the developers, but developers can only do so much with the hardware. The truth is that compared to PCs, Macs do not have competitive graphics cards. Since the switch to Intel, Mac and PC hardware is essentially identical; there is no "megahertz myth" or architectural quirk left to make up the difference between the aging Radeon X1900 and the newer cards.
As for DX10, it is the favored tool among developers and so has a tight grip on a gaming market where good development tools help offset spiraling costs. Making games Mac compatible doesn't make a great deal of economic sense unless you are using OpenGL in the first place, something most studios do not do. I'm all for OpenGL becoming more competitive with DX10, but it's equally important to ensure the hardware is up to the job.
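To illustrate the portability point: here's a minimal sketch (assuming GLUT, one common windowing layer of that era) of why an OpenGL codebase ports to the Mac with so little effort. The same rendering calls compile on both platforms; only the header path differs. A Direct3D renderer, by contrast, is tied to the Win32 API and the d3d headers, so porting means rewriting the whole graphics layer.

```c
/* Minimal sketch: identical fixed-function OpenGL code builds on both
 * Windows and Mac OS X. Only the include path changes per platform. */
#ifdef __APPLE__
#include <GLUT/glut.h>   /* Apple ships GLUT as a framework */
#else
#include <GL/glut.h>     /* header location on Windows/Linux */
#endif

/* Draw one hard-coded colored triangle each frame. */
static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
    glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
    glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
    glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(640, 480);
    glutCreateWindow("Cross-platform GL");
    glutDisplayFunc(display);
    glutMainLoop();              /* hands control to GLUT; never returns */
    return 0;
}
```

Of course, a real game's renderer is vastly more complex than this, but the principle scales: if the studio started on OpenGL, the port is mostly build configuration; if it started on Direct3D, it's a rewrite.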