Man... I wish Apple would host this download. My connection is normally pretty quick, but wow, this download is going to take all day.
Originally posted by Sol
I had UT2003 running at 640 x 480 in windowed mode and the CPU Monitor showed both my 800MHz processors being used at about the same level. In the Terminal I was running top, and that showed CPU usage between 90% and 115% (on a dual-processor machine top can report more than 100% for one process, since it sums usage across both CPUs). With medium graphics settings I was getting an average of 25 to 30 fps on the outdoors level of the demo. By the way, the longer the demo was running, the higher my fps seemed to get. Running the game full-screen at a higher resolution does not seem to make any noticeable difference in the frame rate.
The test computer was a dual 800MHz Power Mac with 1GB RAM, a 7200 RPM hard drive, and a 32MB nVidia GeForce2 TwinView outputting to two monitors. Booting up with a single monitor did not seem to make a difference in frame rates either.
Well, I hear UT2003 uses one chip for the game and one for sound, so there is still a great loss when using two. This is why a single 1.4GHz 970 will blow the doors off a dual 1.42GHz G4: there is a lot of loss potential in trying to get programs to use both CPUs, and 100% on both ain't going to happen, but with the 970 you have a single chip executing more instructions at the same clock. Upgrade my Power Mac or bite the bullet and get a new 970? We are about one month away, folks. The 970 will be a new level of performance that will only get better.
Originally posted by MacBandit
Why do you say that? Don't you think it should be maxing out both CPUs, or at least one of them, if it was the CPUs holding the game back?
I think what it potentially shows is that the game wasn't well written for the Mac.
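The single-fast-chip versus dual-slower-chips argument above is basically Amdahl's law: the second CPU only helps on the part of the work that actually runs in parallel. A minimal Python sketch of that reasoning; the 40% parallel fraction is a made-up assumption for illustration, not a measured number for UT2003:

```python
def effective_throughput(clock_ghz, n_cpus, parallel_fraction):
    """Relative work per second under Amdahl's law: the serial part of
    each frame runs on one CPU, only the parallel part scales with n_cpus."""
    speedup = 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cpus)
    return clock_ghz * speedup

# Hypothetical: assume only 40% of a game's per-frame work parallelizes.
dual_g4 = effective_throughput(1.42, 2, 0.4)    # dual 1.42GHz G4
single_970 = effective_throughput(1.4, 1, 0.4)  # single 1.4GHz 970

# With p = 0.4 the second CPU buys only a 1.25x speedup, not 2x,
# so a chip that does meaningfully more work per clock can close
# (or beat) the gap despite having one processor.
```

The point of the sketch is just that the dual machine's advantage shrinks from the nominal 2x toward 1.25x as the serial fraction grows, which is the "loss potential" the post describes.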
Originally posted by Sol
I disagree that the Unreal Tournament 2003 application is badly written just because the processors are not each being used at 100%. It seems like all the hard work is done by the graphics card and the CPUs are utilised for things like sound and physics; isn't this the way console games utilise their limited host hardware? I think in the long term this way of doing things is better for us Power Mac owners, because a graphics card upgrade would be cheaper and would increase our frame rates more than a CPU upgrade would.
Originally posted by ExoticFish
a 128MB GeForce3 Ti 200
Originally posted by ExoticFish
Believe it or not, there was. It's no faster than the 64MB GeForce3, but I just wanted to see the 128MB of RAM on the video card's boot screen. It was the same price as the 64MB, so why not?
Originally posted by ExoticFish
It's most definitely obvious that the CPU is what's holding this game back, not the video cards. I agree; I'm just saying that every little bit helps. And I don't know what I was smoking: the video card in my machine at work is a Radeon 7500, not a 9000. A 9000 would be much better.
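The CPU-versus-GPU diagnosis running through this thread rests on a simple test that Sol's benchmark already performed: change the resolution and watch the frame rate. If fps barely moves, the graphics card was not the limit. A small sketch of that reasoning; the 10% threshold is an arbitrary assumption:

```python
def likely_bottleneck(fps_low_res, fps_high_res, tolerance=0.1):
    """If fps barely drops when the resolution goes up, the GPU is not
    the limiting factor, so the CPU (or the engine) probably is."""
    if fps_low_res <= 0:
        raise ValueError("fps must be positive")
    drop = (fps_low_res - fps_high_res) / fps_low_res
    return "CPU-bound" if drop < tolerance else "GPU-bound"

# Sol saw no noticeable difference between windowed 640x480 and
# full-screen at higher resolution, which points at the CPU.
```

This is also why a faster video card "every little bit helps" but won't transform a CPU-bound game.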
Originally posted by ExoticFish
Well, 70% of a 733MHz CPU is less than 70% of a dual 1GHz system. My friend's dual 1GHz machine gets about 10 more fps than my single 1GHz TiBook. I'm not saying that I don't agree with you; I'm saying that factors on every front contribute to the sluggishness of the game, e.g.:
1) The game is not optimized enough
2) The G4 is long overdue to be replaced
etc.... etc....
But you saying that the frame rates don't vary much between a 733MHz and a dual 1.42GHz machine is a little unrealistic.
Originally posted by Dont Hurt Me
It boils down to frame rates, and to get them where we would love them you have to have both CPU power and GPU power; one without the other isn't going to help much. And we all know the G4 has been lacking, and apps are not written to use it to its max. So here we are. And yes, frame rates between a 733MHz and a dual 1.42GHz are going to be very different. Anyone who says they are not has just burnt one and is euphoric.
Originally posted by Dont Hurt Me
Come on, Bandit, I was just there; what graph are you talking about? I saw where my 733 was getting 30 fps in Wolfenstein and a single 1.2GHz upgrade was getting 50. 20 fps can be the difference between constant smooth action and stuttering slowdown. Those same 20 fps may also let you crank up to the next resolution, depending on your GPU, and play at a higher resolution or with more effects on with an acceptable slowdown. Compared to the PC world, our gaming frame rates suck: low-clock G4s.
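The 30 versus 50 fps gap above is easier to appreciate as per-frame time budgets. A quick sketch of the arithmetic (nothing here comes from the thread beyond the two frame rates):

```python
def frame_time_ms(fps):
    """Milliseconds the machine has to produce each frame."""
    return 1000.0 / fps

budget_733 = frame_time_ms(30)        # ~33.3 ms per frame on the 733MHz G4
budget_upgrade = frame_time_ms(50)    # 20.0 ms per frame on the 1.2GHz upgrade

# The faster machine frees up ~13.3 ms of work per frame, which is the
# headroom you can spend on a higher resolution or more effects before
# the slowdown becomes unacceptable.
headroom_ms = budget_733 - budget_upgrade
```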
Originally posted by Dont Hurt Me
I think the original was the better game. Sure, we have new bells and whistles, but the original was the ticket that rocked my gaming world for the longest of any game.