Re: Re: Re: frame rate
Essentially, 200 fps *IS* quantifiably better - the overhead built into displaying 200 allows for an acceptable dip when things get complex (flak fights in UT spring to mind...). And like someone above rightly mentioned, the eye isn't digital but analogue - and if I remember rightly, the film analogy is a bit of a fudge - don't projectors flash each frame twice or so to cut flicker? Simply put, faster is better - 50 fps is unacceptably headache-inducing, especially if you play games with precise hitscan weapons whilst dodging like a rabbit on speed! (Try turning 360 degrees fast in UT and lining up a shot as you do so - now fix the frame rate at 30 fps and increase it by 10 each time, and see the difference...)
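The headroom argument is just arithmetic, really - here's a quick back-of-the-envelope sketch of it (the multiplier is illustrative, not a benchmark of any real scene):

```python
def frame_time_ms(fps: float) -> float:
    """Time budget per frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

def fps_after_spike(base_fps: float, cost_multiplier: float) -> float:
    """If a busy scene (e.g. a flak fight) makes each frame N times more
    expensive to render, the frame rate divides by N."""
    return base_fps / cost_multiplier

# Suppose a chaotic firefight triples per-frame rendering cost:
spike = 3.0
print(fps_after_spike(200, spike))  # ~66.7 fps - still smooth
print(fps_after_spike(60, spike))   # 20 fps - slideshow territory
print(frame_time_ms(200), frame_time_ms(50))  # 5 ms vs 20 ms per frame
```

So a rig that idles at 200 fps rides out the same spike that drops a 60 fps rig into unplayable range - that's the "acceptable dip" in practice.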
Originally posted by daveg5
I get that, but is a game any better at, say, 200 fps than at 100 fps? This seems to be the only benchmark people take, other than anti-aliasing, lighting, fog and other effects. Personally I would like the video card manufacturers to increase 2D speed by the same leaps and bounds, unless it can't go any faster (screen redraw, scrolling, resizing, zooming, video playback and encoding, etc.), but the only thing on their minds is 3D fps, anti-aliasing and 3D effects. Not a bad thing, mind you, but when an old Radeon 32MB PCI can do 2D about as well as the latest GeForce4 128MB, something is missing.