
arn

macrumors god
Original poster
Staff member
Apr 9, 2001
16,363
5,795
MacWelt.de posted benchmarks from the new Dual 1-GHz PowerMac G4, with comparisons to older PowerMacs including the Dual 800. (English Translation)

Unfortunately, they are comparing OS X 10.1.2 (Dual 1GHz) vs 10.0.4 (Dual 800).

This page shows pictures of the insides of the Dual 1-GHz, including the Seagate Barracuda and GeForce4 MX. (English Translation)

Duxbury.la also posted some more realistic benchmarks (I'm assuming the same OS) between the GeForce3 and GeForce4 MX on a Dual 1-GHz... the GeForce3 is faster... as well as some screenshots and internals.
 

Sayer

macrumors 6502a
Jan 4, 2002
981
0
Austin, TX
So change it!

So it doesn't come with the EXACT video card you want.

CHANGE IT! For crying out loud.

The used car I bought didn't have the exact level of mileage I wanted, but do I go on and on and on about it to my wife? No. I accept it and drive to work and the store and take my daughter to school.

Geeze. Find a GeForce3 and pop it in place of the GeForce4 MX (note it says MX, i.e. the wimpier of the line).

Sell the GeForce4 to someone who DOESN'T HAVE ONE.

Man, it's never good for you guys, is it?
 

mcbane

macrumors member
Jan 4, 2002
48
0
Question: I am hearing that the Radeon 8500 is better than the GF3 (and maybe even the GF3 Ti500). Can anybody confirm this? I ordered a dual 1GHz G4 with the Radeon 7500 and I plan on replacing that with the Radeon 8500.
 

evanmarx

macrumors regular
Oct 23, 2001
105
23
Switzerland
gee....

Of course the GeForce4 MX is slower than the GeForce3 ... but that alone costs $400-500 to upgrade ...

But the MX not only stands for "slower and cheaper", it also stands for "low power and low heat". If you check the images of the card you'll notice that it has no fan! So less noise ... if you ever owned a DELL with a GeForce card WITH a fan, you know what I'm talking about ... unfortunately the dual processor has a small fan now ... sniff
 

blakespot

Administrator
Jun 4, 2000
1,364
142
Alexandria, VA
Indeed, note the presence of a heatsink but the lack of a fan on the GPU of the GeForce4 MX in those photos. Also, the memory on the board lacks heatsinks, while all GeForce3 boards have heatsinks on the DDR SDRAM.

Glad I've got a GeForce 3 in this DP G4 800...!



blakespot
 
U

Unregistered

Guest
The previous tower configuration had the GF2 by default; the GF3 was an expensive upgrade.
The GF4 is surely faster than the GF2, so the current configuration is better.
For game benchmarks, remember that the human eye can see up to 30 frames per second, so the difference between 120 and 150 FPS isn't important.
If someone plays games at a resolution greater than 1024, the GF3 can make a big difference, but there is no real reason to play games at 1280 or 1600.
The GF3 is too expensive and few people have a real reason to buy one; it has a limited market.
The problem with the GF3 is similar to the problem with the 2GHz P4: few people buy one because there is no real need for that power.
These are the reasons that forced NVIDIA to make lower-cost GeForces.
However, Apple always has the option of (re)introducing the GF3 or 8500 without having to wait for the next CPU upgrade.
 

Pants

macrumors regular
Aug 21, 2001
194
9
Originally posted by Unregistered
For game benchmarks, remember that the human eye can see up to 30 frames per second, so the difference between 120 and 150 FPS isn't important.
If someone plays games at a resolution greater than 1024, the GF3 can make a big difference, but there is no real reason to play games at 1280 or 1600.

I'm so pleased you really know what you're talking about. Enlighten us some more, please?
 
U

Unregistered

Guest
The human eye has a limited capability to view frames per second. I'm not sure, but I think the limit is around 30fps.
This means the human eye cannot see the difference between 100 and 150 fps.
If you move your arm in front of your eyes as fast as you can, you don't see the fluid movement of your arm, you see several hands... because your eye has a limited fps resolution.
So if a graphics card benchmark shows a difference between 50 and 70 fps, we can suppose that in particularly hard game situations (many monsters in the same room) the fps difference becomes visible, but if the difference is between 100 and 150...
For example, the Xbox has a GF3 and the PS2 has a much slower graphics card, but both play at 640x480. I've visually compared a racing demo on the two consoles and I can't see any difference...
Obviously the GF3 is a long-term graphics card, but sometimes it's preferable to choose a newer chip that offers newer special effects, instead of a chip that increases fps from 100 to 150.
So a good solution can be to choose a medium-cost graphics card now and save the money for a newer medium-cost card when the one you have becomes slooooow.
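For what it's worth, the per-frame arithmetic behind that argument looks roughly like this (a small Python sketch; the fps pairs are just the ones mentioned above, not measurements):

[code]
# Back-of-the-envelope frame-time arithmetic (illustrative numbers only):
# the time saved per frame shrinks quickly as the frame rates go up.
def frame_time_ms(fps: float) -> float:
    """Time spent on a single frame, in milliseconds."""
    return 1000.0 / fps

for low, high in [(50, 70), (100, 150)]:
    delta = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} fps = {frame_time_ms(low):.1f} ms/frame, "
          f"{high} fps = {frame_time_ms(high):.1f} ms/frame, "
          f"difference = {delta:.1f} ms per frame")
# Going 50 -> 70 fps saves about 5.7 ms per frame; 100 -> 150 fps saves only ~3.3 ms.
[/code]

Going from 50 to 70 fps buys back almost twice as much time per frame as going from 100 to 150, which is one way to read the claim that the difference at the high end matters less.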
 

spikey

macrumors 6502a
Apr 26, 2001
658
0
If you can't tell the difference in graphics between the PS2 and Xbox then you are blind. The makers of Dead or Alive 3 say they couldn't have made it for the PS2 because it wasn't powerful enough. And if you look at the difference between, say, Project Gotham Racing and Gran Turismo 3, you will notice the Xbox can process a higher level of detail because it is more powerful.
And by the way, the Xbox actually runs at a slightly higher resolution than the PS2.

And yes, the graphics card with more graphics processing features is often better, but when playing a game like Black & White, which is GPU intensive, you will need those extra FPS to run it smoothly without getting the annoying jerkiness.

The ATI 8500 is better than the GeForce3 Ti500, and also cheaper, as well as having GPU features that aren't taken advantage of yet.

The difference in FPS will be made clear in a GPU-intensive game, where the detail is so high you will need more power to run it above 30 FPS.
 

dim

macrumors newbie
Jan 30, 2002
8
0
europe
Re: So change it!

Originally posted by Sayer
So it doesn't come with the EXACT video card you want.

CHANGE IT! For crying out loud.

The used car I bought didn't have the exact level of mileage I wanted, but do I go on and on and on about it to my wife? No. I accept it and drive to work and the store and take my daughter to school.

Geeze. Find a GeForce3 and pop it in place of the GeForce4 MX (note it says MX, i.e. the wimpier of the line).

Sell the GeForce4 to someone who DOESN'T HAVE ONE.

Man, it's never good for you guys, is it?


Sayer, ... if you pay this amount of money, ... it must be good. ... And I DO NOT want to pop a GeForce 3 in place!! ... NOT!!
... that's because it's gonna cost me EXTRA!!

... Don't you get that?

dim
 

PUSH

macrumors newbie
Jan 3, 2002
18
0
Human FPS

Originally posted by Unregistered
"Human eye has a limited capability to view frames per second. I'm not sure, but I think this limit is 30fps."

To be technical,
the human eye visualises in Hz not FPS. The figure is 100Hz (you'll notice most good modern TVs run at this refresh rate) for humans. The closer to 100Hz the refresh rate of your monitor is, the more comfortable it will be to look at. Moreover, the higher your FPS is, the more fooled your brain will be into thinking its real as it more closely resembles the natural frequency of our eyes. Here endeth the science...
 
To the gamebox crowd: In the States, NTSC resolution (television) is 752 x 486. If your hardware renders out higher than that, it has to be interpolated down with a loss of information; it still ends up at 752 x 486. I doubt any console manufacturer renders out to a lower resolution and interpolates up. Television is 30 frames per second (60 fields). If your hardware is playing out 60 fps, it has to drop every other frame, as the television is stuck at 30fps.

On your computer, you run into the same limits when you're talking about refresh rate. If your monitor is refreshing at 75Hz, it has to drop every other frame if your video card is pumping out 150fps.
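A rough way to state that limit, assuming the display simply can't show more distinct frames per second than its refresh rate allows (a simplification that ignores vsync and tearing details):

[code]
# Sketch of the refresh-rate ceiling described above: frames rendered beyond
# the refresh rate never reach the screen as distinct images. Illustrative only.
def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames per second the viewer can actually see on screen."""
    return min(rendered_fps, refresh_hz)

print(displayed_fps(150, 75))  # 75 -> half the rendered frames are effectively dropped
print(displayed_fps(60, 75))   # 60 -> below the refresh rate, nothing is dropped
[/code]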

What the eye can see has long been the subject of debate, but some heavy hitters in the entertainment industry have toyed with movies filmed and projected at 60fps. The conclusion was that these movies were much more lifelike, almost a 3D quality. I've never seen this, so I can't comment personally.

I doubt that anything above 60fps is perceivable by the human eye, but the power of suggestion and the desire to believe are very powerful and there will always be those who are sincere in their claim that they can see the difference between 70fps and 120fps. That doesn't mean they can. It just means they're sincere...
 

spikey

macrumors 6502a
Apr 26, 2001
658
0
The question, though, is not whether the human eye can see above 30 FPS.

The point is that when you play a game, say Quake 3 for example, and you have 4 ****heads attempting to frag you and 10 rockets travelling towards your face, you will need the extra power of the GPU to deliver enough FPS to keep the game running smoothly.

These FPS measures are not necessarily real-life examples; in real life you would sacrifice FPS for detail until you got enough detail without the game becoming jerky.

So yes, the difference between 100 FPS and 150 FPS is marginal, but it is just there to show the power of the GPU; it is an example of the power, not an example of how you would set up the detail/smoothness level of the game. And the more powerful the GPU, the more detail in the game, and also the smoother the game is when you have 5 ****heads wanting to frag your ass.



FPS does matter.
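A toy example of that point: an average fps figure can look great while the worst moments dip badly. The per-second numbers below are invented purely for illustration, not from any real benchmark:

[code]
# Why an "average fps" benchmark number can hide the moments that feel jerky.
per_second_fps = [150, 140, 130, 120, 40, 15, 20, 110, 130, 145]  # heavy fight in the middle

average = sum(per_second_fps) / len(per_second_fps)
minimum = min(per_second_fps)

print(f"average: {average:.0f} fps")  # looks comfortably high (100 fps here)
print(f"minimum: {minimum} fps")      # the part you notice while being fragged
[/code]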
 
jayscheuerle

Enlightening!

So if the manufacturer says a card will play Quake 3 at 70fps, then that's an average?

When you're running around with nothing to kill, it may be doing 150fps (which you wouldn't notice), but if you're dodging the death rays of 5 "ass-fraggers" it may slow to something like 15fps if your card isn't powerful enough?

On top of that, the more options you select in terms of detail, smoke, etc., the more it will slow you down? How do you find honest evaluations of cards set to run on similar specs?
 
U

Unregistered

Guest
OK guys, I just want to say that if Apple offered me a GF3 or 8500 instead of a GF4 for an additional $100 I could consider the option, but not for $200-300 more. This thread seems to exist to prove that the GF4 is ****. Many people who still use a GF2 don't feel any need to change it.

800x600 is a good resolution for games; OK, 1024 is better, but the difference is not so noticeable in action games, and I think the GF4 at 800x600 can handle the heaviest game situations.

The GF4 is not ****, the GF3 is too expensive.
The 8500 is interesting, but I don't know how much it costs.
 

arn

macrumors god
Original poster
Staff member
Apr 9, 2001
16,363
5,795
Hey all...

the point isn't whether or not you can get 100 or 200 FPS in a game... the point is that many assumed the GeForce4 MX would be faster than a GeForce3... it is not... and while many may not need to see Quake run at 120 FPS vs 110... in future games such as Doom 3, it will make a noticeable difference.

The info is important so you can make informed purchases.

arn
 
U

Unregistered

Guest
FOR jayscheuerle

I suppose that the Quake internal benchmark is honest, more honest than Apple's PowerPC vs Pentium benchmarks.
Surely there will be particularly heavy game situations that result in lower fps than what you see in the benchmark, but usually benchmarks test medium-to-heavy game situations. At least I hope so.
 

Pants

macrumors regular
Aug 21, 2001
194
9
Originally posted by arn
Hey all...

the point isn't whether or not you can get 100 or 200 FPS in a game... the point is that many assumed the GeForce4 MX would be faster than a GeForce3... it is not... and while many may not need to see Quake run at 120 FPS vs 110... in future games such as Doom 3, it will make a noticeable difference.

The info is important so you can make informed purchases.

arn

Sorry for the earlier sarcasm. Just because a TV has a low refresh rate doesn't mean your eye can't tell the difference (anyone get eye strain on a 60Hz monitor? Yep, exactly...). I'm not sure eyes have a 'refresh rate' in the computer sense either... and yes, give me a 200 fps GPU over a 60 any day. UT flak is a frame sucker ... :)

Arn - agreed totally - it's a rip-off. But then so is the 'Ultimate' over the 'Fast'. The question I'd like answered is why a GF3 or 4 at all? OpenGL lags seriously behind DX8/9, and with a GF4 on a Mac you're not going to see all those fancy particle effects any time soon. Hence you're really paying over the odds for a card with capabilities that no game you have can use... (I notice that OpenGL is not a supported standard for the upcoming Unreal 2 - it's DX8/9 only - what hope of a good quality port??)
 

mcbane

macrumors member
Jan 4, 2002
48
0
Originally posted by Unregistered
The human eye has a limited capability to view frames per second. I'm not sure, but I think the limit is around 30fps.
This means the human eye cannot see the difference between 100 and 150 fps.
If you move your arm in front of your eyes as fast as you can, you don't see the fluid movement of your arm, you see several hands... because your eye has a limited fps resolution.
So if a graphics card benchmark shows a difference between 50 and 70 fps, we can suppose that in particularly hard game situations (many monsters in the same room) the fps difference becomes visible, but if the difference is between 100 and 150...
For example, the Xbox has a GF3 and the PS2 has a much slower graphics card, but both play at 640x480. I've visually compared a racing demo on the two consoles and I can't see any difference...
Obviously the GF3 is a long-term graphics card, but sometimes it's preferable to choose a newer chip that offers newer special effects, instead of a chip that increases fps from 100 to 150.
So a good solution can be to choose a medium-cost graphics card now and save the money for a newer medium-cost card when the one you have becomes slooooow.

Sorry, have to respond to this. It has been proven that the human eye can detect at least 80 fps. Tests were done in the military where frames would be shown at 80fps, and a single frame would contain some type of plane. The people could not only say that they saw the plane, but identify the make of it. So humans can see 80fps.

You are getting it confused with the number of frames per second in film and video. Video runs at 30fps, film at 24fps. That is okay because you factor in motion blurring and many other things. Add to that the fact that in video games the frames have to keep up with the twitch reflexes a player throws at them, while in film it is not a person controlling the action, so there is no sense of delay.
 

BillGates

macrumors member
Jan 12, 2002
56
0
Minnesota
Originally posted by mcbane


Tests were done in the military where frames would be shown at 80fps, and a single frame would contain some type of plane. The people could not only say that they saw the plane, but identify the make of it. So humans can see the 80fps.

The article I read about the military tests mentioned that the image of the plane was flashed on screen for 1/200th of a second. I wonder what the truth is?

The NTSC video frame rate is 29.97 (with 2 fields/frame).
US HDTV frame rates can be 24, 30 and 60, both progressive and interlaced.
I'm not aware of any US video standard having a frame rate of 80.

Being an owner of a Radeon, a GeForce2 MX, a GeForce3 and a GeForce4 MX on the way, I can tell you that faster has always been better when playing Quake and Unreal. If you have the money, buy the fastest card you can get. Anyone who says you can't see the difference either doesn't have the money to buy the better card or hasn't actually seen it in action.

One reason for the faster card I've not seen mentioned is its ability to maintain high frame rates at high resolutions, 1280 x 1024 and up. The resolution of the image adds to the realism as much as the frame rate, maybe more. I know I hate watching NTSC now that I have an HDTV.

If you're really into games with high frame rates, you'd better stay away from LCD displays. Just as exceeding your refresh rate on a CRT is a waste, exceeding the response time of the LCD pixels causes noticeable negative visual effects on the screen. An LCD's response time is much slower than the refresh rate of a CRT. A good LCD can only do about 40 FPS.
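That 40 FPS figure falls out of simple arithmetic if you assume a pixel response time of around 25 ms, which seems to have been typical for LCDs at the time (the 25 ms and 16 ms values below are assumptions for illustration, not specs for any particular panel):

[code]
# The LCD ceiling mentioned above, as simple arithmetic: a pixel that needs
# response_ms milliseconds to change can only complete so many full
# transitions per second.
def max_lcd_fps(response_ms: float) -> float:
    """Rough upper bound on distinct frames an LCD pixel can fully display per second."""
    return 1000.0 / response_ms

print(max_lcd_fps(25))  # 40.0 fps, the figure quoted above
print(max_lcd_fps(16))  # 62.5 fps on a faster panel
[/code]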
 

mcbane

macrumors member
Jan 4, 2002
48
0
Originally posted by BillGates


I'm not aware of any US video standard having a frame rate of 80.

I was under the impression it was a reel of film that was sped up to play 80 fps.
 

BillGates

macrumors member
Jan 12, 2002
56
0
Minnesota
Originally posted by mcbane


I was under the impression it was a reel of film that was sped up to play 80 fps.

My mistake. I misread your post as 80 video frames. You clearly state 30.


I play Quake and Unreal at 1280 x 1024, 85Hz. My GeForce3 gives me a minimum frame rate of around 80 FPS and it looks damn good. I'll have to see if there is some way to govern the FPS. It would be fun to experiment with slowing it to 70, then 60. I'm confident it would be really noticeable.
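If you do want to experiment, the usual trick is a frame limiter that sleeps away the remainder of each frame's time budget. Here's a minimal, generic sketch of the idea in Python; it isn't how Quake or Unreal actually govern FPS, just the shape of the technique:

[code]
import time

def run_capped(render_frame, target_fps: float, seconds: float = 5.0) -> None:
    """Call render_frame repeatedly, but no faster than target_fps, for `seconds`."""
    frame_budget = 1.0 / target_fps
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()                          # draw one frame
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # burn the leftover time budget

# run_capped(lambda: None, target_fps=60)  # try 80, 70, 60 and compare by eye
[/code]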
 

sturm375

macrumors 6502
Jan 8, 2002
428
0
Bakersfield, CA
8500 Better?

Originally posted by spikey
If you can't tell the difference in graphics between the PS2 and Xbox then you are blind. The makers of Dead or Alive 3 say they couldn't have made it for the PS2 because it wasn't powerful enough. And if you look at the difference between, say, Project Gotham Racing and Gran Turismo 3, you will notice the Xbox can process a higher level of detail because it is more powerful.
And by the way, the Xbox actually runs at a slightly higher resolution than the PS2.

And yes, the graphics card with more graphics processing features is often better, but when playing a game like Black & White, which is GPU intensive, you will need those extra FPS to run it smoothly without getting the annoying jerkiness.

The ATI 8500 is better than the GeForce3 Ti500, and also cheaper, as well as having GPU features that aren't taken advantage of yet.

The difference in FPS will be made clear in a GPU-intensive game, where the detail is so high you will need more power to run it above 30 FPS.

I have found that which graphics chipset you prefer depends greatly on the person's experience. However, if you look strictly at the numbers, the nVidia Ti500 blows any and all ATI stuff away. Also, nVidia's foundation is in OpenGL, where ATI has only just arrived; I believe the Rage was their first chip to fully support it. Also, correct me if I am wrong on this, but OS X is heavy into OpenGL. Having a video card with strong, consistent support for OpenGL will be very important to OS X users.

This does not apply here, but in the PC world ATI is well known for having driver problems, in every incarnation.

Disclaimer: Economic Theory - Competition
ATI is the only one making ATI cards, so there is no competition.
nVidia only makes the chipset; others make the boards. Lots of competition brings about much better tech, because all manufacturers want theirs to be the best.
 

BillGates

macrumors member
Jan 12, 2002
56
0
Minnesota
What are mac users doing with a Microsoft XBox? :)

If you want to see a huge improvement in your new game consoles video performance, attach it to an HDTV via component video cables.

I love how these message threads get off track...
 