
Drumline789

macrumors newbie
Original poster
Jan 18, 2008
In America..... Maryland
I have recently considered purchasing a Mac Pro (MP) and am curious whether its standard graphics card is "decent" enough for the average gamer. Having been a Mac guy for a long time, I have never really dealt with graphics cards because I have mostly owned laptops. Is the performance of these things that noticeable in the sharpness and fps and what-not?
 
By standard card do you mean the ATI Radeon HD 2600 XT?

If so, I'm sure it would be adequate for today's games, but if you're truly serious about gaming you might want to opt for the NVIDIA GeForce 8800 GT and double your VRAM, most likely extending the amount of time before the machine encounters games that will refuse to run on "only" 256MB...
 
Yeah, it comes with a single ATI 2600; now, will that be efficient on a Cinema Display? I truly don't know too much about graphics, but will the picture be fuzzy on a large Cinema Display if you have a standard graphics card? Have you ever played StarCraft 2? That's my main goal here: to successfully run that game.
-Ben
 
Yeah, it comes with a single ATI 2600; now, will that be efficient on a Cinema Display? I truly don't know too much about graphics, but will the picture be fuzzy on a large Cinema Display if you have a standard graphics card? Have you ever played StarCraft 2? That's my main goal here: to successfully run that game.
-Ben

Which Cinema Display? And it gets fuzzy when you don't run an LCD at its native resolution. That base card will be able to run any LCD at its native resolution; the question is whether it can run games at that resolution.
 
Which Cinema Display? And it gets fuzzy when you don't run an LCD at its native resolution. That base card will be able to run any LCD at its native resolution; the question is whether it can run games at that resolution.
The base MP video card should power at least dual 30"s, right?
 
The base MP video card should power at least dual 30"s, right?

Yes, it can run two 30"s; however, it wouldn't be able to drive a 30" at native resolution for gaming. It would be fine for other stuff in the Finder, though. You wouldn't want to do any GPU-accelerated stuff either (i.e. Aperture, etc.), since you're already taxing it with the two 30"s. In my opinion, if you can afford two 30" monitors then just upgrade to the 8800 GT; it will be worth it.
 
I was just in a Mac store today and the employee took off the side wall to let me have a look inside. The single ATI (which was installed) runs 88 fps, which seems like a lot, considering I have only been around 30 and sometimes 60 on my better video camera. But will the 8800's 141 fps be noticeable? It just seems that once you get to a certain number of frames per second, it all looks the same. Ha, kind of a strange question, but how many fps do we as humans see?
 
Frames per second is not a constant figure across all games, though. So with a higher-end game, your 88 and 141 could drop to, oh, say, averaging 44 and 70 fps, with further slow-downs in processor-intensive, fast-paced scenes, for example, and you might see some choppiness.

It only takes one high-end (by today's standards) game you really like to make the NVIDIA 8800 GT worthwhile to you over time. It adds some waiting time, and you could upgrade to it later down the road (at more cost), but it is worth considering.

Richard.
 
Given that the refresh rate on LCDs is pretty much fixed at 60 Hz or at best 75 Hz (don't know the ACD's stats), no, even if your eye (brain, really) could distinguish between 75 frames per second and 700, it wouldn't matter as the refresh rate effectively limits the visible framerate.
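
To put a rough number on that, here's a tiny Python sketch (purely illustrative, assuming vsync on a fixed-refresh panel) of why a 60 Hz LCD hides the difference between 88 and 141 rendered fps:

```python
# Sketch only: with vsync on a fixed-refresh panel, the display can show at
# most one new frame per refresh cycle; anything rendered beyond the refresh
# rate is never actually seen.

def displayed_fps(rendered_fps, refresh_hz=60):
    """Frames per second the panel can actually show."""
    return min(rendered_fps, refresh_hz)

for fps in (88, 141):
    print(f"{fps} fps rendered -> {displayed_fps(fps)} fps visible on a 60 Hz LCD")
```

Both numbers come out as 60 fps visible, which is the whole point: past the refresh rate, extra rendered frames buy you headroom, not smoother pictures.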

I don't know which games you're talking about, though I can tell you that in many instances the 8800 GT will run games at more than the ~2x speedup you're citing here.

If you're talking about StarCraft 2, no, I haven't played it, but I'd bet my life that a 2600 XT will run it just fine.

The thing about graphics cards today is that midrange cards are sort of like yesteryear's high-end cards and run pretty much everything (ignoring games like Crysis) at reasonably high resolutions, but you won't really be able to turn on FSAA or anisotropic filtering at higher resolutions while retaining good framerates.

Today's high-end cards are kind of obscene... and nothing like the high-end cards of a few years ago. They're basically trying to fit as much power into one card (two slots or otherwise) as they can, and as a result prices are sky-high for the fastest cards (8800 Ultras were $800 at one point) and heat/power usage is insane. Your average powerhouse graphics card is now pulling well over 200 watts, more than whole computers used to.

Anyway, if you're doing SC2, you're fine with the stock card.

Graphics cards (at least, these days) have no effect on whether or not your screen is blurry/fuzzy. Don't worry about it =)
 
I was just in a Mac store today and the employee took off the side wall to let me have a look inside. The single ATI (which was installed) runs 88 fps, which seems like a lot, considering I have only been around 30 and sometimes 60 on my better video camera. But will the 8800's 141 fps be noticeable? It just seems that once you get to a certain number of frames per second, it all looks the same. Ha, kind of a strange question, but how many fps do we as humans see?

72 frames per second.

The ATI Radeon 2600 XT is a good video card (apart from the fact that Apple put in 256MB instead of the original 512MB). It will play almost any game on the market, apart from Crysis, FSX, and some other insane games that need those awesome NVIDIA Quadros!!!

I am right now downloading the demo on my iMac.

The iMac is said to have the ATI Radeon 2600 PRO; PROs are always worse than the XT version of the model. Here is the thing: Apple has modded the video card in the iMac. The iMac needs a mobile video card because of heat problems, and there is no such thing as a 2600 PRO mobile. So Apple bought 2600 XTs, put in 256MB instead of 512MB, and decreased the clock speed, making the core chip the same as the XT, but Apple calls it their own [modded] 2600 PRO. The MP's 2600 has been modded as well, but only a little; as I've mentioned, it is supposed to have 256MB of VRAM. The difference between the two cards is that the iMac has a slightly slower clock speed; everything else is the same, so the results I see from playing StarCraft should be the same as on the Mac Pro.

EDIT: All of them are in .sit, which doesn't work under Leopard :mad:
 
72 frames per second.


Debatable...

Humans can apparently perceive motion as far as 1,000 "frames" per second under special circumstances. Under more "normal" circumstances, humans can notice flicker even at 100-150 fps. What people see as adequate framerates for gaming, 30-40 fps, comes more from the response/control part than from the actual judder of the picture. Why people get by with 24/25 fps in the movies is due to the inherent softness and motion blur associated with the medium.
And apparently there is a difference depending on whether the display is emitting (LCD/CRT displays) or reflecting (projectors)...

If I remember right..
 
Debatable...

Humans can apparently perceive motion as far as 1,000 "frames" per second under special circumstances. Under more "normal" circumstances, humans can notice flicker even at 100-150 fps. What people see as adequate framerates for gaming, 30-40 fps, comes more from the response/control part than from the actual judder of the picture. Why people get by with 24/25 fps in the movies is due to the inherent softness and motion blur associated with the medium.
And apparently there is a difference depending on whether the display is emitting (LCD/CRT displays) or reflecting (projectors)...

If I remember right..

Well, you obviously know more than me. I got 72 fps from this ninja video on YouTube from Doogtoons. The dude goes, "a video camera can capture at 24 fps*, and the human eye can see 72 fps, A NINJA CAN MOVE AT 80,000 FPS!!!"

pretty funny

*Yeah, I know, video cameras can capture up to 30 fps.
 
Debatable...

...Why people get by with 24/25 fps in the movies is due to the inherent softness and motion blur associated with the medium.
And apparently there is a difference depending on whether the display is emitting (LCD/CRT displays) or reflecting (projectors)...

When the frame is pulled down into place in a film, it is covered by a shutter blocking off the light, and thus you would still notice the flicker whatever the softness. Film projectors show each image twice, giving a flicker rate of 48 Hz. Generally, we don't notice flicker because of persistence of vision, the same thing that causes us not to notice when we blink or when we look at a bright light. A flicker rate of around 20 is considered the cut-off point. However, since different people have different thresholds, 24 wouldn't be enough, hence the need for showing each image twice.

However, I think it's, as you say, the motion blur that is at work here. When something is moving fast it's blurred, and thus each frame 'fuses' into the next. When you move something fast (just swish your finger past your eyes) you'll actually be seeing motion blur, because old images will persist. Thus I guess 'by chance' the motion camera emulates this perception. On the other hand, video games show static images with spaces between them depending on the speed, and thus perhaps need higher frame rates. Trying to emulate motion blur would seem to be the way forward to give a real feel.

I guess that the higher frame rates are really beyond what we can see and thus cause a motion blur sensation emulating a more real experience.

In other words, frame rate and the gap between renditions of fast moving objects are different but related issues.
 
I was just in a Mac store today and the employee took off the side wall to let me have a look inside. The single ATI (which was installed) runs 88 fps, which seems like a lot, considering I have only been around 30 and sometimes 60 on my better video camera. But will the 8800's 141 fps be noticeable? It just seems that once you get to a certain number of frames per second, it all looks the same. Ha, kind of a strange question, but how many fps do we as humans see?

I assume you're pulling the 88 and 141 fps numbers from the Apple site? Well, those numbers are for the Mac Pro running Doom 3 at a whopping resolution of 1024x768. Firstly, Doom 3 is, for all intents and purposes, an old game, at least in the PC world. It is no longer all that valid a benchmark because, with current processors and video cards, the game cannot tax a system at all. Secondly, a resolution of 1024x768 is redonkulous. Nobody running a Mac Pro is going to pair it up with a 15" monitor, which is what you would need to be running at a res of 1024x768.
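
To put some rough numbers on the resolution point, here is a quick back-of-the-envelope Python sketch (the 23" and 30" Cinema Display resolutions are the usual 1920x1200 and 2560x1600):

```python
# Pixels per frame at the benchmark resolution vs. typical Cinema Display
# resolutions. More pixels per frame means more GPU work per frame, so an
# fps score at 1024x768 says little about fps at 1920x1200 or 2560x1600.
resolutions = {
    "1024x768 (benchmark)": (1024, 768),
    "1920x1200 (23\" ACD)": (1920, 1200),
    "2560x1600 (30\" ACD)": (2560, 1600),
}

base = 1024 * 768
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels per frame ({pixels / base:.1f}x the benchmark)")
```

A 23" display is pushing roughly 2.9x the pixels of the benchmark resolution, and a 30" about 5.2x, so those 88 and 141 fps figures shrink fast at real-world settings.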

If you're wanting to game, do yourself a favor and grab the 8800 GT. It is basically the card to get right now. The 2600 is adequate for older games (it runs Prey, BF2142, and World of Warcraft fairly well at 1920x1200), but for anything more intensive (say, Quake Wars or Call of Duty 4 when they're released) it will become a slide show.
 
I was just in a Mac store today and the employee took off the side wall to let me have a look inside. The single ATI (which was installed) runs 88 fps, which seems like a lot, considering I have only been around 30 and sometimes 60 on my better video camera. But will the 8800's 141 fps be noticeable? It just seems that once you get to a certain number of frames per second, it all looks the same. Ha, kind of a strange question, but how many fps do we as humans see?

141 fps will not be noticeable, but as mentioned previously, games don't run at constant framerates. Every object, every "particle", and every effect on the screen requires additional computational power. Add in high-resolution textures, anti-aliasing, lighting effects, etc., and your graphics card has to work even harder. When you are standing still in a relatively sparse environment, your card might be running at 100-200 fps, but when you get into, say, battle situations where there are tons of objects, explosions, and effects, your framerate will drop considerably. Your goal is to keep the frame rate above 25-30 in those graphics-intensive situations; otherwise your experience will suffer.
 
Max framerates don't mean much, because you only notice a card's shortcomings when the framerate drops below a certain number (say, 30 fps). Running a game at 141 fps or 72 fps is not going to make a difference to the vast majority of people. Most LCDs only refresh at 60 Hz anyway.

A good stat to look for in a video card is min framerate.
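
If you ever log frame times yourself, a quick sketch like this (Python, with made-up sample numbers) shows why an average can look great while the minimum is what you actually feel:

```python
# Made-up per-frame render times (in milliseconds) for a short run that is
# mostly calm, with one heavy battle scene in the middle.
frame_times_ms = [8, 9, 8, 10, 9, 35, 40, 38, 9, 8, 10, 9]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # overall average
min_fps = min(fps_per_frame)                                    # worst single frame

print(f"average: {avg_fps:.0f} fps, minimum: {min_fps:.0f} fps")
# Prints roughly "average: 62 fps, minimum: 25 fps" -- the average looks
# comfortable, but the 25 fps dip during the battle is what you'd notice.
```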
 
On the other hand, video games show static images with spaces between them...

Now, what are these "spaces" in between? Like in movies as well; I mean, I have a good idea what movie film looks like, being around it because my father's in the business, but won't these spaces affect the image if they are too large? I'm more interested in the digital spaces, since there is no reel to project. Blue screen? Black screen? And does anyone by any chance know what resolution humans see in? I don't see any pixels... is this an illusion?

Thanks,
-Ben
 
Now, what are these "spaces" in between? Like in movies as well; I mean, I have a good idea what movie film looks like, being around it because my father's in the business, but won't these spaces affect the image if they are too large? I'm more interested in the digital spaces, since there is no reel to project. Blue screen? Black screen? And does anyone by any chance know what resolution humans see in? I don't see any pixels... is this an illusion?

Thanks,
-Ben

As I said above, 72 fps for the fps question.

Humans can see in any res. If you don't see pixels, you just have bad eyesight. I see them on my iMac, and the MacBook that you have has even bigger pixels, due to the lower quality of the screen, so you should be able to see them, Ben.
 
As I said above, 72 fps for the fps question.

Humans can see in any res. If you don't see pixels, you just have bad eyesight. I see them on my iMac, and the MacBook that you have has even bigger pixels, due to the lower quality of the screen, so you should be able to see them, Ben.

Or you're sitting farther away. I can't see the pixels on either of my monitors unless I look at them closely. It all depends how far away you're sitting from them. It also changes if you have a gloss or matte finish.
 
I was just in a Mac store today and the employee took off the side wall to let me have a look inside. The single ATI (which was installed) runs 88 fps, which seems like a lot, considering I have only been around 30 and sometimes 60 on my better video camera. But will the 8800's 141 fps be noticeable? It just seems that once you get to a certain number of frames per second, it all looks the same. Ha, kind of a strange question, but how many fps do we as humans see?


The FPS that a given video card can manage for a typical game is going to depend on several variables, most of which are settable in the game's control panel. The user can determine resolution, texturing, lighting, and anti-aliasing. If those things are all set very high, some lower-level video cards will choke and not be able to run them at playable frame rates. The solution then becomes to turn various settings down in order to get a good frame rate. The holy grail for most gamers, however, is a video card sufficient to muster smooth game play with beautiful scene rendering.

My Mac Pro has two 30" ACDs and an ATI X1900 XT video card - the best card available for the Mac Pro at the time, but by no means a top-level video card overall. I play Call of Duty 4 using Windows via Boot Camp. I play it with all texturing and lighting set to their max, 4x anti-aliasing, and a resolution of 1280x1024. The game is beautiful at those settings (smoke, haze, raindrops splashing, etc.), my average maximum frame rate is 78 FPS, and my average minimum is in the mid-40s, which I hit only in the most intense action (being in the middle of an airstrike, for example, with lots of explosions). Those are good frame rates for a game that's designed to look good at 60 FPS, but I confess I'll be toward the front of the line for an upgraded Mac video card when it becomes available for my older Mac Pro.
 
Or you're sitting farther away. I can't see the pixels on either of my monitors unless I look at them closely. It all depends how far away you're sitting from them. It also changes if you have a gloss or matte finish.

Well, yeah, but if he is saying that he was looking up close and couldn't see the pixels, then he has sight problems. But it's quite obvious that if you are sitting at a normal distance from your display, you cannot see the pixels.
 
24 fps works well in film because you get a full (equalized illumination) frame every 1/24 sec, vs. video, in which you get a raster-scanned frame 30 to 150 times a second. The raster-scanned nature of video causes flicker to be much more noticeable. Flicker in video can be reduced by the use of high-persistence phosphors (in CRTs), but these have the downside of image lag when motion is depicted.
 
24 fps works well in film because you get a full (equalized illumination) frame every 1/24 sec, vs. video, in which you get a raster-scanned frame 30 to 150 times a second. The raster-scanned nature of video causes flicker to be much more noticeable.

Thanks for clearing that up.
 