There seems to be some confusion about FB-DIMMs and why they are used in the Mac Pro. There is no performance advantage to FB-DIMMs; in fact you get slightly less performance compared to regular DDR. Intel uses them on its high-end workstation/server boards for one reason only: they make it far easier and cheaper to support massive quantities of RAM. In computational situations where you need 16, 32, 64GB or more of memory, absolute speed may be secondary to having enough RAM available.

Performance is slightly worse than regular DDR memory because of the added latency each module introduces. Each FB-DIMM has what is called an Advanced Memory Buffer (AMB), which is essentially a small memory controller. Most other computers have a memory controller on the motherboard or in the CPU itself that controls the memory directly. The advantage there is lower latency, since there are no go-betweens. The disadvantage is that as you add more sockets for more memory, you need more pins on the memory controller and more traces on the motherboard, and the problem gets worse with every increase in RAM capacity.

FB-DIMMs don't talk to the memory controller directly; the AMB on each stick receives the request first and also acts as a repeater for requests headed downstream. An added benefit is some built-in fault correction that imposes no performance hit compared to a regular DDR-based system. Theoretically a system based on the FB-DIMM architecture would have higher bandwidth and be cheaper and easier to manufacture, with a much less complicated motherboard. The downside is that all the AMBs in line increase latency, and they run hot, so they also add heat to any computer that uses them. The price difference is mostly due to the low volumes they are made in compared to regular DDR; the memory chips on an FB-DIMM are really just regular memory chips, and could theoretically be anything, since the AMB is the controller that interfaces with the rest of the system. The good news on price is that if Intel's new Skulltrail platform proves popular, it should make FB-DIMMs more common and help bring prices down.
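To picture why each AMB in the chain adds latency, here's a toy Python sketch of a read request hopping down the daisy chain. The numbers are made up purely for illustration, not real FB-DIMM timings:

```python
# Toy model of FB-DIMM daisy-chain latency (illustrative numbers only).
AMB_HOP_NS = 3       # assumed pass-through delay per AMB, not a real spec value
DRAM_ACCESS_NS = 45  # assumed latency of the DRAM chips themselves

def read_latency_ns(slot_index):
    """Latency to read from the DIMM in the given slot (0 = closest to controller).

    The request passes through every AMB up to and including the target's,
    and the reply passes back out through the same chain.
    """
    hops = slot_index + 1
    return 2 * hops * AMB_HOP_NS + DRAM_ACCESS_NS

for slot in range(4):
    print(f"slot {slot}: {read_latency_ns(slot)} ns")
```

The point of the sketch is just that every extra DIMM in the chain costs a couple more hops, while a traditional controller talks to every DIMM directly at a flat cost.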

As it stands today, the reasons Intel chose to use them in the server/workstation arena are very large memory capacity, cheaper and easier-to-manufacture motherboards, and better fault tolerance and correction, all of which appeal to professional, scientific and other high-end usage.

In everyday use a system based on FB-DIMMs will likely be a little slower than a standard DDR one, but it should be pretty close in any event. Mac Pros use Xeons, which are essentially souped-up Core 2 Duos with a few nice features that should actually more than make up for the latency of the FB-DIMMs, especially the brand-new ones. The new Xeons have 12MB of L2 cache per processor (6MB shared per pair of cores) and a 24MB snoop filter on the FSB, which keeps track of all the cached data on both CPUs and should dramatically reduce FSB traffic and push bandwidth usage to the limits. There are quite a few nice little tweaks in there.
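To show the idea behind a snoop filter (this is a simplified sketch, not the actual Xeon chipset design): the chipset remembers which lines each CPU package has cached, so a memory request only generates snoop traffic when the *other* CPU might actually hold the line.

```python
# Toy sketch of a snoop filter for a two-package system (illustrative only).
class SnoopFilter:
    def __init__(self):
        # Lines known to be cached somewhere in each CPU package.
        self.cached = {0: set(), 1: set()}

    def record_fill(self, cpu, line):
        """Note that a CPU pulled this cache line in."""
        self.cached[cpu].add(line)

    def needs_snoop(self, requesting_cpu, line):
        """Forward a snoop only if the *other* package may hold the line;
        otherwise the bus traffic is avoided entirely."""
        other = 1 - requesting_cpu
        return line in self.cached[other]

sf = SnoopFilter()
sf.record_fill(0, 0x1000)
print(sf.needs_snoop(1, 0x1000))  # True: CPU 0 holds the line, must snoop
print(sf.needs_snoop(1, 0x2000))  # False: snoop traffic avoided
```

Every snoop the filter can answer with "no" is FSB bandwidth freed up for real data, which is why it helps a dual-socket machine in particular.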
 
Nope, Lurgen sort of said your Crysis fix was not worth doing. It totally changed the game for me.

Lurgen said: "It is also important to remember that a lot of the perceptions regarding Crysis at max settings relate to Vista. You can't max out the settings in Crysis without DirectX10, which in turn requires Vista (OK, yes there are hacks to make it look like you have max settings under XP but you're actually running DX9 emulations of the DX10 features, which is not the same thing)."

Oh, I see. Thanks for defending me :D
 
This is one of the biggest myths out there.

You most definitely can tell the difference between 30 fps and 60 fps.

While 30 fps is by all means fine, 60 fps is smoother.

Look at this year's NCAA Football, which ran at 60 fps on the 360 and 30 on the PS3.

Watch the following:

http://loot-ninja.com/2007/04/29/video-comparison-24fps-vs-60fps/

and to quote from:

http://www.daniele.ch/school/30vs60/30vs60_1.html

"So what is the answer to how many frames per second should we be looking for? Anything over 60 fps is adequate, 72 fps is maximal (anything over that would be overkill). Framerates cannot drop though from that 72 fps, or we will start to see a degradation in the smoothness of the game. Don't get me wrong, it is not bad to play a game at 30 fps, it is fine, but to get the illusion of reality, you really need a frame rate of 72 fps. "

A typical display runs at 60 hertz, so it flashes frames at 60 fps. If the game renders above 60 fps, the monitor will still only display 60 fps anyway... I don't get it.
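The refresh-rate cap is real, but there's a wrinkle worth knowing when fps drops *below* refresh. Here's a small sketch of how classic double-buffered vsync behaves on an assumed 60 Hz display: a finished frame waits for the next refresh tick, so effective rates quantize to 60, 30, 20, 15... fps:

```python
# Effective displayed fps with double-buffered vsync on a 60 Hz display.
import math

REFRESH_HZ = 60

def vsynced_fps(render_fps):
    """A frame is presented at the first refresh tick after it finishes,
    so frame times round up to whole refresh intervals."""
    refresh_interval = 1.0 / REFRESH_HZ
    frame_time = 1.0 / render_fps
    # Small tolerance so exact divisors (60, 30, 20 fps) don't round up.
    ticks = math.ceil(frame_time / refresh_interval - 1e-9)
    return REFRESH_HZ / ticks

for fps in (90, 60, 50, 35, 25):
    print(f"renders at {fps} fps -> displays at {vsynced_fps(fps):.0f} fps")
```

So rendering at 90 fps shows exactly 60, but rendering at 50 fps with vsync on actually displays 30, which is one reason headroom above 60 still matters.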
 
Are you getting the 8800, slackpacker?

Anyway, AMD released new drivers today. I haven't tested them yet, but people are reporting a 10 fps increase in Crysis.

Let me know if you try it

http://game.amd.com/us-en/drivers_catalyst.aspx?p=xp/radeonx-xp

(for xp pro)

Cool Thanks!

Yes I have the 8800 on order--- March 31st is my delivery date HA

Best Regards

Update: I got the Mac Pro on the 12th and got Crysis the day of the driver release on the 13th :). So that's the first driver I installed, and I have been using it all along. Works great as far as I can see: no crashes and everything runs really well. ATI driver ver 8.2.
 

coolness
 
Alright guys, as promised -

I only have the Crysis demo, so the final game might be more optimized, especially with 1.1. I'm also running 32-bit Vista Ultimate, which shouldn't matter since I only have the stock 2GB of RAM at the moment. But I do have access to all 8 cores.

So, maxed out Crysis: 1920x1200, everything on Very High. No AA. Getting 10-12 frames per second all around. That's "choppy but playable for single player" in my book. Firefights are obviously worse.

Going to play around with some settings and post more results.

EDIT: some more:

Knocking down resolution from 1920x1200 to 1680x1050 didn't do... anything at all to the framerate.
I get about 8-10 fps in open areas overlooking water.
 

I guess this is with the 8800GT Card?
 

Hah, yep.

Interesting thing is, I went through the "High," "Medium," and "Low" settings after having run it on Very High.

Very high - 8-12fps
High - 9-13fps
Medium - 14-20fps
Low - 28-32fps

This was all at 1920x1200
 

What processor(s), single or dual, are you running, and how much memory do you have, and what speed is that memory running?
 

Sorry, I should have probably provided full system specs

2.8GHz octo-core
2GB of 800MHz RAM
Running Vista 32-bit Ultimate
8800GT card with latest drivers
Running the Crysis demo


Different game: I ran the UT3 beta/demo. I didn't get a formal fps count, but it felt smooth on every map I played, and by smooth I'm guessing well over 30fps at 1920x1200 with every setting cranked. That means chances are you can play any of the UE3 blockbusters really well (BioShock, Mass Effect, etc.).
 

Hmm, that's disappointing, but I'll assume that the final game along with 1.1/2 and latest nvidia drivers will provide better results.
 
Hmmm, on my 8800GT I was averaging around 50fps or so on Very High, but at a low resolution of 1024x768 on a 21" CRT (it's a quad-core PC running Vista 64).

If I remember right, bumping it up to 1280x1024 lowered the framerate by about 10fps. I can try tonight and see what I get at different resolutions, but I have no widescreen displays so I'll have to try 4:3.
 
I'll try a significantly lower resolution later on. I like running games at native resolution (I'd rather knock detail down) for some reason, but I'll see what's up on lower ones.
 
Ya, if I had an LCD I'd feel the same as you do, but since I'm still using my old-school CRT monitors (they still work great), there is no 'native' resolution and no scaling issues. 1024 looks great on Very High, by the way :)

 

Run it with SHADERS and SHADOWS set to medium, everything else set to high. It's playable this way on my MacBook Pro w/ 8600M GT (256MB VRAM), so I'm sure you'll have no problem with an 8800GT.
 

Yeesh, I guess different people have different thresholds for perceiving frame drops. I installed Crysis last night on my Pro (10GB RAM, 8800GT, Vista 64-bit). Everything on Very High was totally unplayable. I settled on everything on High, except for shaders, shadows, and physics set to Medium, no AA, 1920x1200. I didn't turn on the FPS display, but with those settings the motion blur when you whip the mouse around quickly is super smooth.
I couldn't see much of a quality difference from reducing shaders and shadows. Actually, I thought it looked a bit better!
 
RAM Setup

:D

Is anyone here running either an 8x1GB or a 4x2GB RAM setup? From what I've read so far, if the DIMMs are distributed evenly across both risers there is a performance advantage, because the Mac Pro's RAM runs in something similar to dual-channel mode on a PC. I haven't been able to confirm this, but it would be good to know.

:D
 

The Mac Pro has four memory channels; the optimal config is four DIMMs of whatever capacity.
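The reason populating all the channels helps can be sketched with a toy interleaving model (this is illustrative, not the Mac Pro's actual address mapping): with DIMMs on every channel, consecutive cache lines land on different channels, so sequential accesses proceed in parallel.

```python
# Toy sketch of cache-line interleaving across memory channels
# (illustrative mapping, not the real Mac Pro memory map).
CACHE_LINE = 64  # bytes per cache line

def channel_for(address, populated_channels):
    """Which channel serves a physical address, with simple round-robin
    interleaving across however many channels have DIMMs installed."""
    return (address // CACHE_LINE) % populated_channels

# Four consecutive cache lines, all four channels populated: round-robin.
print([channel_for(a, 4) for a in range(0, 256, 64)])  # [0, 1, 2, 3]
# Same lines with only two channels populated: two channels share the load.
print([channel_for(a, 2) for a in range(0, 256, 64)])  # [0, 1, 0, 1]
```

With four channels busy at once, a sequential sweep can pull roughly four lines in flight instead of two, which is where the "spread DIMMs evenly across the risers" advice comes from.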
 
Well

A single 8800GT is far from enough to crank Crysis up to Very High in DX10. If you are getting high FPS you are probably running XP, and then Crysis is not so different from any other game out there.
 

Are you sure you're using the latest drivers? Did you install "Leopard Graphics Update 1.0"? This is a separate update after the 10.5.2 update. You might want to run Software Update one more time if you didn't install Leopard Graphics Update 1.0 yet.
 