Go for the 3 GHz, it sounds better when you say it and it looks even better on paper...;)
 
So, ummmm what exactly is the point of the 3.2 then, if the difference isn't that noticeable?
 
So, ummmm what exactly is the point of the 3.2 then, if the difference isn't that noticeable?

Serious professional number-crunching.
Being the "I have the best that's out there" person.
Mac Pro Newbie with more money than sense who sees a bigger number and figures they'd better spend the extra to get the most out of the machine (this was me, almost... minus the "more money than sense" bit... I have little of either :p)
Other?
 
If you primarily do lots of CPU-bound work, then the 200MHz difference could pay for itself in productivity. That's roughly 7% more work done per day, which is significant if your paycheck depends on how much work gets done.
 
In previous Intel CPU families, faster CPUs sometimes had more features or supported faster memory and front-side buses and such.

All three Mac Pro CPUs are identical other than clock speed and power draw/heat generation. They all have 1600MHz FSBs, they all have 12MB of combined cache, and they all use the same 45nm production process.

So the only real decision point is how fast each one does the work you most do.
 
Well, thanks guys. You've sold me LOL:D I'm going for the 3.0... and the 30".... and yeah, I'll still wait for MW, after all it's only 6 days now so I might as well, just in case there's any change to pricing or a surprise improvement to the displays.

Thanks for all your advice, I appreciate it.

Hey, have you noticed how chilled everyone is, now it's actually out?:D
 
Well, thanks guys. You've sold me LOL:D I'm going for the 3.0... and the 30".... and yeah, I'll still wait for MW, after all it's only 6 days now so I might as well, just in case there's any change to pricing or a surprise improvement to the displays.

Thanks for all your advice, I appreciate it.

Hey, have you noticed how chilled everyone is, now it's actually out?:D

It's a weight off my chest as it almost cements the fact that they're making room for something else at MWSF :D
 
Semantically, 7% is the more accurate figure, since the machines are 3.0 and 3.2 GHz, not 24.0 and 25.6 GHz, even if you argue that 24 vs. 25.6 still works out to 7%.
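For what it's worth, the ratio comes out the same either way; a quick sanity check (not a benchmark, and real workloads rarely scale perfectly with clock):

```python
# Clock-speed ratio for the two upgrade options.
base_ghz = 3.0
fast_ghz = 3.2

speedup = fast_ghz / base_ghz - 1   # ~0.0667, i.e. the "7%" figure rounded
print(f"clock advantage: {speedup:.1%}")

# The ratio is identical whether you count one core or all eight:
eight_core_ratio = (8 * fast_ghz) / (8 * base_ghz) - 1
```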
 
So, ummmm what exactly is the point of the 3.2 then, if the difference isn't that noticeable?

There are definitely people who can use that extra processing power, and for whom it even makes financial sense. Apple's large overhead on top of Intel's already-steep premium for that extra performance probably makes it less appealing, though.
 
Who, Pixar?

Until renders are instant, more power can always be used. Whether the upgrade is worth it depends on how much time you are wasting waiting for renders and how best to cut that time down. Faster processors will definitely help with this for a number of users in many fields.

In the end, there is no reason not to offer faster processors. They aren't going to be a high-demand item (then again, neither is the Mac Pro), but it is pure profit for Apple, not to mention that it helps their image of having the fastest pro systems.
 
Until renders are instant, more power can always be used. Whether the upgrade is worth it depends on how much time you are wasting waiting for renders and how best to cut that time down.

Well, this was my point really. Say I'm rendering a low-end 3D image that currently takes 3 hours on my G4 (as an example; I've also had renders that took 36 hours, and even one massive render that took 7 days).
On a 3.0 MP, I'd expect those times to be cut considerably, not by just a few minutes. But if I were to render them on a 3.2, would I see a vast improvement over the 3.0? Probably no more than a few extra minutes saved, rather than hours. While it all adds up, and any saving of time is good, it still looks like I can't justify the expense of a 3.2 when I'm not working to the same budgets and deadlines as people like Pixar.
 
Here's another thought: Downgrade to the SP 2.8 GHz model and use the savings toward an upgrade to CS3, which is Intel native.
 
Wait until after the 15th and you will benefit from Geekbench scores for the new Mac Pros. New ACDs are almost certainly going to be released, and at the very least we will see the current prices drop.

I would say, for the sake of not becoming obsolete quickly, that 3.0 GHz and 4GB of RAM (2x1GB from another source, in addition to the factory sticks) should be your minimum.

Go to an Apple Store and use the ACD before purchase. And please do not open credit to buy this!
 
Well, this was my point really. Say I'm rendering a low-end 3D image that currently takes 3 hours on my G4 (as an example; I've also had renders that took 36 hours, and even one massive render that took 7 days).
On a 3.0 MP, I'd expect those times to be cut considerably, not by just a few minutes. But if I were to render them on a 3.2, would I see a vast improvement over the 3.0? Probably no more than a few extra minutes saved, rather than hours. While it all adds up, and any saving of time is good, it still looks like I can't justify the expense of a 3.2 when I'm not working to the same budgets and deadlines as people like Pixar.

Theoretically, if you are doing long renders on a single system, or lots of smaller renders, and are being held up by them, then yes, the time could add up, and it might be worth investing in faster processors than the 2.8GHz base. 3GHz seems to be the sweet spot in relation to performance and overall system cost, and I don't think you will regret it.
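As a rough sketch of why the 3.0-to-3.2 step saves minutes rather than hours: if a render is fully CPU-bound and scales linearly with clock (an optimistic assumption), the saving on a 3-hour job is small:

```python
# Estimate render time at a higher clock, assuming the job is fully
# CPU-bound and scales linearly with clock speed (optimistic).
def scaled_render_hours(hours_now: float, ghz_now: float, ghz_new: float) -> float:
    return hours_now * ghz_now / ghz_new

# A 3-hour render at 3.0 GHz, re-run at 3.2 GHz:
saved_minutes = (3.0 - scaled_render_hours(3.0, 3.0, 3.2)) * 60
print(f"saved: {saved_minutes:.0f} minutes")
```

The same ratio applied to a 36-hour or 7-day job is where the extra 0.2 GHz starts to matter.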
 
Well, this was my point really. Say I'm rendering a low-end 3D image that currently takes 3 hours on my G4 (as an example; I've also had renders that took 36 hours, and even one massive render that took 7 days).

First: I would have been at Kinko's renting an adequate machine before attempting a multi-day render!

Second: Software is only going to become more processor-hungry while you own this machine.
 
First: I would have been at Kinko's renting an adequate machine before attempting a multi-day render!

Actually, we looked into that one :) It just wasn't feasible: they didn't support the software we had, and there were concerns that we'd have to take all the original figures/textures/etc. with us in case the new system complained "can't find xxx.obj" and so on. In the end, it seemed like it would have taken us 6 days to set up at Kinko's just to get the render done in one day, and for the hassle involved (and the cost), it wasn't worth it.
 
There are definitely people who can use that extra processing power, and for whom it even makes financial sense. Apple's large overhead on top of Intel's already-steep premium for that extra performance probably makes it less appealing, though.

The Mac Pros are an exception to the "Apple's large overhead" rule. Apple sells Mac Pros (especially when newly revved) for less than the Dell Precision or other comparable workstations. This is because Dell and friends make almost no margin on a lot of their consumer computers and try to make it up in the high-end space (the gaming computers are also pretty bad), while Apple just has 20% margins on everything (except iPods, where the margins are astronomical).

But yes, you really get rocked for that last 0.2 GHz. Intel does that across the board: even on desktops, the mid-to-high range is surprisingly affordable, then prices jump for the top 2-3 chips. It's worse now that AMD can't compete on the high end. AMD's fastest chip now is a quad-core 2.5 GHz Opteron, and it can't even match Intel clock-for-clock (meaning it loses to a 2.5 GHz Intel chip), much less compete against the 3 GHz models. AMD is flooding the low end, so you wind up (especially on the desktop) with a lot of options sub-$300 (including a dual-core 3.0 for $180-something and a 2.5 GHz quad at $266)**, but no competition outside that.

That said, 7% off a 12-hour rendering job is 50 minutes. Off a day-long rendering job, it's over an hour and a half. That's a big deal.

If you spend half your day compiling (or doing other stuff while waiting on the compile)*, a 7% gain on half your workday is a solid 15 minutes. 15 minutes/day x 5 days/week x 50 weeks/year = 62.5 hours. Call that 60 hours; at $40 an hour ($80k salary for a top developer), that's $2,400. It pays for itself 3 times over in a year.

* = obviously you're not sitting completely idle during the compile, but you're far from doing the job you're most efficient at.

** = I hear the new 2.5 GHz Quad might have been delayed until Feb/March. Whatever. You get my point.
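The payback arithmetic above, spelled out without the intermediate rounding (the 15 minutes/day and $40/hour figures are the assumptions already stated in this post):

```python
# Payback estimate for the faster-CPU upgrade, using the figures above:
# 15 minutes saved per day, 5-day weeks, 50 working weeks, $40/hour.
minutes_saved_per_day = 15
working_days_per_year = 5 * 50

hours_saved_per_year = minutes_saved_per_day * working_days_per_year / 60
dollar_value = hours_saved_per_year * 40   # developer time at $40/hour

print(f"{hours_saved_per_year} hours/year, worth ${dollar_value:.0f}")
# 62.5 hours/year; round that down to 60 hours and you get the $2,400 quoted.
```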
 
I don't have any hard facts to back this up, but just for another point of view: consider that you could buy the 2.8 GHz machine with lots of RAM, and then replace the processors with much faster ones in a couple of years. These are just off-the-shelf Intel CPUs, right? So in the future you could presumably drop in a 4.0 GHz CPU (when they're available and the price comes down), couldn't you?
 
I don't have any hard facts to back this up, but just for another point of view: consider that you could buy the 2.8 GHz machine with lots of RAM, and then replace the processors with much faster ones in a couple of years. These are just off-the-shelf Intel CPUs, right? So in the future you could presumably drop in a 4.0 GHz CPU (when they're available and the price comes down), couldn't you?

I think these Penryns might be the last chips for this socket. With at least some server Nehalem chips (Q4 2008 or Q1 2009), Intel is moving to a point-to-point interconnect and/or an integrated memory controller, either of which would require a new socket.
 
I must be dumb... but can anyone link me to some 800MHz DDR2 ECC FB-DIMMs? The only FB-DIMMs I can find run at 667MHz...
 