There is nothing inherent to the 2009 CPUs that keeps them from being upgraded. Unless Apple soldered the CPUs to the daughterboard (which I doubt), there is no reason you can't drop in another CPU. You will most likely piss away your warranty. But other than Apple not wanting aftermarket socket stuffing, there is nothing inherently prohibiting that.

Putting the CPUs and memory on a daughtercard is Apple's idea (one that other folks have used in other contexts for similar reasons). It isn't inherent in the CPU/memory-in-a-single-package design. Apple did it to make memory upgrades conceptually easier: pull the daughtercard out and there are zero obstructions to getting at the memory slots.
That does raise the cost of the finished product, and the same accessibility could also have been had by making the box taller. It is a well-reasoned compromise that holds the height down and handles the width in a more clever manner while still maintaining clean accessibility inside.

Ah, so the socket is essentially untouched within the Xeon family? Does this mean, supposedly, that in 2011 when my 2008 Mac Pro's AppleCare expires, I could simply drop in a new processor and "upgrade" the system (minus the other benefits to a completely new system update)?
 
When CPU designers are designing the chip, all we think about is the maximum speed. We simulate everything to that cycle time.

That was the old days. Now CPU customers are telling the vendors that they need to conserve power. Any customer who is running a major datacenter and looking at the HVAC bill cares a lot about power.

Similarly, large numbers of deployed servers run at lower than 50% utilization rates. That's one reason why VMware and virtualization have taken off so well. Again, the model of "running at 100%" isn't a real-world design point.
 
Ah, so the socket is essentially untouched within the Xeon family? Does this mean, supposedly, that in 2011 when my 2008 Mac Pro's AppleCare expires, I could simply drop in a new processor and "upgrade" the system (minus the other benefits to a completely new system update)?
The Mac Pro (Early 2009) is socket compatible with Gulftown. LGA 1366 is going to be long-lived, from November 2008 until at least 2011.
 
So you're a sysadmin at a media company. Then I was right (except the street address).

No, I am an Infrastructure Engineer. I design the infrastructure. We have over 6,500 machines in a heterogeneous environment. My job here is to make it all work with our Windows environment. On that note, if I were a sysadmin I would have said that. Media company? We own a few of them.
 
Why does Apple have to build a screen? There is no incentive for Apple to compete with other vendors for separate screens. There is a standard connector (so they can't really make it proprietary, beyond going with the annoying Mini DisplayPort... which will be an adopted/deployed standard soon).

Between vendors like Eizo (or LaCie) and top-end NEC (SpectraView II) on the high-end color-corrected front, and top-end Dell, HP, and Samsung prosumer IPS panels on the other, you can't find a monitor whose output meets or exceeds Apple's stuff? It may not be exactly color coordinated with Apple's enclosure looks, but the images coming off the screen will be there. Moreover, the exact same monitors are needed in the much larger overall PC/workstation market. The same vendors who fill that space for Windows/Linux are very likely going to be able to fill that space for the Mac Pro. The connectors are standard. Apple doesn't really sell integrated color correction (so some vendor needs to supply those sensors also).

This is very similar to the disappearance of the Xserve RAID. A very nice box, but all of the connections were standard and there are other quality players out there who have a much bigger market to leverage than just the Mac market.

Additionally, Apple doesn't have to compete on monitors that come enclosed with the rest of the computer. Why compete if you don't have to?

Apple will still sell the 24" display, since it is in part a "docking station" with its MagSafe, USB, and Mini DisplayPort three-headed hydra connection solution (none of which is needed for a Mac Pro).

It has been, what, 3 years since an upper-level Cinema Display? Do you think they needed that much time to do one if they wanted to? The 30" Display, as McCoy would put it: "He's dead, Jim."

Clearly you do not understand the mind of the typical Apple customer. Any monitor other than Apple's on your desk is blasphemy. It does not matter if it is better and cheaper than the Apple monitor. :D
 
I want your perspective on the following. I've been guaranteed 2.66 GHz from my Core i5 750 at full load on all four cores. Under certain occasions, depending on how those four cores are loaded, it can boost to 2.8 GHz. (Given recent BIOS updates I can force that x21 multiplier 100% of the time regardless.)

Should Intel market my processor at 2.66 GHz or 2.8 GHz?

This is disregarding the usual load balancing and Turbo Boost to increase performance for single/dual threaded applications. (3.2 GHz for those tasks.)

2.66. This illustrates how things changed. It used to be that the thing would run at 3.2 all the time, and drop down to 2.66 (or lower) when it could get away with it. Now if you run at 3.2 all the time the processor melts down.
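
For what it's worth, those numbers fall straight out of base clock x multiplier. Here is a quick C sketch, assuming the usual ~133 MHz BCLK for the i5 750 and the x20/x21/x24 multipliers mentioned above (illustrative only, not a spec sheet):

#include <stdio.h>

/* Rough sketch: advertised clocks as BCLK x multiplier for an i5 750.
 * The 133.33 MHz BCLK and the multiplier set are assumptions taken
 * from the discussion above, not measured values. */
int main(void) {
    const double bclk_mhz = 133.33;
    const int mult[]      = { 20, 21, 24 };
    const char *label[]   = { "base (all cores)",
                              "all-core Turbo",
                              "1-2 core Turbo" };

    for (int i = 0; i < 3; i++) {
        printf("%-18s x%2d -> %.2f GHz\n",
               label[i], mult[i], bclk_mhz * mult[i] / 1000.0);
    }
    return 0;
}

That prints roughly 2.67, 2.80, and 3.20 GHz, which lines up with the base, all-core boost, and single/dual-core Turbo figures quoted above.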
 
I don't know. I agree with you that it's been a few years (especially for the Cinema Displays, it's been since 2003!), but the design is flawless! Apple has developed an impressive system built around the logic/motherboard and its components. The seamless design and access for the RAM, internal hard drives, fans, and cooling system - it is quite impressive. As for cosmetics, maybe a few differences here and there, but normally Apple makes changes to its design when they benefit function, not just form. The perforated aluminum enclosure is utilized primarily for its cooling factor(s). However, using a little black on the shell would be a nice tie-in to the black utilized on the iMac and MacBook lines.

Hi,

The Pro case is still a beauty.

On the other hand, the mini being designed to accommodate 2.5" HDDs rather than 3.5" HDDs is a significant bottleneck that didn't need to happen. I'm sure it was a deliberate design choice to encourage iMac purchases.

s.
 
2.66. This illustrates how things changed. It used to be that the thing would run at 3.2 all the time, and drop down to 2.66 (or lower) when it could get away with it. Now if you run at 3.2 all the time the processor melts down.
The world is a better place with idle clocks and voltages with performance on demand.

My processor can handle 2.8 GHz 24/7 but the box isn't labeled as such.
 
You are correct here.

No he's not. Most things that do rendering are highly scalable and directly benefit from more cores, and rendering tends to be one of those things that people do on workstations.

--Eric
 
2.66. This illustrates how things changed. It used to be that the thing would run at 3.2 all the time, and drop down to 2.66 (or lower) when it could get away with it. Now if you run at 3.2 all the time the processor melts down.

That is a very interesting viewpoint. I always thought (at least in the Core 2 days) that Intel was just being conservative with their clocks. It seemed like almost every chip they put out could easily run at 4 GHz. It was like they didn't need that much speed to beat the competition. So what you are really saying is that they can't produce enough chips at those faster speeds (even with Core iX)? And that Eidorian is going to burn up his CPU running it at the "Turbo Boost" speed all the time?
 
Close!

I design, maintain, and manage a large Macintosh environment for a major conglomerate (entertainment division).

So I essentially take out the trash all day.

I am curious what a typical large Macintosh environment looks like. Clearly Mac Pros and iMacs are used as workstations, but what about the server part?
 
That is a very interesting viewpoint. I always thought (at least in the Core 2 days) that Intel was just being conservative with their clocks. It seemed like almost every chip they put out could easily run at 4 GHz. It was like they didn't need that much speed to beat the competition. So what you are really saying is that they can't produce enough chips at those faster speeds (even with Core iX)? And that Eidorian is going to burn up his CPU running it at the "Turbo Boost" speed all the time?

If they could produce enough chips at a higher speed to sell them, they would. The speed distribution forms a bell curve: most parts land in the middle speeds, and a few work at very high speed. But Power = C*V^2*f (and f is also a function of V). So increasing the frequency at least linearly increases power consumption (more than linearly if the voltage is increased to accomplish the speed increase). And temperature is a linear function of power. Electromigration and hot-carrier effects also increase as voltage is increased (and, for certain effects, when frequency is increased). So even if they could sell 5 GHz chips, it would be ill-advised for the general consumer.
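
To put rough numbers on that P = C*V^2*f relationship, here is a small C sketch. The capacitance and voltage figures are made up purely to show the scaling, not measurements from any real chip:

#include <stdio.h>

/* Dynamic power: P = C * V^2 * f.  The numbers below are invented for
 * illustration; the point is the scaling, not the absolute watts. */
static double dyn_power(double c_farads, double v_volts, double f_hz) {
    return c_farads * v_volts * v_volts * f_hz;
}

int main(void) {
    double c = 1.0e-9;                              /* assumed switched capacitance */
    double p_base  = dyn_power(c, 1.20, 2.66e9);    /* stock clock and voltage      */
    double p_fast  = dyn_power(c, 1.20, 3.20e9);    /* higher f, same V             */
    double p_volts = dyn_power(c, 1.35, 3.20e9);    /* higher f and higher V        */

    printf("baseline           : %.2f W\n", p_base);
    printf("+20%% frequency     : %.2f W (x%.2f)\n", p_fast,  p_fast  / p_base);
    printf("+20%% f, +12%% V     : %.2f W (x%.2f)\n", p_volts, p_volts / p_base);
    return 0;
}

Bumping only the frequency by 20% costs about 20% more power, but once you also have to raise the voltage to make that frequency stable, power jumps by roughly 50% in this toy example - which is exactly why the top bins are so hot and so rare.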
 
I am curious what a typical large Macintosh environment looks like. Clearly Mac Pros and iMacs are used as workstations, but what about the server part?

We have a large mixture.

Mac Pros, MacBook Pros, iMacs, and Mac minis. The bulk of our machines are MacBook Pros, though. We have a bunch of Mac Pros we use for Avid and broadcast, but those are on a separate VLAN for security purposes. For servers, we have Final Cut Servers and Avid servers. For everything else we use NetApp appliances and Windows servers, utilizing ExtremeZ-IP for AFP/SMB sharing and Centrify for AD integration, etc.
 
Sitting here with my two-year-old top-of-the-line 8-core Xeon, I suddenly miss the days of DayStar and the ability to buy a top-of-the-line CPU plug-in board for an old Mac... sigh.
 
Logic is a good example. It will use as many cores as are available.

That's not the case with Logic; it is limited to 8 cores (meaning 8 full cores, or even an i7 with four full cores and four Hyper-Threading cores).

Logic SHOULD be able to use 16 cores on the current MP and 24 on the upcoming machines, but it will take a software update for that. I'm baffled why they still haven't updated the software to better take advantage of the machines that shipped a year ago.
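
If you want to see how those 8/16/24 figures break down on your own machine, here is a quick macOS-specific C sketch. It leans on the hw.physicalcpu / hw.logicalcpu sysctl keys, which is an assumption about your OS rather than anything Logic itself exposes:

#include <stdio.h>
#include <sys/types.h>
#include <sys/sysctl.h>

/* Print physical vs. logical (Hyper-Threading) core counts on macOS. */
static int get_sysctl_int(const char *name) {
    int value = 0;
    size_t len = sizeof(value);
    if (sysctlbyname(name, &value, &len, NULL, 0) != 0)
        return -1;                     /* key not available on this system */
    return value;
}

int main(void) {
    printf("physical cores: %d\n", get_sysctl_int("hw.physicalcpu"));
    printf("logical cores : %d\n", get_sysctl_int("hw.logicalcpu"));
    return 0;
}

On a current octo Mac Pro that would report 8 physical and 16 logical cores - the 16 that Logic currently leaves half-used.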
 
Ah, so the socket is essentially untouched within the Xeon family? Does this mean, supposedly, that in 2011 when my 2008 Mac Pro's AppleCare expires, I could simply drop in a new processor and "upgrade" the system (minus the other benefits to a completely new system update)?

There is an assembly that plugs in: CPU + heatsink. If you find a CPU + heatsink that meets Apple's design criteria, theoretically it will work. I suspect in several cases, though, you will cook your Mac Pro because it won't.

http://www.anandtech.com/mac/showdoc.aspx?i=3597&p=10


I suspect a decent number of folks will fry their Pros trying to extend their life on the cheap. There is no widely supported Mac Pro CPU upgrade aftermarket. You may find a "how to" and maybe even a "kit" that someone sells. However, by and large you'll be mucking around with your Mac Pro on your own. If you have a good outcome, great. However, that is not what most folks are looking for on a $2.5K+ machine.


P.S. Also note that Apple will still do service after AppleCare expires. AppleCare just determines whether you pay a big bill or not. If Apple (or an authorized repair shop) finds you've glued in your own CPU upgrade, they may tell you they won't touch your Mac Pro with a ten-foot pole. You are then in "do it yourself" mode for everything inside the box.
 
There is nothing inherent to the 2009 CPUs that keeps them from being upgraded. Unless Apple soldered the CPUs to the daughterboard (which I doubt), there is no reason you can't drop in another CPU. You will most likely piss away your warranty. But other than Apple not wanting aftermarket socket stuffing, there is nothing inherently prohibiting that.

The Mac Pro (Early 2009) is socket compatible with Gulftown. LGA 1366 is going to be long-lived, from November 2008 until at least 2011.

You also need BIOS/EFI support for the CPU - if Apple's firmware doesn't support the hexacore, then dropping a couple in turns a Mac Pro into a boat anchor.

Even if the firmware would work, it might have a CPU ID check - and might halt if the ID doesn't match a known supported CPU.
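
Just to illustrate what that kind of check looks like, here is a rough user-space C sketch using the CPUID instruction via GCC/Clang's <cpuid.h>. The family/model whitelist is hypothetical - real firmware does this long before any OS is running:

#include <stdio.h>
#include <cpuid.h>   /* GCC/Clang helper for the CPUID instruction */

int main(void) {
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        printf("CPUID leaf 1 not supported\n");
        return 1;
    }

    /* Decode family/model the way firmware would before deciding
     * whether it knows how to drive this CPU. */
    unsigned int family = (eax >> 8) & 0xF;
    unsigned int model  = ((eax >> 4) & 0xF) | (((eax >> 16) & 0xF) << 4);

    printf("family 0x%X, model 0x%X\n", family, model);

    /* Hypothetical whitelist: e.g. Nehalem (0x1A) and Gulftown/Westmere-EP (0x2C). */
    if (family == 0x6 && (model == 0x1A || model == 0x2C)) {
        printf("known CPU - would continue boot\n");
    } else {
        printf("unknown CPU - firmware might halt here\n");
    }
    return 0;
}

Whether Apple's EFI actually hard-stops on an unknown ID or just runs with conservative defaults is anyone's guess, which is exactly the gamble an aftermarket CPU swap takes.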
 
I would say either Xserve or another server running Linux. Although he did say he mixes in Windows, so there might be some Windows servers in the mix also.

Yeah, I gave a brief rundown in my previous post. If you want more in-depth stuff you can PM me.
 
Fair enough, but aren't we moving away from or redefining the definition of "computer" in the first place?
....snip
No, PDAs are becoming more popular. Before everyone starts whining about me using a "90s" buzzword, let's actually look at it: Personal Digital Assistant. IOW, technology to help get your personal crap done. The 2000s have added phone, entertainment, surfing to the concept.

This is what smartphones are today, a PDA. This is what the iPad is. This is all the use many laptops get.

Computers are where work gets done by many, many people. Those that only need calendars and phones to get all their work done may fit into your new definition of PDA as computer. But others do not, and never will. Quit talking about sales people and CEOs as if they are all that exist.
 