If they could produce enough chips at a higher speed to sell them, they would. The speed distribution forms a bell curve: most parts land in the middle speeds, and a few work at very high speed. But dynamic power is P = C·V²·f (and f is also a function of V). So increasing the frequency increases power consumption at least linearly, and more than linearly if the voltage is raised to reach the higher speed. And temperature is a linear function of power. Electromigration and hot-carrier effects also increase as voltage is increased (and, for certain effects, as frequency is increased). So even if they could sell 5 GHz chips, it would be ill-advised for the general consumer.
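To put rough numbers on it, here's a back-of-the-envelope sketch; the 20% and 10% figures are purely illustrative assumptions, not specs for any particular chip:

```swift
// Dynamic power scales as P = C * V^2 * f.
// Relative power for an overclock that raises f by 20% and V by 10%
// (both figures are made-up, purely illustrative numbers):
let freqScale = 1.20          // +20% clock
let voltScale = 1.10          // +10% core voltage to keep it stable
let powerScale = voltScale * voltScale * freqScale
print(powerScale)             // 1.452 -> roughly 45% more power (and heat)
```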
Efficiency is a fun thing to explore. I can hit 3.48 GHz on stock voltage, though many power-saving mechanisms no longer come into play. I don't like the noise either.

You also need BIOS/EFI support for the CPU - if Apple's firmware doesn't support the hexacore, then dropping a couple in turns a Mac Pro into a boat anchor.

Even if the firmware would work, it might have a CPU ID check - and might halt if the ID doesn't match a known supported CPU.
Why would Apple update the BIOS/EFI when they can just release a new Mac Pro? :rolleyes:
 
No, PDAs are becoming more popular. Before everyone starts whining about me using a "90s" buzzword, let's actually look at it: Personal Digital Assistant. IOW, technology to help get your personal crap done. The 2000s have added phone, entertainment, and surfing to the concept.

This is what smartphones are today, a PDA. This is what the iPad is. This is all the use many laptops get.

Computers are where work gets done by many, many people. Those that only need calendars and phones to get all their work done may fit into your new definition of PDA as computer. But others do not, and never will. Quit talking about sales people and CEOs as if they are all that exist.

I agree, and given that a vast number of people sit at a PC doing nothing but mail, surfing, and the occasional photo gallery (in between fighting their anti-virus software, of course), I am sure the iPad will, over time, dramatically eat into that market.
 
Because they want you to get maximum value from your Mac Pro purchase by extending its lifetime? :rolleyes:
Brace yourself for Bloomfield Mac Pros on the Marketplace. :rolleyes: With 4-core Gulftown parts available, I wonder how much longer Intel is going to offer the older 45 nm parts.
 
Clearly you do not understand the mind of the typical Apple customer. Any monitor other than an Apple one on your desk is blasphemy. It does not matter if it is better and cheaper than an Apple monitor. :D

I know you are joking, but that is the iFad customer base, not the professional one. When someone cares more about what is on the bezel and the logo on the device than what the display output is, that isn't particularly professional. [I know there are folks who only like to buy from one vendor, "one throat to choke"; that isn't necessarily professional either.]

It's also the case that a Mac Pro is likely not on the desk with the monitor. All the more reason why they don't necessarily have to match.

There are a limited number of vendors who can survive in the high-end monitor space. It doesn't really improve the market over the long term if Apple forces them to share an even smaller piece of an already small pie. There are plenty of other Apple devices to assign their display engineers to while still working in the upper half of the quality spectrum of the overall display space.

The same logic (you can only buy Apple accessories) would mean that Apple made everything (external webcams, external hard drives, video cards, etc.). That would only lead to focus problems.
 
No he's not. Most things that do rendering are highly scalable and directly benefit from more cores, and rendering tends to be one of those things that people do on workstations.

--Eric

Well he asked if something would be 50% or even 20% faster by adding 4 more cores (going from 8-12). I doubt anything would speed up that much except very particular applications.

Of course there are specific applications where more cores help. Rendering is one of those; lots of server-specific tasks are others. More cores do not automatically equal better performance for all applications, though. First, the applications you're using need to be solving problems that are parallelizable. Second, even when you have such a problem, you still have to deal with the overhead of managing communication between multiple cores operating on multiple sets of data. Eventually the communication overhead can outweigh the benefit of additional cores as diminishing returns kick in.
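To put some rough numbers on the diminishing returns, here's a quick Amdahl's-law sketch; the 90% parallel fraction is an assumed figure for illustration, not a measurement of any real application:

```swift
import Foundation

// Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
// where p is the fraction of the work that actually runs in parallel.
let p = 0.90                               // assume 90% of the job parallelizes
for cores in [1, 4, 8, 12, 16, 32] {
    let speedup = 1.0 / ((1.0 - p) + p / Double(cores))
    print("\(cores) cores -> \(String(format: "%.2f", speedup))x")
}
// 8 cores -> ~4.71x, 12 cores -> ~5.71x: going from 8 to 12 cores buys
// only about 21% here, and the curve keeps flattening as core counts grow.
```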

Rendering in particular seems like it would need a lot of data sent to and from the processor(s). Adding more cores in this case without also increasing memory throughput could actually decrease performance as the cores fight for control of the bus.

Technologies like Grand Central Dispatch aim to make it easier for programmers to use more cores, but they still need a problem that is able to run in parallel to begin with.
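For what it's worth, here's a minimal sketch of that idea using GCD's concurrentPerform (the per-pixel math is just a hypothetical stand-in for a real render kernel):

```swift
import Foundation

// GCD only helps when the problem itself splits into independent chunks;
// here each row of an image can be shaded on its own.
let width = 1_920
let height = 1_080
var pixels = [Float](repeating: 0, count: width * height)

pixels.withUnsafeMutableBufferPointer { buffer in
    // concurrentPerform (dispatch_apply) farms the iterations out across
    // available cores and returns only after all of them have finished.
    DispatchQueue.concurrentPerform(iterations: height) { row in
        for x in 0..<width {
            // Hypothetical stand-in for real per-pixel shading work.
            buffer[row * width + x] = sinf(Float(x) * Float(row))
        }
    }
}
```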

Adding more cores can also help with running more than one application, although I would argue that we're already getting close to the number of cores average people need to put most apps on their own core. It would be interesting to see any user studies on the number of applications the typical user has open at any one time.
 
Why would Apple update the BIOS/EFI when they can just release a new Mac Pro? :rolleyes:

Apple isn't going to release any EFI update that supports aftermarket CPU upgrades.
Your Mac Pro "upgrade" path is going to be largely limited to what the build-to-order options were on the Mac Pro when you bought it (and any BTO updates Apple rolls out later). Similar to how the "better supported" hackintosh market is largely limited to implementations that use essentially the same subset of parts that Apple uses. Other parts might "happen to work", but you are always just rolling the dice.
 
Apple needs to drop the prices on the current Mac Pros and put more resources into actually implementing the new technology properly. Apple always seems to be playing catch-up and wrapping it up in a pretty bow.

...or make a mid-range tower like everyone else.

I just hope that Apple Consumer Electronics will.
 
I realize most people are still wiping their tears of happiness after the launch of the huge-iPhone iPad... but seriously, Apple, get on with the hardware updates already.


I agree. If you think Apple will distract the masses with a Mac hardware refresh before that giant iPod touch comes out, you have another thing coming. I think they'll refresh the line in the 2nd or 3rd week of April, especially since Macs are apparently still selling through the roof :confused:
 
I would also like to see eSATA added to the Mac Pro as a standard connection. Just one port.... that's all I ask.... right now at least.

USB 3.0 would be a far more forward looking single port addition. eSATA isn't going to die overnight, but only has modestly more forward momentum than FW800 does.

USB 3.0 is plug-and-play in the standard default configuration (not in the "happens to have vendor support" sense). Which is going to be a larger support problem for Apple to roll out: eSATA or USB 3.0? Most likely eSATA. That is likely a determining factor.

USB 3.0 is bleeding edge (so not tons of devices yet), but the top-level, expandable Mac has often gotten the bleeding-edge tech first. Gotta get corporate experience with it starting somewhere.
 
...or make a mid-range tower like everyone else.

I just hope that Apple Consumer Electronics will.

Personally, I feel that if Apple refuses to make a mid-range tower, they should just kill the mini as well. iMac and Mac Pro only, or else have a Mac Mini and a Mac Pro Mini.
 
Well he asked if something would be 50% or even 20% faster by adding 4 more cores (going from 8-12). I doubt anything would speed up that much except very particular applications.

However, there is more to life than running a single application at a time. If there is a longer-term "batch job" computation that needs to happen very often, you can increase productivity by moving on to another task that has a different set of computations. If you have "maxed out" your Mac Pro with the batch job, you can't really do anything else. If you haven't, then you can timeslice in another task.

Likewise, archiving, software RAID, etc. can also float along in parallel.

If you look extremely narrowly, yes, parallelism often peters out. However, multitasking is another form of parallelism. That sort is often much more conducive to SMT (or Hyper-Threading, as Intel wants to relabel it) than more homogeneous threads all doing the exact same kind of computation with the exact same kind of data.
 
Well he asked if something would be 50% or even 20% faster by adding 4 more cores (going from 8-12). I doubt anything would speed up that much except very particular applications.

In other words, nothing speeds up by that much except stuff that does. That would include a vast array of scientific and engineering software, among other things.
 
Apple isn't going to release any EFI update that supports aftermarket CPU upgrades.
Your Mac Pro "upgrade" path is going to be largely limited to what the build-to-order options were on the Mac Pro when you bought it (and any BTO updates Apple rolls out later). Similar to how the "better supported" hackintosh market is largely limited to implementations that use essentially the same subset of parts that Apple uses. Other parts might "happen to work", but you are always just rolling the dice.

Damn, that sucks. I really wanted to upgrade to the 6-core CPUs. :(
 
...or make a mid-range tower like everyone else.

I just hope that Apple Consumer Electronics will.

Nope. I (and many others) have floated the idea of a Mac gaming machine for years - fast RAM, water cooling, RAID 0 disks, and spankin' graphics. Bigger than a mini, smaller than a Pro, not using Intel server chips (Xeons) or ECC RAM, but using the fast, hot, max-clocked versions of their consumer chips.

Seemed to me that there was a market there. Apple, however, disagrees, and since they've been more successful than anyone could have imagined, I'm inclined to trust their judgement over mine.
 
The additional cores are "generally helpful" since the sheer number of running threads these days is large. My takeaway as to the primary benefits is the shrink to 32 nm dies and the resulting thermal and clock-speed headroom. That is where most of the "60%" performance boost comes from. It also lets such advanced technology fit in a thermally constrained "quiet enclosure" and makes possible more than 2 cores per socket in the future in some cases, perhaps a 1U or 2U server module with real fans.

Rocketman
 
However, there is more to life than running a single application at a time. If there is longer term "batch job" computation that needs to happen very often you can increase productivity by moving on to another task that has a different set of computations. If you have "maxed out" your Mac Pro with the batch job you can't really do anything else. If you haven't then you can timeslice in another task.

Likewise, archiving, software RAID, etc. can also float along in parallel.

If you look extremely narrowly, yes, parallelism often peters out. However, multitasking is another form of parallelism. That sort is often much more conducive to SMT (or Hyper-Threading, as Intel wants to relabel it) than more homogeneous threads all doing the exact same kind of computation with the exact same kind of data.

Actually, I didn't look at anything narrowly, and I mentioned running more than one application in the post that you quoted.

In other words, nothing speeds up by that much except stuff that does. That would include a vast array of scientific and engineering software, among other things.

The problem is that nothing is linear and everything has an exception. It's close to impossible to speak in absolutes when it comes to technology.

Adding more cores might help, assuming the application was written to handle it, and the application is processor-bound, and it isn't already saturating the memory bus, and it isn't already saturating the other I/O subsystems... well, you get the point. More generic processor cores are not the be-all and end-all of computer performance, no matter how much Intel would like everyone to think so. They too will hit a point of diminishing returns where adding more simply will not improve performance enough to be worth the hassle.

From what I've been reading and what friends have been telling me, if you really want to boost your computer's performance right now, go buy an SSD. They really make a difference in the feel of the computer because no matter how fast all the other components have gotten, going to disk is still the slowest operation by orders of magnitude.
 
The problem is that nothing is linear and everything has an exception. It's close to impossible to speak in absolutes when it comes to technology.

Adding more cores might help, assuming the application was written to handle it, and the application is processor-bound, and it isn't already saturating the memory bus, and it isn't already saturating the other I/O subsystems... well, you get the point. More generic processor cores are not the be-all and end-all of computer performance, no matter how much Intel would like everyone to think so. They too will hit a point of diminishing returns where adding more simply will not improve performance enough to be worth the hassle.

From what I've been reading and what friends have been telling me, if you really want to boost your computer's performance right now, go buy an SSD. They really make a difference in the feel of the computer because no matter how fast all the other components have gotten, going to disk is still the slowest operation by orders of magnitude.

Actually, lots of things are linear. And not everything has an exception. But that's neither here nor there.

Many engineering and science problems are naturally broken into parallel streams (finite element analysis, place and route, SPICE, static timing analysis, static power analysis, logical verification, simulations, etc.) and will always benefit from more physical cores so long as there is sufficient RAM per core. An SSD provides little benefit to these types of problems as they are strictly compute-bound. When I worked at AMD, nearly every piece of software we used for designing CPUs was designed to use farms of thousands of CPUs. Having more CPUs in a single box speeds up any of those types of problems in an essentially linear manner (not quite linear unless the number of parallel problems is an exact multiple of the number of available CPUs).
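That "not quite linear" remainder effect is easy to see with a little arithmetic; the job and core counts below are just made-up illustrative values:

```swift
// For N independent, equal-sized jobs on C cores, wall-clock time is
// ceil(N / C) "rounds", so the effective speedup is N / ceil(N / C).
func speedup(jobs: Int, cores: Int) -> Double {
    let rounds = (jobs + cores - 1) / cores        // integer ceiling division
    return Double(jobs) / Double(rounds)
}
print(speedup(jobs: 24, cores: 12))   // 12.0  -> exact multiple, perfectly linear
print(speedup(jobs: 26, cores: 12))   // ~8.67 -> two leftover jobs cost a whole extra round
```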
 
I don't think so... although recently it has been reported that the new socket for Sandy Bridge B2 will be 1356 rather than 136x.
It looks like the current Socket B (LGA 1366) and Socket H (LGA 1156) are going to see some minor updates in their B2/H2 successors. I'm not expecting Sandy Bridge processor compatibility with current sockets, though.

Intel's 6 Series feels a lot like the minor LGA 775 platform update that the 4 Series was.
 