How is this even exciting? Minor performance improvements, more RAM (who the hell here needs 96 GB...?), etc.

"16 logical cores" isn't the same as 16 physical cores. Hyperthreading is useful in some situations (hurtful in others), but not that amazing.

As for Blu-Ray around 2015, I doubt it. I see absolutely zero point in any resolution above 1080p. I personally am quite content with 720p, and would probably be okay 90% of the time with uncompressed 480p. This is how I would imagine most consumers are. 4320p just sounds insane to me. That's what, 16x the res of 1080?
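The "16x" figure checks out if you compare total pixel counts (a quick sketch, assuming 4320p means 7680x4320, i.e. 8K / Super Hi-Vision):

```python
# Compare total pixel counts of 4320p (assumed 7680x4320) vs 1080p (1920x1080).
res_1080p = 1920 * 1080
res_4320p = 7680 * 4320
ratio = res_4320p // res_1080p
print(ratio)  # -> 16 (4x the width times 4x the height)
```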

I agree bandwidth won't be on par with it, not by a long shot, I just don't think it'll really ever become widespread. *shrugs*
 
How is this even exciting? Minor performance improvements...

Gainestown itself isn't even minor.

As for Blu-Ray around 2015, I doubt it. I see absolutely zero point in any resolution above 1080p. I personally am quite content with 720p, and would probably be okay 90% of the time with uncompressed 480p. This is how I would imagine most consumers are. 4320p just sounds insane to me. That's what, 16x the res of 1080?

I see zero point of resolution above Super Hi-Vision, simply because that is the practical limit of the human eye to distinguish clarity. We won't see anything larger than Super Hi-Vision in non-JumboTron use... ever.

We'll be milked one more time for an HD upgrade from 1080p, but absolutely nothing above Super Hi-Vision makes any sense whatsoever, so that'll be the end of it.
 
Gainestown itself isn't even minor.

Architecturally no, but performance-wise it's a meh, imo.

I see zero point of resolution above Super Hi-Vision, simply because that is the practical limit of the human eye to distinguish clarity.

I imagine it depends on the human being, but I suppose. Though "Super Hi-Vision" on a 50 foot theater screen might still look better than 1080 for those in the front *shrugs*
 
It seems the release schedule has trickled through.
Courtesy of computerbase.de there is an updated release date.
It's right at the end of Q1 2009: March 30.
 

Don't limit yourself to thinking with today's tech.

There is no reason to believe that LCD / Plasma technology is going to stop.

I imagine that screens will eventually become truly "flat" panels that can be attached to a wall... perhaps 8 ft tall... such that the TV and/or computer can be all over the wall.

Cheaper and bigger has been the trend. It won't stop at 30".
 
Architecturally no, but performance-wise it's a meh, imo.



I imagine it depends on the human being, but I suppose. Though "Super Hi-Vision" on a 50 foot theater screen might still look better than 1080 for those in the front *shrugs*

Clock for clock I get roughly 30% higher fps in the latest x264 build.
That's not minor. Other than scientific applications and hardcore number crunching, HD editing is about the only application scenario I would want more CPU power for.
 
There is no reason to believe that LCD / Plasma technology is going to stop.

I imagine that screens will eventually become truly "flat" panels that can be attached to a wall... perhaps 8 ft tall... such that the TV and/or computer can be all over the wall.

Cheaper and bigger has been the trend. It won't stop at 30".
Last I saw (at CES), LCD and plasma are not dead yet. Manufacturers want to keep selling them so they can continue to profit from the technology, and OLED has been delayed for that reason.

I do expect both size and resolution to become major factors, but more notably for HDTVs, not monitors. Provided they can keep the manufacturing costs realistic, of course. There will be limits. ;)

Personally, I think computer monitors will have limits to usable size. The largest (greater than ~40") would be for conference rooms, I would imagine. Maybe a 37" specialty model for graphics pros. It might be too difficult/uncomfortable to work with if you try to work on anything larger. (Just my opinion, as a 24" can cause me eye strain). ;) :p
 
How is this even exciting? Minor performance improvements, more RAM (who the hell here needs 96 GB...?), etc.

"16 logical cores" isn't the same as 16 physical cores. Hyperthreading is useful in some situations (hurtful in others), but not that amazing.

As for Blu-Ray around 2015, I doubt it. I see absolutely zero point in any resolution above 1080p. I personally am quite content with 720p, and would probably be okay 90% of the time with uncompressed 480p. This is how I would imagine most consumers are. 4320p just sounds insane to me. That's what, 16x the res of 1080?

I agree bandwidth won't be on par with it, not by a long shot, I just don't think it'll really ever become widespread. *shrugs*

Although it shares the same name as the technology found in the Pentium 4, this version of HyperThreading is definitely not the same.

A single Gainestown Xeon is already faster than the current Mac Pro with 8 logical threads.
 
Although it shares the same name as the technology found in the Pentium 4, this version of HyperThreading is definitely not the same.

A single Gainestown Xeon is already faster than the current Mac Pro with 8 logical threads.

Certainly not in the vast majority of cases that use every physical core available. There have been benchmarks of SAP software floating around, but it's not clear whether these are meaningful because of unclear test methodology. Any increased performance beyond what we have witnessed in the desktop arena must stem from the elimination of bottlenecks inherent in current workstation designs (e.g. FSB/memory and so forth).
 
Is the TeslaGLContext stuff remnants of CUDA?

As I understand it, CUDA is a programming interface (API) that allows software on the host computer to tap into the cores on the GPU device to perform parallel computations. In particular, it can be used for floating-point calculations ... useful for graphics computation, scientific calcs, and various numerical analysis problems, but not particularly useful for garden-variety applications. I've also heard that CUDA is not especially easy to code for, which has limited its use.
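To give a feel for the data-parallel model CUDA exposes, here is a toy sketch in plain Python (not real CUDA code): the same tiny "kernel" function runs once per array element, which is exactly the shape of work a GPU can spread across thousands of cores. The function name and the serial loop are illustrative only.

```python
# Toy illustration of the data-parallel model: each "thread" applies the
# same kernel to one element. A GPU would launch all of these in parallel;
# here a plain loop stands in for the hardware.
def saxpy_kernel(i, a, x, y, out):
    # classic SAXPY: out[i] = a * x[i] + y[i], one element per thread
    out[i] = a * x[i] + y[i]

a = 2.0
x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
out = [0.0] * len(x)
for i in range(len(x)):  # serial stand-in for a parallel kernel launch
    saxpy_kernel(i, a, x, y, out)
print(out)  # -> [12.0, 24.0, 36.0]
```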

Tesla is just another form of nVidia's GPU with more cores and no graphics output -- think of it as an extremely parallel plug-in floating-point engine for your machine. Tesla originally supported single-precision f.p. and a newer Tesla now supports double-precision f.p. (desired by the HPC/scientific crowd).

Bottom line: Tesla means nothing for most Mac users. It's not a graphics card (you can't plug a display into it), it's just a highly parallel computation engine for specialized uses.

...and I'd even go so far as to argue that none of the graphics cards past the base one on the Mac Pro (supporting 2560x1600 on dual-link DVI) matter to most Mac Pro users, either. As long as it can drive a high-res 30" display, anything beyond that is wasted on most of us. Some of you will throw rocks at me for saying that, but the reality is that The Emperor Has No Clothes -- graphics performance is so high even on the base video cards that anything beyond that (for most current applications) is wasted on almost all of us except extreme gamers.

So, I don't care what video cards are offered on the new Mac Pro, as long as they support 2560x1600 output. And I couldn't care less whether Tesla is offered or supported. But I do care about the Gainestown processors, the I/O options*, the release date, Snow Leopard (Grand Central and ZFS!), of course the system price, and to some degree its reduced power consumption.


* I thought Apple would be late in coming to the party if it finally shipped Blu-ray drives in the 2009 Mac Pro ... but if it's not, I'm shocked that Apple still doesn't support it!
 
Although it shares the same name as the technology found in the Pentium 4, this version of HyperThreading is definitely not the same.
That's right. HyperThreading is a marketing term, not the name of an actual technology.

The Pentium-4 HyperThreading was pretty much a joke, as far as multi-threading goes. In the best case, it yielded, what, 20%-30% of what a full additional thread would have added (instead of 90%+)?

Gainestown's second thread per core is supposed to be a better implementation than that on the Pentium-4. How much better isn't clear yet. Obviously Intel's marketing dept decided to revive/re-use the "HyperThreading" brand name. Maybe Intel is hoping that the new implementation can actually give "HyperThreading" a good name? ;-)

(I really doubt even the new HyperThreading is as efficient as the CMT [chip multi-threading] that Sun uses on its CMT UltraSPARC processors ... those processors hit 64 threads per chip a year ago, compared to Gainestown just now getting up to 8 threads per chip. Although Sun's threads scale a lot closer to linearly, they are used more in "server" applications, as opposed to "workstation" apps like Intel's chips)
 
Architecturally no, but performance-wise it's a meh, imo.
Clock for clock I get roughly 30% higher fps in the latest x264 build.
Keep in mind that Gainestown is more expensive than Harpertown clock for clock.

Any increased performance beyond what we have witnessed in the desktop arena must stem from the elimination of bottlenecks inherent in current workstation designs (e.g. FSB/memory and so forth).
2x QuickPath and 3-channel DDR3 IMC. :cool:
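For a sense of what that 3-channel IMC buys, here's a rough theoretical-peak calculation, assuming DDR3-1066 modules (Nehalem also supports faster speeds, so treat the numbers as a lower-end estimate):

```python
# Theoretical peak bandwidth of triple-channel DDR3-1066:
# 1066 million transfers/sec x 8 bytes per transfer per channel x 3 channels.
transfers_per_sec = 1066e6
bytes_per_transfer = 8
channels = 3
bandwidth_gb_s = transfers_per_sec * bytes_per_transfer * channels / 1e9
print(round(bandwidth_gb_s, 1))  # roughly 25.6 GB/s
```

That's several times the effective bandwidth a shared FSB design could deliver, which is where the memory-bound gains come from.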

Gainestown just now getting up to 8 threads per chip
Gainestown has 8 threads (4 cores) per CPU. Beckton (MP server) will have 16 threads (8 cores) per CPU.
 
(I really doubt even the new HyperThreading is as efficient as the CMT [chip multi-threading] that Sun uses on its CMT UltraSPARC processors ... those processors hit 64 threads per chip a year ago, compared to Gainestown just now getting up to 8 threads per chip.

AFAIK, it was mid/late 2007, but yeah.. those little beasts scream.
 
I have doubts on its reliability, but I'm quoting this for interest only.

MyAppleGuide said:
Last week, we received anonymous tips from two different sources that hint at an Apple media event in late March - possibly March 24th. Apple is said to be announcing a new Mac mini, iMacs with the new NVIDIA chipset, Mac Pros with Xeon processors along with a pleasant 'surprise'. While we have no way of proving the credibility of the sources, recent information that we have obtained has led us to believe that the information may be correct. Here's to seeing a preview of Snow Leopard thrown in the mix!
 
So how much performance increase are we expecting of the new CPUs vs the current Mac Pro? I've read 15-20%, but I've also read 40%.

Take for instance a heavy duty app like Logic fully loaded, or Ableton?
 
So how much performance increase are we expecting of the new CPUs vs the current Mac Pro? I've read 15-20%, but I've also read 40%.

Take for instance a heavy duty app like Logic fully loaded, or Ableton?
My take is the lower percentages are with apps that don't really take advantage of the memory controller and DDR3. Memory intensive apps produce the higher percentages. IIRC, the highest I've seen is 50%.

Nice performance increase compared to a current MP, and it looks really good at the estimated $3k USD for the base model ($200 increase).

If you don't use such apps or will in the next couple of years though, it might not be worth it to you if you already have the current model. You could also save some cash buying the current model as a Refurbished machine or on a close out deal, if you can find one. ;)
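To put the price bump in perspective, a quick sketch using the thread's own figures (an assumed $2800 current base going to $3000, against the 15-50% speedup range quoted above):

```python
# Hypothetical price/performance: relative speedup divided by relative price.
# The $2800 -> $3000 prices are the thread's estimates, not confirmed specs.
old_price, new_price = 2800, 3000
for speedup in (0.15, 0.30, 0.50):
    perf_per_dollar_gain = (1 + speedup) / (new_price / old_price)
    print(f"{speedup:.0%} faster -> {perf_per_dollar_gain:.2f}x perf/$ vs old model")
```

Even at the low end of the range, the performance gain outpaces the ~7% price increase.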
 
My take is the lower percentages are with apps that don't really take advantage of the memory controller and DDR3. Memory intensive apps produce the higher percentages. IIRC, the highest I've seen is 50%.

Nice performance increase compared to a current MP, and it looks really good at the estimated $3k USD for the base model ($200 increase).

If you don't use such apps or will in the next couple of years though, it might not be worth it to you if you already have the current model. You could also save some cash buying the current model as a Refurbished machine or on a close out deal, if you can find one. ;)

Ooh, but I really need the extra power. A couple of months ago, the hard drive of my MacBook (2.16 GHz, 3 GB RAM) died. Cause: overheating. My track in Ableton Live was just too bloody huge! I guess I'd best wait for the new Pros then! Any other musicians waiting for the upgrade here?
 
Apple doesn't manufacture drives. 0o
They might offer a solid state drive as an option. I kind of doubt it though for a workstation.


No no, I am talking about the way the drive placement is. Setting up the drives side by side cuts down on heat quite a bit; the only problem is that the drives are more susceptible to electromagnetism. The Mac Pro as it is currently designed is a hard drive eater.

A solid state drive would be nice though, as magnets would be a non-issue.
 
Ooh, but I really need the extra power. A couple of months ago, the hard drive of my MacBook (2.16 GHz, 3 GB RAM) died. Cause: overheating. My track in Ableton Live was just too bloody huge! I guess I'd best wait for the new Pros then! Any other musicians waiting for the upgrade here?
I've not used them, and don't know how much power is actually needed, let alone your desire. :p

If you need it, or can use it, I think the extra $200 is a good steal...err...deal. :D :p
 
No no, I am talking about the way the drive placement is. Setting up the drives side by side cuts down on heat quite a bit; the only problem is that the drives are more susceptible to electromagnetism. The Mac Pro as it is currently designed is a hard drive eater.

Silliest thing I've ever heard, to date.
Congratulations!
 
We'll be milked one more time for an HD upgrade from 1080p, but absolutely nothing above Super Hi-Vision makes any sense whatsoever, so that'll be the end of it.
ha! Yes, milked indeed. After that, the money-sucking exercise will probably be 3D. Not sure if that's a :( or a :) :confused: ;)
 