Not so much "new" as "custom"/"proprietary". Apple designed and makes these cards, not AMD. Apple has licensed some technology and buys a few basic components from AMD, but it's Apple's card.



It will show up once MacRumors and other rumor sites get indexed.

roughly

D300: approximately a W7000 (less VRAM: 2 vs. 4 GB) [approximately an AMD HD 7870]

D500: approximately a W8000 (less VRAM, fewer cores, and a tweaked memory bus width) [approximately an AMD HD 7950 or HD 7870 XT]

D700: approximately a W9000 [approximately an AMD HD 7970]

Actually, the D700 has a lot more power than the HD 7970, according to the specs.
 
Pull vs. push

I don't need a Pro, but I'd love one sat on my desk - an awesome, beautiful design :)

One thing I'm curious about is why they put the fan at the top rather than the bottom. Why does it matter? Well, at the top it sits in the warm airflow, whereas at the bottom it would sit in the cold airflow - and I imagine a 12-core Pro running at 100% CPU is going to get quite warm - so the fan would last longer.

Pulling the air through from the top guarantees the correct flow through the central heat sink channel. Pushing the air in from the bottom would push some of the hot air into nooks and crannies of the chassis causing heat problems. Aerodynamics. :apple:
 
One of the GPUs is dedicated solely to driving the display and graphics. The other GPU is dedicated to non-graphics processing. Normally, when you run an application using OpenCL, you are using up precious resources of the graphics card, so you can't drive hard-core graphics and super-crunch in OpenCL at the same time. The architecture of the Mac Pro solves this issue with two separate graphics controllers, each dedicated to its own purpose.

The idea of having a separate graphics card in a high-end machine used solely for non-graphics processing, as a complement to the main CPU, is well established and is similar to an Intel Xeon Phi expansion card. The way Apple has done this in the Mac Pro is actually cheaper than a high-end PC Xeon server with a single high-end graphics card plus a single high-end Xeon Phi add-on card (the Xeon Phi card alone costs up to $4k).
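For what it's worth, that split is something you can steer from the host side of the OpenCL API. A minimal sketch (the device ordering and the "second GPU does the compute" choice are my assumptions for illustration, not anything Apple has documented):

```c
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platform;
    cl_device_id devices[4];
    cl_uint count = 0;

    /* Enumerate the GPUs the OpenCL runtime can see. */
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 4, devices, &count);

    for (cl_uint i = 0; i < count; i++) {
        char name[128];
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof name, name, NULL);
        printf("GPU %u: %s\n", i, name);
    }

    /* Assumption for illustration: with two GPUs present, build the
       compute context on the second device, leaving the first free to
       drive displays. */
    if (count > 1) {
        cl_int err;
        cl_context ctx = clCreateContext(NULL, 1, &devices[1],
                                         NULL, NULL, &err);
        /* ... create a command queue, build kernels, enqueue work ... */
        clReleaseContext(ctx);
    }
    return 0;
}
```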

"The other GPU is dedicated for non-graphics processing." That can't be correct. I'm sure that both GPUs can be involved in graphics if needed, such as driving all those 4K displays. :apple:
 
CPUs and GPUs these days draw power dynamically. If they're not doing much, they draw less. If maxed out, they draw closer to their upper limits. If the CPU shuffles work to the GPU and has to wait, its power draw can drop. Same on the flip side.

For workloads that light up all cores on all CPUs/GPUs, then yes, something would have to give. Benchmarks whose primary purpose is to run individual components up to their "redline" aren't going to be as informative as benchmarks that reflect more normal workloads. [What counts as "normal" varies by app usage and user.]

Well both D700 GPUs at full tilt [and frankly if I were paying for them I WOULD use them to their full extent] are going to draw a fair few watts (and if they aren't fully loaded, why have two at all?).

Keeping both GPUs fed will also keep the CPU occupied, not to mention powering all the other stuff.

There is absolutely no engineering or design excuse to artificially impose power or thermal limits in the desktop environment.

But as I said, I will wait for the reviews before casting my engineering opinion.


It isn't going to be surprising to find that systems with double the power budget turn in higher numbers on benchmarks geared to soak up that budget.

Well that is obvious.

But previously I bought a Mac Pro because a similar spec self-build would have been only about 30% cheaper.

I can now build something faster for half the price; the performance-per-£ ratio is completely and utterly ridiculous compared to the cMP [in things that don't use twin FirePro cards].

Maybe in 2-3 years' time, when Lightroom makes use of GPUs, I might be interested...
 
Yeah, "coming this fall" apparently meant announcing in the fall that the Mac Pro arrives in December.
Yes yes yes yes. I have been waiting for them to announce, or at least reference, a 4K display. It can drive three 4K displays, but what do you recommend we buy, Apple? Dell. :rolleyes:

When I was price shopping for displays for my 2012 Mac mini, I ended up buying a Dell. Imagine the look on my roommate's face when I spent close to $1,000 on a sexy new Apple product and hooked it up to a Dell monitor. :confused:

Against all odds, Dell seems to have actually put together a pretty decent IPS display, considering the low price I paid for it ($150 on Amazon for a 23"). Then again, the most graphics-dependent thing I do is Pixelmator for personal use, so I'm probably not in the target market for a more suitably calibrated higher-end display, and definitely not for the new Mac Pro.
 
I doubt it's being bought by many first-time computer buyers. There's no more reason it should come with those than with a monitor.

Deployment, folks, deployment.

When you're an institution or a startup that needs to get 10 of these things, you don't want to have to add keyboards to the list as well. My last job bought 110 Mac Pro 2,1s, and luckily we didn't have to unwrap and dispose of the boxes for 110 keyboards and 110 mice.

This box is for running Apple's pro apps, which are all OpenCL-based. Actually, I think the world is moving to OpenCL.

Yes indeed. It's truly a nice rig if you are in the Apple ecosystem. I see it doing well in the Adobe one as well. Avid may be another thing, especially since Avid wants to build the machine for you.

The fact that it's built around OpenCL is a bit of an off-putting point for me. It'd be nice if Adobe would implement OpenCL. Otherwise, HP and . . . . . ugh, Windows may have to be my next workstation.

Seeing as they still charge $1,000 for a 27-inch 1440p display, they would have to charge another $3,000 for a 4K display.

That's not that bad, actually. The OG acrylic Apple Studio Displays intro'd at that price for the vast 23" model.

This is a bit pricier than I was expecting, especially since I'll have other expenses for peripherals. But it looks like a phenomenal machine. I have two months to save my pennies....

It's right in line with what I knew Apple would do: price their desktops from the bottom up. Maxed-out versions of each model run into the price point of the next one up. By the time you get an iMac totally decked out, it's around $3,200 last I checked, so at that point a user could grab an even faster machine and then just have to buy a panel.

Same thing for the rMBP.

Still looks like a trash can. No thx. The only thing I like is Thunderbolt 2, which will come to the next MacBook Pros eventually.

Oh yeah, it does indeed. Rather uninspiring design, if you ask me. The GFX are what kills it for me, but I'm not complaining. The need for a host of external components is even worse, but it's a nice upgrade for laptop professionals looking for the fastest Mac desktop, or freelancers who need to upgrade from an iMac - each such user already has a desk full of external boxes and two surge protectors loaded with wall warts.
 
A few years ago, Apple proudly ridiculed PCs in their "I'm a Mac" ads by having the PC character say "Oh, the rest of me is in some other boxes..." Well, it seems that now half of the Mac Pro will be in some other boxes. External storage, external video I/O, and so on.

This will be a great product for design/photography work. Maybe audio, too. And some video work, probably.

It will come down to OpenCL support in video apps (Adobe, Blackmagic, etc.). If that is implemented properly, this workstation will be a decent workhorse for the next couple of years. If not, then the majority of serious video/gfx post production will remain on, or move to, PCs, where you can have dual 12-core CPUs, SSDs, PCIe storage, Tesla cards, internal video and audio I/O, optical media, RAM expandable to 512GB, and cheap, relatively fast 3.5" SATA media to work on.
 
Air vents at the bottom? Attach wheels and it will make a decent vacuum cleaner. That thing will get full of dust in the first month of operation.
 
Pulling the air through from the top guarantees the correct flow through the central heat sink channel. Pushing the air in from the bottom would push some of the hot air into nooks and crannies of the chassis causing heat problems. Aerodynamics. :apple:

Fair comment, except that this is Apple, whose engineering is precise enough that there should be no nooks and crannies - plus the heat sink is a single aluminium extrusion.

Having said that, I've no doubt that the fan has been engineered to work at elevated temperatures for long periods of time so won't start squealing after a couple of months like the fans in many PCs do :p
 
It supports OpenCL, which is the open standard. CUDA is an Nvidia proprietary API, only available with Nvidia GPUs. They have gone the right way here.
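To make the "open standard" point concrete: the same OpenCL C kernel source is compiled at runtime for whatever GPU is present - AMD FirePro, Nvidia, or Intel - whereas the CUDA equivalent only builds for Nvidia hardware. A deliberately minimal sketch:

```c
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

/* Vendor-neutral OpenCL C: this kernel source string is handed to the
   driver at runtime, so the same code runs on any conformant GPU. */
static const char *vadd_src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float       *c)\n"
    "{\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

/* Host-side compilation against an already-created context/device. */
cl_program build_vadd(cl_context ctx, cl_device_id dev) {
    cl_program prog = clCreateProgramWithSource(ctx, 1, &vadd_src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    return prog;
}
```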

Valid points; however, CUDA offers better maths functions and a better debugger and support.

Now, if a company charges customers £100k per license for software and needs to provide assurance that everything will work, they are going to go with CUDA, not OpenCL.

These software packages (with such high costs) do exist, and professionals do use them. Maybe things will change, but that is 2-3 years in the future (at least).
 
Valid points; however, CUDA offers better maths functions and a better debugger and support.

Now, if a company charges customers £100k per license for software and needs to provide assurance that everything will work, they are going to go with CUDA, not OpenCL.

These software packages (with such high costs) do exist, and professionals do use them. Maybe things will change, but that is 2-3 years in the future (at least).

While I would agree that CUDA offers a richer development environment and API, it also hard-codes for a specific set of hardware, which is never a good thing. There is a lot of work going on at the moment on abstractions over these APIs that make moving between the two much less painful, so with any luck it will take less than 2-3 years to gain real traction.
 
Inevitably there will be better GPUs available.

What would make you think that?


While I agree with you that many pros don't use Apple's apps, and certainly most don't use Apple's apps exclusively, the fault doesn't lie with Apple that they're going with OpenCL instead of CUDA....

...Bottom line, if you're tied to a CUDA-dependent workflow, start yelling at the software developers now to get their stuff moved over to OpenCL.

Well, the fault (if that's what we're calling it) does lie with Apple on some level. While Nvidia is pushing CUDA, their cards support OpenCL too. If Apple truly wants to make this machine a big part of the professional workplace, then they also need to push developers for support - or offer an Nvidia option. Because right now CUDA is still a pretty big factor, and it's not like everyone updates software with each release. They'll be using current versions for quite a while longer.

You can put all the storage in the machine room and run a Thunderbolt cable to it. 50m cables are already available, and 100m ones are coming soon.

And how much will they set you back?

It supports OpenCL, which is the open standard. CUDA is an Nvidia proprietary API, only available with Nvidia GPUs. They have gone the right way here.

Again, that remains to be seen. Nvidia supports OpenCL as well.
 
...that you can order months from now and likely get sometime in January. Apple's been holding this carrot out since when, June? Sheeeez.

Patience, patience...I don't think anybody is going to order this just for its looks (well, then again...).

We all want to see some benchmarks first, to make sure it truly will be the leap (as we expect) over our current Mac Pros.
 
Still think my idea of selling 'cubes' that integrate together to form a larger hypercube would have been better.

Sell a processing cube.....sell a rendering cube...sell a DSP cube...sell a storage cube...etc.

And it would also be a shout out to the original cube.

Anyone wanna lend me a few Billion so I can get these things made?
 
It looks like the new 8 core beats the new 4 core on virtually every benchmark, including the single core ones:
http://browser.primatelabs.com/geekbench3/compare/86294?baseline=123580

Interesting. How is single core performance better with the lower clock speed? I was under the impression that it's pretty much the same chip other than number of cores. Do the bigger caches really make that much of a difference or is it something else?

The 12 core will be around $6,000. It looks like you will pay about a grand for every 4 cores you add.

Sounds about right - around the same pricing as the current generation. But that's a lot different from the base model starting at $6k. It will be interesting to see the specific pricing on the 6-, 8-, and 12-core BTO options; hopefully they will be more competitive against the PC versions than the quad is.

Yes, it's sad if Logic can't use OpenCL. What a waste of processing power. How do you know Logic can't use OpenCL?

I'm hoping they can make it happen for at least some things, but I keep hearing that OpenCL just isn't well suited to audio in general. The one exception is convolution, but only a few plugins use that technique. On Apple's performance page they mention OpenCL for most other uses, but in the audio section it's all about CPU and SSD speed.
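For the curious: convolution is the exception because every output sample is an independent dot product, so thousands of samples can be computed in parallel - a perfect fit for GPU work-items. A naive illustrative kernel (real convolution reverbs use FFT-based overlap-add, and edge handling is omitted here):

```c
/* Naive direct-form convolution in OpenCL C: one work-item per output
   sample, all samples computed in parallel. Illustrative only. */
static const char *conv_src =
    "__kernel void convolve(__global const float *x,   /* input signal  */\n"
    "                       __global const float *h,   /* impulse resp. */\n"
    "                       __global float       *y,   /* output        */\n"
    "                       const int hlen)\n"
    "{\n"
    "    int n = (int)get_global_id(0);\n"
    "    float acc = 0.0f;\n"
    "    for (int k = 0; k < hlen; k++)\n"
    "        acc += h[k] * x[n + k];   /* correlation form for brevity */\n"
    "    y[n] = acc;\n"
    "}\n";
```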

I swear the fat guy mentioned "latest version of HDMI" in the keynote, surely it won't ship with 1.4??

Apple's website says HDMI 1.4 Ultra HD.
 
I am a bit surprised; CUDA from NVIDIA is being used widely in the industry, and I am a bit baffled that they went for ATI...

AMD is faaaar better at OpenCL than nVIDIA, and you must be new to the Mac and Apple if you think it's strange that Apple isn't pushing someone else's proprietary tech.

It was Apple who developed and pushed for OpenCL and AMD embraced it. From the point of view of Apple, CUDA is as dead to them as Flash. :cool::apple:
 
It is a cylinder. It is round. It has a motion sensor. Just turn the I/O to the front and be happy. LOL

And what happens to the cables already attached? I've been wondering about this for months.

I don't think I'm geometrically or spatially challenged, but I can't see any way other than to attach the peripherals with a lot of slack cable, and hope that the slack's enough to let the beast be rotated without having everything (or anything) unplug itself.

What am I missing here?
 
iMac Pro?

What about people who need a PCI expander box and 4x Nvidia GPUs to do work in Resolve/Adobe/Nuke/Cinema4D? I guess this Mac is not for them.
 
Jeez, you spend $3,000 on the computer and plan to use the crappy power cord Apple provides! I don't even use those for my Mac mini!

;-)

I wouldn't think of using anything but one of these. I mean -- good clean power is very important. Only a moron uses the low-quality power cords that come in the box. And this one's 10 feet long -- a huge, huge plus that makes it worth every penny.

http://www.audioadvisor.com/prodinfo.asp?number=AQNRG10
 
Seems kinda pricey to me. I built a Windows box the other day - 16-core AMD, 32GB RAM, 2x 3TB hard drives, GeForce GTX 760 w/ 8GB RAM on the card - and it was just over a grand for everything. Even adding a nice dual GPU, it just seems really pricey to me.

Marc

:rolleyes:

No, you didn't - though you could have built a very nice 8-core AMD FX-8350 with 32GB RAM, 2x 3TB HDDs, a GTX 760 with 2GB VRAM, and a motherboard to match for about 2,000 USD.
 
I think that based on watching computer technology improve continuously for 3+ decades.

Seriously... Do you think that GPU technology has reached the end of the road?

Of course not. The way your comment was worded, though, I got the impression you were suggesting the GPUs would be swappable for upgrades.
 