You can talk about potential gains made. But I'll clue you in here: this company has a three-year lease cycle. Most of the editors have been on the Mac Pro for the last 1.5-2 years. We haven't gotten one complaint about performance, and these guys are constantly working with 30+ GB video files. You know why they haven't complained? Their current machine is better than the ones it replaced for their workflow.

There is no hand-waving about how Apple hasn't updated the Mac Pro yet, because there WILL be a new one out for months before most of these people are up for a new machine anyway.

Again I have to stress this point: while the IT guys may fly into a nerd rage about the current lineup and how it could be better, the people using them simply don't give a ****. They couldn't tell an i3 from a toaster oven. They just work with their tools, and the current Mac Pro does the job more than adequately.

So you are saying that two editors sitting next to each other, one on a 4-core D300 and one on a 12-core D700, having to meet the same deadlines, will not notice the advantage? Better hardware = getting heavy processing jobs done faster, and these guys' daily job is to process large amounts of data, so how is better hardware not an advantage? It's why Apple offers a range of Pros, with the only difference being better hardware. Just read the reviews of the D700 vs. the D300 in relation to video editing: big advantage.

When you work with tasks that are not CPU/GPU-dependent, you have a point, but in this case, working with large video files, hardware matters very much.

You may find that YOUR editors work with the tools they are provided, and deliver outputs based on what the tools can deliver, but it does not mean that this cannot be improved. Give one of the editors a machine that is twice as fast as the others', and you will have LOTS of complaints about performance.

Even someone who does not care about tech and is not a nerd, I guarantee you, knows a toaster from an i3 machine very well ;) And if you give an experienced editor who has used Macs for years in a professional environment an i5 iMac when he previously used a Mac Pro, he will tell you where to stick the toaster. While many people don't care about specs, they can tell if one machine is a lot slower than the other.
 
Everyone gets the same machine here on a three-year lease cycle; lease cycles are common in most major companies. I don't know why this is so hard to comprehend, but just because something can be done quicker on paper doesn't mean these workers are clamoring for some kind of update. They do their jobs, that's pretty much it.

As to your entire first paragraph, I've never once said that one machine wouldn't complete a job quicker than one with lower specs. That said, in the creative field this is a scenario so far removed from an actual workflow that it's laughable. Why in the world would somebody working on a deadline be cutting it so close that the processing time of their machine would make the difference? That's either incredible mismanagement of time by the employee, or these two hypothetical people were given an assignment that sounds more like a pop quiz from high school than an actual work environment...
 
I should have specified: my experience is in shooting content and post-production.

Have you ever worked in post-production?

I ask because deadlines are tight in media, and time matters very much. We buy the top-spec machines because there is no option to miss the deadline of a show airing.

What is this creative workflow you talk about where time is not critical?
 
Please provide quotes to prove that Apple wants the pro gaming market. As far as I know, they only care about the casual gamers' market, which they address by leveraging the iOS ecosystem. Quite frankly, I wouldn't give two craps about the self-centered gamers' market either.

I enjoy gaming, and I agree that the gamers' market can be a bit self-centered ("PC Master Race" and juvenile stuff like that), but I would still really like to see a Mac with a "normal" replaceable graphics card, like the Mac Pro towers used to have. Computers that are good for games are also good at other graphics-related work.
 
Anyone doing professional audio or video should not touch Apple hardware (without being paid to), because you cannot depend on Apple to stick to anything other than satisfying the teen crowd. The Mac Pro is 3 years old with no upgrade or update. There is no hardware between the Mini and the Mac Pro. There are no top-end laptops. I could go on. When these people move away from Apple, it no longer makes sense to stay in the Apple ecosystem, except of course for the phone, which is now a commodity with limited growth. I am one of the few professionals I know in my field who still uses Apple hardware; the others have moved to Windows.

Potentially true. However, video editing is actually a mid-range application these days. I can edit 4K happily on my 2008 machine, and my 2013 blows that out of the water in terms of layering etc.

Where it actually falls down now is 3D, DCC, and VR, which is where the real high end is.

I use a Mac Pro (plus various Windows machines) daily, and I'd always rather be using FCPX over Premiere. It's still generally way more stable than a PC.

The thing is, the CPU updates are not that great, but GPUs are making leaps and bounds... The D700 does still pull some massive weight in the right apps; it's just that most people compare it to gaming cards, which is where it gets destroyed.
 
That's the thing: Apple Macs are generally good hardware. To me it does not matter if they are $200 or $600 more than the same Windows PC. But Apple has some DNA that makes it think producing performance hardware at lower margins (because of lower volume) is somehow beneath it. The only thing I can figure is that there are only a handful of hardware engineers and they work on one product line at a time. For the last three years that has been phones. Hopefully they'll get back to the Mac line sometime soon.
 
What do you guys think they'll put in a 2016 Mac Pro? AMD Vega? Pre-orders at the end of the year, perhaps?
 
Just saw this from the developer of V-Ray for Cinema 4D (English is not his first language, so be kind):

On Windows, CUDA and OpenCL both work (V-Ray is native in both at its core).
On OS X, it seems Apple has no standard OpenCL drivers (yet?) and is missing important parts that V-Ray RT needs. CUDA on OS X seems OK.

I saw a message from the CG GPU devs about exactly this on the Maya V-Ray forum; he kindly asks Apple for updates every 2-3 weeks, but is still waiting for things to work. They say it is fully up to Apple for OpenCL to be complete.

This is why people like me get frustrated. If Apple gave us options to get NVIDIA hardware in a Mac, I'd get one again for my workstation, or a tricked-out iMac. If Apple updated OpenCL to what developers like Chaos Software need to reach feature parity, AMD GPUs would not be a problem and I'd get a Mac again.

Both of these things are within Apple's control. Developing OpenCL would be a good olive branch, but offering NVIDIA options would let me be a rabid Mac user again instantly without having to worry about OpenCL. In theory it's fine as a CUDA alternative. In reality, Apple has dropped the ball in every way possible for some pro users.
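
For anyone wondering what "missing important parts" looks like in practice: GPU renderers typically probe the OpenCL version string and extension list the driver reports before enabling their GPU path. A minimal sketch of such a probe in C++ (my own illustration, not V-Ray's actual code):

Code:
// Probe what the OpenCL driver reports; on OS X, build with -framework OpenCL.
#ifdef __APPLE__
#include <OpenCL/cl.h>
#else
#include <CL/cl.h>
#endif
#include <cstdio>
#include <vector>

int main() {
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(0, nullptr, &numPlatforms);
    std::vector<cl_platform_id> platforms(numPlatforms);
    clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

    for (cl_platform_id p : platforms) {
        cl_uint numDevices = 0;
        clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 0, nullptr, &numDevices);
        std::vector<cl_device_id> devices(numDevices);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, numDevices, devices.data(), nullptr);

        for (cl_device_id d : devices) {
            char name[256], version[256], extensions[4096];
            clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(name), name, nullptr);
            clGetDeviceInfo(d, CL_DEVICE_VERSION, sizeof(version), version, nullptr);
            clGetDeviceInfo(d, CL_DEVICE_EXTENSIONS, sizeof(extensions), extensions, nullptr);
            // The version string and extension list are what a renderer checks
            // before turning on its GPU path.
            printf("%s | %s\n  extensions: %s\n", name, version, extensions);
        }
    }
    return 0;
}

On OS X the reported version has been stuck at OpenCL 1.2, and missing extensions are exactly the kind of gap developers are waiting on Apple to close.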

As much as I love my macs, I honestly don't notice the OS most of the time. I spend it in the apps. And windows 10 isn't the horror show that I thought it would be. There are a ton of things I like about it.
 
You can find some specs for the upcoming M400 Series on AMD's site:
http://www.amd.com/en-us/products/graphics/notebook/r9-m200

AMD is rumoured to do the full unveil in two weeks, around May 26-29.

Hopefully the M480 and M480X will end up in high-end MacBook Pros and the M485X in future iMacs.

From what I've read, the M480 and M480X should be solid 1080p gaming GPUs, comparable to GTX 950 and R9 370 desktop cards. AMD has demoed it running Star Wars Battlefront at 60 fps while drawing less than 50 watts (86 watts for the full system). It's probably in the same realm as a GTX 970M, but with half the power draw, as it's intended for thin & light laptops like the MacBook Pro.

Another way to put it: we should see GPU performance similar to the current top-end iMac in the next-generation MacBook Pro. This aligns with AMD's claims of 2.5x performance per watt.

The M490X is said to provide the performance of the R9 390X/390 cards, which could make a future iMac VR-ready.

Of course, discrete GPUs will continue to accelerate apps like Final Cut Pro X via OpenCL (or Metal). Polaris GPUs also support H.265 encode/decode, DisplayPort 1.3, and HDMI 2.0. It will be interesting to see if Apple can push DisplayPort 1.3 over a Thunderbolt 3 port, even though Intel chips still only support DisplayPort 1.2. Requiring the dGPU when an external display is plugged in isn't unheard of, but I don't know what's involved.
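
To put rough numbers on why DisplayPort 1.3 would matter (my math, assuming 5K at 60 Hz with 24-bit color and ignoring blanking overhead):

5120 x 2880 pixels x 60 Hz x 24 bpp ≈ 21.2 Gbit/s

DisplayPort 1.2 tops out around 17.3 Gbit/s of effective bandwidth, which is why 5K displays need two DP 1.2 streams today; DP 1.3's roughly 25.9 Gbit/s would carry that over a single connection.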

The latest from wccftech:
http://wccftech.com/amd-r9-480x-470x-specs-allegedly-revealed/

UPDATE: The M470 and M470X are said to be rebrands. Polaris chips aren't yet listed on AMD's site. http://wccftech.com/amd-radeon-400-mobility-gpus-confirmed/

Do take this all with a grain of salt. Neither the AMD chips nor new Macs have been released yet, so anything could still happen.
 
Over on one of my C4D forums, a great many Mac diehards are switching to PC after the NVIDIA GTX 1080 announcement because they need CUDA. They are tired of waiting for OpenCL to come to the apps they use. Even if AMD beats NVIDIA on the gaming-card front, it won't matter; these guys aren't using the hardware for games.

Right now, their only real option on the Mac is Thea Render, which does take advantage of both CPU and GPU (including AMD hardware).
 
That Battlefront demo was on Windows with DirectX, which OS X's old OpenGL and new Metal are far behind. It will be a long time before OS X can show numbers like that, if ever. We don't even have any sign of Vulkan yet.
 
OS X does not need Vulkan to match DX12. HLSL is open source from this moment on, and if Apple wants to stay familiar to industry-wide standards, it will have to use it.
 
This is what happens when you read clickbait sites like WCCFTech or VRWorld. The GPUs mentioned above are complete rebrands from previous generations. The M485X is EXACTLY the same GPU as the one in the R9 395X.
 
You don't need Vulkan to 'match' DX. But you do need competitive, mature APIs that the industry can easily work with. Will Metal get there, or will it be replaced by a much more widely supported API like Vulkan? I prefer to bet on the latter, but it will take a few years.
 
Metal is based on the Mantle driver and has similar features to that API... The only thing Metal lacks is HLSL and at least Shader Model 5.0. Shader Model 6.0 is open source. HLSL is open source.
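
For context on what that gap means for porting: Metal shaders are written in the Metal Shading Language, a C++-based dialect, so anything authored in HLSL has to be rewritten. A trivial compute kernel as a sketch (my own example, not from any shipping app):

Code:
// Minimal Metal Shading Language compute kernel: scales a buffer in place.
#include <metal_stdlib>
using namespace metal;

kernel void scale(device float *data     [[buffer(0)]],
                  constant float &factor [[buffer(1)]],
                  uint id                [[thread_position_in_grid]])
{
    data[id] *= factor;
}

It's a tiny kernel, but multiply the rewrite by every shader in a big codebase and you can see why studios care whether HLSL (or a common intermediate form) is supported.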
 
Wouldn't it be great to be discussing desktop-level GPUs for desktop computers rather than mobile graphics in Apple's iMacs and Mac minis? So sad we give up so much performance out of the gate. New MacBook Pros should be nice.
Yeah, it would be nice to be discussing desktop GPUs. Unfortunately, Apple stopped making Macs with PCIe slots in 2013, and they never made a true desktop for either consumers or gamers.
 
You might think so, but in reality, no. After using ATI for ages, I tried NVIDIA in my Mac Pro, and cards that should theoretically get 25%-50% better performance according to benchmarks typically did more like 25%-50% worse in the apps/games that I use. I thought it would be worth supporting them since they actually support Macs, putting out web drivers typically a few days after OS X updates, whereas AMD does nothing. But the performance just isn't there, at least on OS X, so I went back to AMD.

--Eric


Which apps do you use?
 
For 3D-intensive apps, Unity and Blender. Many games were slower, the worst being Borderlands 2, which was literally half as fast. I will say it did quite well with the Unigine Heaven/Valley benchmarks, and some games were faster, but overall it was disappointing. I ended up going with a newer Radeon card, which was significantly faster in all cases.

--Eric
 
It's possible to have an eGPU, but it's not supported, and it takes a bit of work to set up.

Yup. I've been reading on some of the Modo forums about people who set it up. It doesn't play nice with games or anything, but for CUDA rendering it apparently works pretty well. There is a small speed hit compared to what the cards can do in a PCIe slot, but hey, CUDA rendering on the Mac with a small hit is OK. It might actually be a great option, moving a high-end, potentially hot-running component outside of such a thin enclosure.
 
I actually use it for playing StarCraft II, and for that my eGPU works very well. The only catch is that to get the full performance of the eGPU, you have to use an external monitor, which is fine for me since my external monitor is 21 inches while my MacBook Pro's built-in monitor is 13 inches. If you use the MacBook's internal monitor for playing games with the eGPU, you'll only get half the performance, because you'll only be getting half the bandwidth.
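
A rough back-of-the-envelope on that claim (my numbers, assuming 1080p at 60 fps with 32-bit color over a 10 Gbit/s Thunderbolt link):

1920 x 1080 pixels x 4 bytes x 60 fps ≈ 498 MB/s ≈ 4 Gbit/s

Driving the internal display means every rendered frame has to travel back across the same link that carries textures and draw commands, so a big chunk of the bandwidth disappears, which is roughly where the "half the performance" figure comes from.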

It doesn't seem to give me a speed advantage in iMovie, though. That said, the last time I tested it, the project I was rendering was on an external USB 2 hard drive that seems to be reaching the end of its life. I had the project there because I was running out of space on my MacBook's 480 GB SSD, and at the time I wasn't willing to clear stuff off. I just cleared some stuff off a couple of days ago, so I'll have to give eGPU rendering another try.
 