You forgot the word "currently". It is still in beta and things can be added later on with new driver versions. eGPU support won't be official until sometime in 2018, according to Apple's fine print.
 
You forgot the word "currently". It is still in beta and things can be added later on with new driver versions. eGPU support won't be official until sometime in 2018, according to Apple's fine print.
Yes, but the point is I wouldn't count on it, if 10-bit HEVC playback is important to you.

My guess is they won't bother, just because, really, it's not worth the effort for the minuscule eGPU market.
 
If there is one market where 10-bit HEVC really makes sense, it is the eGPU market. One of the things people use eGPUs for is acceleration in photo and video editing software. It is not the size of the market that counts here; it is where the technology makes sense.
 
If there is one market where 10-bit HEVC really makes sense, it is the eGPU market. One of the things people use eGPUs for is acceleration in photo and video editing software. It is not the size of the market that counts here; it is where the technology makes sense.
And of course, Apple would rather just sell you a new MacBook Pro or iMac for that.

Apple doesn't seem to be targeting video editors at all with this eGPU foray.
 
And of course, Apple would rather just sell you a new MacBook Pro or iMac for that.
Apple tried selling people new computers and nearly went bankrupt doing it. They learned from that. Nowadays they are selling you a user experience, even a platform. Their objective is to get you onto that platform and to keep you there. The only way they can do that is by offering features that make you want to use the platform (and keep using it): so-called added value.

Since eGPU is part of the Thunderbolt 3 standard, supporting it might simply be a matter of proper Thunderbolt 3 support, something Intel mandates. Apple did mention VR when they introduced the eGPU during the WWDC keynote, but don't read too much into that.

Apple doesn't seem to be targeting video editors at all with this eGPU foray.
Apple isn't targeting anyone in particular. They are simply adding full Thunderbolt 3 support to macOS. In the keynote they primarily targeted the VR world, but these solutions are just as applicable to gamers, photographers, videographers, scientists and probably others too. Just because they addressed a specific audience in the keynote doesn't necessarily mean it's the only group they are targeting. Many things aren't discussed or mentioned in the keynote; that's why WWDC needs an entire week to cover all the various topics.

Apple isn't the only party at play here. AMD and Nvidia are the ones making the GPUs that many of the aforementioned groups require. They are also the ones making the drivers (though they may do that with a little help from Apple). This isn't new territory; it happened with the Mac Pro as well. When it comes to GPU drivers we've seen the situation improve a lot. It's what makes things like eGPUs, hackintoshes and upgrading the GPU in the Mac Pro much easier than they used to be.

There simply are too many signs showing that Apple is now taking graphics very seriously. When you look more closely at this year's WWDC, graphics spans almost the entire conference: 90+% of the changes have to do with graphics.
 
Yes, but the point is I wouldn't count on it, if 10-bit HEVC playback is important to you.

My guess is they won't bother, just because, really, it's not worth the effort for the minuscule eGPU market.

By this logic they should barely bother with eGPU support at all, and yet it's a headlining feature.
 
By this logic they should barely bother with eGPU support at all, and yet it's a headlining feature.
They're keeping their toes in the VR market. This is not being presented as a solution for video editors. It wouldn't make much sense anyway. People buying laptops for example generally buy them for portability.

Furthermore, although I'm not a video guy, I suspect most of these people would be converting to ProRes anyway.
 
Nope, they are not. There is this thing called eGPU :D

With an 8-bit display and hardware only able to decode/encode 8-bit HEVC, we are cooked when on the road, though not as much as the 15" 2016 peeps: they have 10-bit displays but aren't capable of decoding/encoding 10-bit HEVC either. I wonder how fast those GPUs are going to be with the 10-bit HEVC stuff, though, because if it is still rather slow and turns the notebook into a jet plane, the 2017 peeps are just as cooked as us 2016 peeps ;)

 
They're keeping their toes in the VR market. This is not being presented as a solution for video editors. It wouldn't make much sense anyway.
It is NOT being presented as a solution for VR either. VR was only one of the many things that can make great use of powerful GPUs. They could have mentioned the other things too: data modelling, simulations, 3D rendering (think Blender), CAD/CAM (you can use the professional ISV-certified graphics cards), games, blockchain mining, computed tomography, photo editing, video editing, CUDA, deep learning, AI, etc. The list goes on and on. With limited time in the keynote they simply chose not to. They opted for the one thing that is rather new and where they have gotten quite some flak from one of the main VR companies. When you look at the webpage for the External Graphics Development Kit you can see this mentioned:
The External Graphics Development Kit enables you to develop and test demanding graphics-intensive apps,...
The kit doesn't come with a VR headset...

The reason this makes perfect sense is a piece of software by Apple that is the main competitor to Adobe Premiere: Final Cut Pro X (FCPX). FCPX relies heavily on the GPU, which means that the faster the GPU, the faster FCPX will be. It has been used by many reviewers of the 2016 MBP to test how much faster things run on that machine. Apple mentions FCPX as one of the things you can use with an eGPU:
With the power to harness External Graphics and the HTC Vive VR headset, content creators can use apps such as Final Cut Pro X, SteamVR, Epic Unreal 4 Editor, and Unity Editor to create immersive 360° video and advanced 3D content.

And when you look up Metal 2 you can find even more examples, such as deep learning (also mentioned in the keynote). On that same page there is some small print stating that eGPU support will come to consumers in spring 2018. It also lists a video on how to use Metal 2 for compute.
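For anyone who hasn't watched that session, the compute side of Metal is quite approachable. Here is a bare-bones sketch of a Metal compute dispatch; the add_one kernel, the buffer size and the threadgroup size are made up purely for illustration, and error handling is reduced to try!:

```swift
import Metal

// Illustration only: run a trivial compute kernel on the default GPU.
// With an eGPU attached you could instead pick that device from MTLCopyAllDevices().
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void add_one(device float *data [[buffer(0)]],
                    uint id [[thread_position_in_grid]]) {
    data[id] += 1.0;
}
"""

let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "add_one")!)

// 4096 floats in a buffer the CPU and GPU can both see.
var values = [Float](repeating: 1.0, count: 4096)
let buffer = device.makeBuffer(bytes: &values,
                               length: values.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
// 4096 threads split into threadgroups of 64.
encoder.dispatchThreadgroups(MTLSize(width: values.count / 64, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()

// Read the result straight back out of the shared buffer.
let result = buffer.contents().bindMemory(to: Float.self, capacity: values.count)
print("first element after the kernel ran: \(result[0])")  // 2.0
```

The point is that the same dispatch runs unchanged whether the device is an integrated GPU, a discrete mobile GPU or a desktop card in an eGPU box.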

People buying laptops for example generally buy them for portability.
Yep, and that means you have to compromise big time on GPU power. The only way you can do GPU-heavy tasks is by shelling out more money and buying a desktop Mac. With an eGPU you can save money (an eGPU is under 1k, Apple desktops are over 2k, so you are saving at least 1k) and you avoid the hassle of managing two computers (which data lives where; not having access to particular data when you need it because it is on the other machine, like forgetting your keys). It simply allows you to add a very powerful desktop GPU to your notebook.
The eGPU simply sits on one's desk and doesn't change the portability of the notebook. On the road you still have to deal with the less powerful mobile GPU, but at home or at the office you have all the power of the desktop GPU.

An eGPU is not the notebook-only special sauce you seem to think it is. It's just a desktop GPU in a box, connected via Thunderbolt 3 to a computer (which can be anything: desktop, notebook, tablet, server). It really is no different from what you got in the old cheese-grater Mac Pro.
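In fact, on the High Sierra betas an app sees an attached eGPU as just another Metal device. A minimal sketch, assuming macOS 10.13; the "prefer the external GPU" logic is only an illustration:

```swift
import Metal

// List every GPU macOS can see, including an eGPU attached over Thunderbolt 3.
// On macOS 10.13, external GPUs report isRemovable == true.
let devices = MTLCopyAllDevices()

for device in devices {
    let kind: String
    if device.isRemovable {
        kind = "external (eGPU)"
    } else if device.isLowPower {
        kind = "integrated"
    } else {
        kind = "discrete"
    }
    print("\(device.name): \(kind)")
}

// Prefer the removable (external) GPU when one is present,
// otherwise fall back to whatever the system default is.
let preferred = devices.first(where: { $0.isRemovable }) ?? MTLCreateSystemDefaultDevice()
print("Using: \(preferred?.name ?? "no Metal device found")")
```

Whether the card sits in a cheese-grater Mac Pro or in a Thunderbolt box, it shows up to software the same way.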

Furthermore, although I'm not a video guy, I suspect most of these people would be converting to ProRes anyway.
That is still no excuse for not doing your homework. Almost all photo and video editing applications use the GPU for acceleration, either via hardware encode/decode support or via GPGPU capabilities (things along the lines of OpenCL and CUDA). The aforementioned Final Cut Pro is a good example of this, and so is something like Lightroom.
 
It is NOT being presented as a solution for VR either. VR was only one of the many things that can make great use of powerful GPUs. They could have mentioned the other things too: data modelling, simulations, 3D rendering (think Blender), CAD/CAM (you can use the professional ISV-certified graphics cards), games, blockchain mining, computed tomography, photo editing, video editing, CUDA, deep learning, AI, etc. The list goes on and on. With limited time in the keynote they simply chose not to. They opted for the one thing that is rather new and where they have gotten quite some flak from one of the main VR companies. When you look at the webpage for the External Graphics Development Kit you can see this mentioned:

The kit doesn't come with a VR headset...

The reason this makes perfect sense is a piece of software by Apple that is the main competitor to Adobe Premiere: Final Cut Pro X (FCPX). FCPX relies heavily on the GPU, which means that the faster the GPU, the faster FCPX will be. It has been used by many reviewers of the 2016 MBP to test how much faster things run on that machine. Apple mentions FCPX as one of the things you can use with an eGPU:


And when you look up Metal 2 you can find even more examples, such as deep learning (also mentioned in the keynote). On that same page there is some small print stating that eGPU support will come to consumers in spring 2018. It also lists a video on how to use Metal 2 for compute.


Yep, and that means you have to compromise big time on GPU power. The only way you can do GPU-heavy tasks is by shelling out more money and buying a desktop Mac. With an eGPU you can save money (an eGPU is under 1k, Apple desktops are over 2k, so you are saving at least 1k) and you avoid the hassle of managing two computers (which data lives where; not having access to particular data when you need it because it is on the other machine, like forgetting your keys). It simply allows you to add a very powerful desktop GPU to your notebook.
The eGPU simply sits on one's desk and doesn't change the portability of the notebook. On the road you still have to deal with the less powerful mobile GPU, but at home or at the office you have all the power of the desktop GPU.

An eGPU is not the notebook-only special sauce you seem to think it is. It's just a desktop GPU in a box, connected via Thunderbolt 3 to a computer (which can be anything: desktop, notebook, tablet, server). It really is no different from what you got in the old cheese-grater Mac Pro.


That is still no excuse for not doing your homework. Almost all photo and video editing applications use the GPU for acceleration, either via hardware encode/decode support or via GPGPU capabilities (things along the lines of OpenCL and CUDA). The aforementioned Final Cut Pro is a good example of this, and so is something like Lightroom.
You have a lot of arguments here that would apply equally to AMD support in MacBook Pros and iMacs. In fact, as mentioned, they would apply far more to those machines, because there are so many more of them, and the eGPU market is several orders of magnitude smaller.

Yet Apple has already come out to say that any 6th gen Intel Mac will not support hardware 10-bit HEVC decode, and any 5th gen or earlier Intel Mac will not support hardware HEVC decode at all. There is very little chance they will bring hardware HEVC decode to eGPU customers while leaving the rest of their customer base out in the cold.

If you expect this, then you will be very disappointed. And as for ProRes, the point is most of their video editor customers will likely be using ProRes to edit, not native 10-bit HEVC 4K video. Yes, they will be using the GPU (whether it is internal or external) to accelerate effects, but it generally won't be on native 10-bit HEVC 4K, so in this scenario, the hardware HEVC decode support is not as necessary.
 
You have a lot of arguments here that would apply equally to AMD support in MacBook Pros and iMacs. In fact, as mentioned, they would apply far more to those machines, because there are so many more of them, and the eGPU market is several orders of magnitude smaller.
You couldn't be more wrong here. The difference between mobile GPUs and desktop GPUs is enormous. To give you an idea: it's a normal car compared to a Formula 1 car. Yes, both can go fast on the motorway, but the difference in the top speed they can reach is considerable.

Things like deep learning cannot be done on mobile GPUs without going insane. They are simply far too underpowered compared to desktop GPUs.

There is very little chance they will bring hardware HEVC decode to eGPU customers while leaving the rest of their customer base out in the cold.
That's not true at all. This is all part of Metal 2, APIs, etc. All that you need is the appropriate hardware and driver.
 
You couldn't be more wrong here. The difference between mobile GPUs and desktop GPUs is enormous. To give you an idea: it's a normal car compared to a Formula 1 car. Yes, both can go fast on the motorway, but the difference in the top speed they can reach is considerable.

Things like deep learning cannot be done on mobile GPUs without going insane. They are simply far too underpowered compared to desktop GPUs.


That's not true at all. This is all part of Metal 2, APIs, etc. All that you need is the appropriate hardware and driver.
We're talking about hardware HEVC decode here. It has a separate ASIC purpose-built for this. CPU usage is minimal. In fact, even GPU usage is minimal. All the decode is done on this specific little piece of silicon, while the rest of the GPU is essentially idle.

Even on my iMac i5-7600 5K with Radeon Pro 575, total overall CPU usage is less than 7%, and that's including WindowServer, etc., for a 76 Mbps 60 fps 4K 10-bit HEVC video. Furthermore, GPU power consumption remains quite low. You don't need GPU power for this at all. There is essentially zero advantage to having a hard-core, top-of-the-line GPU do this.

Everything you argued relates to everything other than HEVC decode.
 
HEVC decode is built into the GPU hardware. That's the only reason you are seeing low CPU usage. When you don't have it in the GPU hardware, software has to do the work, and that means the CPU gets pegged. Thus you don't necessarily need a powerful GPU, you just need a GPU able to do it. The 2016 MBPs do not have such a GPU, but they do have Thunderbolt 3, which supports an external GPU (and, later on, an OS that also supports it), so you can use a GPU that does do 10-bit HEVC hardware decode/encode.
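If you want to check what your own machine can do, VideoToolbox on High Sierra has a one-line query for this. A minimal sketch; note that the call only takes a codec type, so it cannot distinguish 8-bit from 10-bit profiles, and it reports capability, not whether a given player actually uses it:

```swift
import VideoToolbox
import CoreMedia

// Ask VideoToolbox whether this Mac can decode HEVC in hardware.
// If not, playback falls back to a software decoder and the CPU gets pegged.
if VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC) {
    print("Hardware HEVC decode is available")
} else {
    print("No hardware HEVC decode - the CPU has to do it")
}
```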

So we are back to what I said the first time: we 2016 MBP owners will have to use an eGPU in order to get 10-bit HEVC hardware decode/encode. Something that seems rather difficult for you to grasp. In fact, the entire eGPU thing seems to be far above your head. Better to stop this silly discussion, which has mostly been me explaining things to you.
 
Something that seems rather difficult for you to grasp. In fact, the entire eGPU thing seems to be far above your head. Better to stop this silly discussion, which has mostly been me explaining things to you.

Calm down. No need to jump out of a comfortable seat.
 
HEVC decode is built into the GPU hardware. That's the only reason you are seeing low CPU usage. When you don't have it in the GPU hardware, software has to do the work, and that means the CPU gets pegged. Thus you don't necessarily need a powerful GPU, you just need a GPU able to do it. The 2016 MBPs do not have such a GPU, but they do have Thunderbolt 3, which supports an external GPU (and, later on, an OS that also supports it), so you can use a GPU that does do 10-bit HEVC hardware decode/encode.

So we are back to what I said the first time: we 2016 MBP owners will have to use an eGPU in order to get 10-bit HEVC hardware decode/encode. Something that seems rather difficult for you to grasp. In fact, the entire eGPU thing seems to be far above your head. Better to stop this silly discussion, which has mostly been me explaining things to you.
Again, you seem to have completely missed the obvious point I had already mentioned many posts ago. A ton of these 2016 MacBook Pros ALREADY have AMD GPUs capable of hardware HEVC decode.

But Apple has already said they aren’t getting 10-bit hardware HEVC decode, because they are only supporting it on Intel.

If they’re not supporting it on those, it is extremely unlikely they will support it on eGPUs either.
 
Again, you seem to have completely missed the obvious point I had already mentioned many posts ago. A ton of these 2016 MacBook Pros ALREADY have AMD GPUs capable of hardware HEVC decode.
The obvious thing you are missing is the start of the entire discussion. This was about the 13" MBP 2016, and none of those come with AMD GPUs at all! Or Nvidia for that matter, since they are Intel-only.

Besides, you are also missing another obvious thing: the hardware may be Apple's, but macOS isn't the only operating system that can run on it. What I said is so generic that we could even be talking about Windows 10 or Linux on the Mac.

And as I stated earlier, these are still early days for eGPU, HEVC and the High Sierra betas. As we've seen with 10-bit display support and many other things (such as eGPU), it may take quite some time before Apple adds them to the OS. Apple has even added things mid-cycle via a point update, usually quietly (that's how we got support for a plethora of AMD cards, which the Mac Pro and hackintosh people are very happy about).

I can understand why Apple is limiting it to one GPU manufacturer during the first phase of the beta. You want to do the testing in a controlled environment, so you start at the bottom and slowly work your way up. The main thing is to get it to work, then to get it to work properly, and then you can expand it to others. That's how engineers work, and that's why it isn't very smart to say things like "extremely unlikely" or "impossible". APFS is a good example of this. Many things didn't work in Sierra, yet it is the default filesystem in High Sierra, where it will support all of the announced features.

Btw, there are two obvious reasons why it won't be limited to Intel. One is called "iMac" (including the iMac Pro) and the other is called "Mac Pro". These machines do not come with Intel graphics at all due to the CPUs Apple is using for them. Since these machines are also aimed at video people, Apple is going to piss off a lot of them if they don't include it at some point (and that would be the exact opposite of what they said they were doing: listening to the pros).
 
I remember when people here were complaining about the few people who demanded Kaby Lake on the Mac for hardware decode of 10-bit HEVC. People said Skylake CPUs on the Mac were enough for them and that they couldn't care less about the coming Kaby Lake, as it was nothing but a higher-clocked Skylake...

And now those people who jumped to Skylake last year seem to be whining about their Macs having no 10-bit HEVC hardware support. ._.
 
I remember when people here were complaining about the few people who demanded Kaby Lake on the Mac for hardware decode of 10-bit HEVC. People said Skylake CPUs on the Mac were enough for them and that they couldn't care less about the coming Kaby Lake, as it was nothing but a higher-clocked Skylake...

And now those people who jumped to Skylake last year seem to be whining about their Macs having no 10-bit HEVC hardware support. ._.

My Skylake MBP does have 10-bit HEVC support - in the GPU :v

Silly that Apple isn't supporting it when it works great in Boot Camp.
 
Here are the specs for Polaris GPUs. They have an updated UVD that does HEVC decode at the highest specs.

http://www.amd.com/en-gb/innovations/software-technologies/radeon-polaris

https://en.m.wikipedia.org/wiki/AMD_Radeon_400_series

This series is based on the fourth generation GCN architecture. It includes new hardware schedulers,[3] a new primitive discard accelerator,[4] a new display controller,[5] and an updated UVD that can decode HEVC at 4K resolutions at 60 frames per second with 10 bits per color channel.[5]

The Ars Technica review mentions the hardware support too:

https://arstechnica.co.uk/apple/2016/11/macbook-pro-touch-bar-13-15-inch-touch-bar-review/3/

'And the good news is that for those non-gaming applications, these GPUs still give you some neat stuff. You definitely get 10-bit 4K HEVC decoding support, which will be good for 4K HDR content. '

We have a couple of people on this forum who are trying to shut down anyone who says that Polaris GPU encode/decode should exist in macOS graphics drivers when it already exists on other platforms. Let's show them that those of us who spend a lot of money on our Macs will not be silenced. We pay a premium price for those hardware features.

Who is trying to "shut down" anything?

It would be nice for Apple to support, but you've suggested it's a legal, rather than a marketing, obligation.
 
Btw, there are two obvious reasons why it won't be limited to Intel. One is called "iMac" (including the iMac Pro) and the other is called "Mac Pro". These machines do not come with Intel graphics at all due to the CPUs Apple is using for them. Since these machines are also aimed at video people, Apple is going to piss off a lot of them if they don't include it at some point (and that would be the exact opposite of what they said they were doing: listening to the pros).
You seem quite confused. Every single 2017 iMac contains an Intel GPU. ALL of them. It just so happens some of them have AMD GPUs as well.

In fact, it would be impossible not to include an Intel GPU, because it's essentially built into the CPU. The same is also true for the iMac Pro, since Apple has told us it will be Xeon-based, and Xeons include GPUs as well. I won't comment too much on the coming Mac Pro, because it's not even out until 2018, and nobody knows what it is. But I suspect it will be using Xeons as well, and thus would include an Intel GPU too.

Remember, I already told you I can play back 10-bit 4K 60 fps HEVC just fine on my 2017 iMac i5-7600, with less than 7% CPU usage. This is in High Sierra, with native QuickTime based hardware playback.
 
You seem quite confused. Every single 2017 iMac contains an Intel GPU. ALL of them. It just so happens some of them have AMD GPUs as well.
Ah then this is a change with Skylake. Previous models did not come with an integrated GPU.

In fact, it would be impossible not to include an Intel GPU, because it's essentially built into the CPU. The same is also true for the iMac Pro, since Apple has told us it will be Xeon-based, and Xeons include GPUs as well.
Ah, you are making the classic mistake of "my Intel CPU contains one, thus all Intel CPUs contain one". That's not true at all, especially for the Xeon line. Many of the Intel desktop CPUs never had an integrated GPU because they are going to end up in a desktop with a separate GPU anyway (no need to save battery there). The first iMac 5K in 2014 is an example of that. The only GPU in those comes from AMD.
Xeon processors are the same thing. They are mostly used in machines that require far more powerful graphics or graphics with a certain certification. In the case of servers it may not even be necessary to have a GPU, due to remote management (which requires a different chip) or the nature of the appliance the CPU will be used in (Xeon-D in network appliances: who needs a power-consuming GPU? No one, and thus none of them come with a GPU). There are only a handful of Xeon models that come with a built-in GPU, though; they are quite rare.

Just take a look at the different options at ark.intel.com.

Remember, I already told you I can play back 10-bit 4K 60 fps HEVC just fine on my 2017 iMac i5-7600, with less than 7% CPU usage. This is in High Sierra, with native QuickTime based hardware playback.
CPU usage says nothing about the GPU. What this says is that the offloading to the GPU works.
 
Ah then this is a change with Kaby Lake.
No, it's not.

Previous models did not come with an integrated GPU.
Yes they did. All 2015 model iMacs have CPUs with integrated GPUs. ALL of them.

Ah, you are making the classic mistake of "my Intel CPU contains one, thus all Intel CPUs contain one". That's not true at all, especially for the Xeon line. Many of the Intel desktop CPUs never had an integrated GPU because they are going to end up in a desktop with a separate GPU anyway (no need to save battery there). The first iMac 5K in 2014 is an example of that. The only GPU in those comes from AMD.
Wrong again. All 2014 iMac models (5K or not) have a CPU with an integrated GPU. ALL of them.

Xeon processors are the same thing. They are mostly used in machines that require far more powerful graphics or graphics with a certain certification. In the case of servers it may not even be necessary to have a GPU, due to remote management (which requires a different chip) or the nature of the appliance the CPU will be used in (Xeon-D in network appliances: who needs a power-consuming GPU? No one, and thus none of them come with a GPU). There are only a handful of Xeon models that come with a built-in GPU, though; they are quite rare.
Not rare at all. Most of the Skylake Xeons have integrated GPUs. Most of the Kaby Lake Xeons have integrated GPUs.

Just take a look at the different options at ark.intel.com.
I suggest you take your own advice. Look up any 2014 or 2015 iMac CPU. Every single one of them has an integrated GPU. It's all nicely documented at that link you provided.

Now, whether or not Apple chooses to utilize those integrated GPUs is a different question, but nonetheless they are there.
 
Not rare at all. Most of the Skylake Xeons have integrated GPUs. Most of the Kaby Lake Xeons have integrated GPUs.
Guess it depends on what you find rare. Looking at the entire Xeon lineup, I think it is rare.

I suggest you take your own advice. Look up any 2014 or 2015 iMac CPU. Every single one of them has an integrated GPU. It's all nicely documented at that link you provided.
I did, and when I did there was no such info regarding the GPUs. Unfortunately Intel's ARK isn't always correct or complete (you can find such complaints on the Intel forums), so I checked again and you are right; they do list the GPU now. Thank you, Intel.

Now, whether or not Apple chooses to utilize those integrated GPUs is a different question, but nonetheless they are there.
Indeed, and in this case they opted not to use the integrated GPU, so the initial point still stands. Since they opted to use only the AMD GPU, they'll have to support it (switching to Intel would also mean going to a less powerful GPU, which is not what you'd want to do).

See, it is not that difficult to say you were wrong ;)
 
One of the biggest problems with the Mac: video playback. There are no good player options, and the experience overall is just terrible compared to Windows. Apple is going on about the first time a Mac can do something that Windows has been able to do for at least the last two years. In the early days they just needed to switch to a video player that supported it.
 