Yes, but the point is I wouldn't count on it, if 10-bit HEVC playback is important to you.
You forgot the word "currently". It is still in beta and things can be added later on with new driver versions. eGPU support is not something that will be officially supported until sometime in 2018, according to Apple's fine print.
And of course, Apple would rather just sell you a new MacBook Pro or iMac for that.
If there is one market where 10-bit HEVC really makes sense, it is the eGPU. One of the things people are using eGPUs for is acceleration in photo and video editing software. It is not the size of the market that counts here; it is where the technology makes sense.
Apple tried selling people new computers and they nearly went bankrupt when they did. They learned from it. Nowadays they are selling you a user experience and even a platform. Their objective is to get you to use that platform and to keep using it. The only way they can do that is by offering you features that make you want to use the platform (and keep using it): so-called added value.
And of course, Apple would rather just sell you a new MacBook Pro or iMac for that.
Apple isn't targeting anyone in particular. They are simply adding full Thunderbolt 3 support to macOS. In the keynote they were primarily targeting the VR world, but these solutions are just as applicable to gamers, photographers, videographers, scientists and probably some others too. Just because they targeted a specific audience in the keynote doesn't necessarily mean that it will be the only group they are targeting. There are many things not discussed or mentioned in the keynote; they need an entire week of WWDC to cover all the various topics.
Apple doesn't seem to be targeting video editors at all with this eGPU foray.
Yes, but the point is I wouldn't count on it, if 10-bit HEVC playback is important to you.
My guess is they won't bother, just because really, it's not worth the effort for the minuscule eGPU market.
They're keeping their toes in the VR market. This is not being presented as a solution for video editors. It wouldn't make much sense anyway. People buying laptops, for example, generally buy them for portability.
By this logic they should barely bother with eGPU support at all, and yet it's a headlining feature.
Nope, they are not. There is this thing called an eGPU.
With an 8-bit display and hardware that can only decode/encode 8-bit HEVC, we are cooked when on the road, but not as much as the 15" 2016 peeps: they have 10-bit displays but aren't capable of decoding/encoding 10-bit HEVC either. I wonder how fast those GPUs are going to be with the 10-bit HEVC stuff though, because if it is still rather slow and turns the notebook into a jet plane, the 2017 peeps are just as cooked as us 2016 peeps!
It is NOT being presented as a solution for VR either. VR is only one of the many things that can make great use of powerful GPUs. They could have mentioned the other things too: data modelling, simulations, 3D rendering (think Blender), CAD/CAM (you can use the professional ISV-certified graphics cards), games, blockchain mining stuff, computed tomography, photo editing, video editing, CUDA, deep learning, AI, etc. The list goes on and on. With limited time in the keynote they simply made the choice of not doing that. They opted for the one thing that is rather new and where they have gotten quite some flak from one of the main VR companies. When you look at the webpage for the External Graphics Development Kit you can see this being mentioned.
They're keeping their toes in the VR market. This is not being presented as a solution for video editors. It wouldn't make much sense anyway.
The kit doesn't come with a VR headset...
The External Graphics Development Kit enables you to develop and test demanding graphics-intensive apps,...
With the power to harness External Graphics and the HTC Vive VR headset, content creators can use apps such as Final Cut Pro X, SteamVR, Epic Unreal 4 Editor, and Unity Editor to create immersive 360° video and advanced 3D content.
Yep, and that means you have to compromise big time on GPU power. The only way you can do GPU-heavy tasks is by shelling out more money and buying a desktop Mac. With the eGPU you can save money (an eGPU is under 1k, Apple desktops are over 2k, so you are saving at least 1k) as well as avoid the problem of managing 2 computers (which data goes where; not having access to particular data when you need it because it is on the other machine, like forgetting your keys). It simply allows you to add a very powerful desktop GPU to your notebook.
People buying laptops, for example, generally buy them for portability.
That is still no excuse for not doing your homework. Almost all of the photo and video editing applications use the GPU for acceleration, either through hardware encoding/decoding support or through GPGPU capabilities (along the lines of OpenCL and CUDA). The aforementioned Final Cut Pro is a good example of this, and so is something like Lightroom.
Furthermore, although I'm not a video guy, I suspect most of these people would be converting to ProRes anyway.
You have a lot of arguments here that would equally apply to AMD support in MacBook Pros and iMacs. In fact, as mentioned, they would apply a lot more to those machines, because there are so many more of them, and the eGPU market is several orders of magnitude smaller.
It is NOT being presented as a solution for VR either. VR is only one of the many things that can make great use of powerful GPUs. They could have mentioned the other things too: data modelling, simulations, 3D rendering (think Blender), CAD/CAM (you can use the professional ISV-certified graphics cards), games, blockchain mining stuff, computed tomography, photo editing, video editing, CUDA, deep learning, AI, etc. The list goes on and on. With limited time in the keynote they simply made the choice of not doing that. They opted for the one thing that is rather new and where they have gotten quite some flak from one of the main VR companies. When you look at the webpage for the External Graphics Development Kit you can see this being mentioned.
The kit doesn't come with a VR headset...
The reason why this makes perfect sense is this piece of software by Apple, which is the main competitor to Adobe Premiere. This piece of software is called Final Cut Pro X (FCPX). FCPX heavily relies on the GPU, which means that the faster the GPU, the faster FCPX will be. This piece of software has been used by many reviewers of the 2016 MBP to test how much faster it works on this machine. Apple mentions FCPX as one of the things you can use with an eGPU, as quoted above.
And when you look up Metal 2 you can find even more examples, such as deep learning (also mentioned in the keynote). On that same page there is some small print stating that eGPU support will come to consumers in spring 2018. It also lists a video on how to use Metal 2 for compute.
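To make the "Metal 2 for compute" part a bit more concrete, here is a minimal sketch in Swift (not taken from Apple's video; the kernel name, sizes and values are invented purely for illustration). It compiles a tiny kernel from source and runs it on whatever Metal device the system picks:

```swift
import Metal

// Minimal Metal compute sketch: scale an array of floats on the GPU.
// Purely illustrative; kernel name, sizes and values are made up for this example.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;
kernel void scale(device float *data     [[buffer(0)]],
                  constant float &factor [[buffer(1)]],
                  uint id                [[thread_position_in_grid]]) {
    data[id] *= factor;
}
"""

guard let device = MTLCreateSystemDefaultDevice() else { fatalError("No Metal-capable GPU") }
let library  = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "scale")!)
let queue    = device.makeCommandQueue()!

var input: [Float] = Array(repeating: 2.0, count: 1024)
var factor: Float  = 10.0
let dataBuffer   = device.makeBuffer(bytes: &input,  length: input.count * MemoryLayout<Float>.stride, options: [])!
let factorBuffer = device.makeBuffer(bytes: &factor, length: MemoryLayout<Float>.stride, options: [])!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(dataBuffer,   offset: 0, index: 0)
encoder.setBuffer(factorBuffer, offset: 0, index: 1)
encoder.dispatchThreadgroups(MTLSize(width: input.count / 64, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

let results = dataBuffer.contents().bindMemory(to: Float.self, capacity: input.count)
print("data[0] after GPU pass:", results[0]) // 20.0 if the kernel ran
```

The same dispatch runs unchanged whether the device ends up being the integrated GPU, the discrete mobile chip or a desktop card in an eGPU box; only the throughput changes.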
Yep, and that means you have to compromise big time on GPU power. The only way you can do GPU-heavy tasks is by shelling out more money and buying a desktop Mac. With the eGPU you can save money (an eGPU is under 1k, Apple desktops are over 2k, so you are saving at least 1k) as well as avoid the problem of managing 2 computers (which data goes where; not having access to particular data when you need it because it is on the other machine, like forgetting your keys). It simply allows you to add a very powerful desktop GPU to your notebook.
The eGPU simply sits on one's desk and doesn't change the portability of the notebook. On the road you still have to deal with the less powerful mobile GPU, but at home/office you can have all the power of the desktop GPU.
An eGPU is not the notebook-only special sauce you seem to think it is. It's just a desktop GPU in a box connected via Thunderbolt 3 to a computer (which can be anything: desktop, notebook, tablet, server). It really is no different from whatever you got in the old cheese-grater Mac Pro.
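For what it's worth, that is also how macOS presents it to software. Assuming macOS 10.13 or later (where MTLDevice gained the isRemovable flag), a rough sketch for listing the GPUs the system sees, external box included, is just:

```swift
import Metal

// List every GPU macOS exposes through Metal (assumes macOS 10.13+).
// An eGPU enclosure shows up as an ordinary device with `isRemovable == true`.
for device in MTLCopyAllDevices() {
    let kind: String
    if device.isRemovable {
        kind = "external (eGPU)"
    } else if device.isLowPower {
        kind = "integrated"
    } else {
        kind = "discrete"
    }
    print("\(device.name): \(kind)")
}
```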
That is still no excuse for not doing your homework. Almost all of the photo and video editing applications use the GPU for acceleration, either through hardware encoding/decoding support or through GPGPU capabilities (along the lines of OpenCL and CUDA). The aforementioned Final Cut Pro is a good example of this, and so is something like Lightroom.
You couldn't be more wrong here. The difference between mobile GPUs and desktop GPUs is enormous. To give you an idea: it's a normal car compared to a Formula 1 car. Yes, both can go fast on the motorway, but the difference in the top speed they can achieve is considerable.
You have a lot of arguments here that would equally apply to AMD support in MacBook Pros and iMacs. In fact, as mentioned, they would apply a lot more to those machines, because there are so many more of them, and the eGPU market is several orders of magnitude smaller.
That's not true at all. This is all part of Metal 2, APIs, etc. All that you need is the appropriate hardware and driver.
There is very little chance they will bring hardware HEVC decode to eGPU customers while leaving the rest of their customer base out in the cold.
We're talking about hardware HEVC decode here. It has a separate ASIC purpose-built for this. CPU usage is minimal. In fact, even GPU usage is minimal. All the decode is done on this specific little piece of silicon, while the rest of the GPU is essentially idle.
You couldn't be more wrong here. The difference between mobile GPUs and desktop GPUs is enormous. To give you an idea: it's a normal car compared to a Formula 1 car. Yes, both can go fast on the motorway, but the difference in the top speed they can achieve is considerable.
Things like deep learning cannot be done on mobile GPUs without you going insane. They are simply far too underpowered compared to desktop GPUs.
That's not true at all. This is all part of Metal 2, APIs, etc. All that you need is the appropriate hardware and driver.
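For what it's worth, High Sierra's VideoToolbox lets an app ask exactly that question. A minimal sketch, assuming macOS 10.13 or later; note it only reports whether hardware decode for the codec exists at all, not the 8-bit vs 10-bit distinction this thread is about:

```swift
import VideoToolbox
import CoreMedia

// Ask VideoToolbox (macOS 10.13+) whether the current hardware/driver combination
// offers hardware-accelerated decode for a given codec.
let hevc = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
let h264 = VTIsHardwareDecodeSupported(kCMVideoCodecType_H264)
print("Hardware HEVC decode:", hevc ? "yes" : "no")
print("Hardware H.264 decode:", h264 ? "yes" : "no")
```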
Something that seems rather difficult for you to grasp. In fact, the entire eGPU thing seems to be far above your head. Better to stop this silly discussion, where it's mostly me having to explain things to you.
Nothing to worry about. The only thing that gets me out of that comfy seat is a coffee refill.
Calm down. No need for jumping out of a comfortable seat.
Again, you seem to have completely missed the obvious point I had already mentioned many posts ago. A ton of these 2016 MacBook Pros ALREADY have AMD GPUs capable of hardware HEVC decode.
HEVC is in the hardware of the GPU. That's the only reason why you are seeing low CPU usage. When you don't have this in the GPU hardware, the software will need to do it, and that means the CPU will get pegged. Thus you don't necessarily need a powerful GPU, you just need a GPU able to do it. The 2016 MBPs do not have such a GPU, but they do have Thunderbolt 3 that supports an external GPU (and later on an OS that also supports it), and so you can use a GPU that does do 10-bit HEVC hardware decode/encode.
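To illustrate the hardware-vs-software point above on the encode side, here is a hedged sketch (macOS 10.13+ assumed; the 3840x2160 dimensions are arbitrary). Passing the require-hardware key makes session creation fail instead of silently falling back to the software encoder, which is the fallback that pegs the CPU:

```swift
import VideoToolbox
import CoreMedia

// Probe whether a *hardware* HEVC encoder can be created on this machine.
// Without the "require" key, VideoToolbox would quietly fall back to software.
let spec: [String: Any] = [kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder as String: true]

var session: VTCompressionSession?
let status = VTCompressionSessionCreate(allocator: nil,
                                        width: 3840,
                                        height: 2160,
                                        codecType: kCMVideoCodecType_HEVC,
                                        encoderSpecification: spec as CFDictionary,
                                        imageBufferAttributes: nil,
                                        compressedDataAllocator: nil,
                                        outputCallback: nil,
                                        refcon: nil,
                                        compressionSessionOut: &session)

print(status == 0 ? "Hardware HEVC encoder available" : "No hardware HEVC encoder (status \(status))")
```

The decoder side has a matching kVTVideoDecoderSpecification_RequireHardwareAcceleratedVideoDecoder key that can be passed when creating a VTDecompressionSession.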
So we are back to what I said the first time: we 2016 MBP owners will have to use an eGPU in order to have 10-bit HEVC hardware decode/encode. Something that seems rather difficult for you to grasp. In fact, the entire eGPU thing seems to be far above your head. Better to stop this silly discussion, where it's mostly me having to explain things to you.
The obvious thing you are missing is the start of the entire discussion. This was about the 13" MBP 2016, and none of them come with AMD GPUs at all! Or Nvidia for that matter, since they are Intel-only.
Again, you seem to have completely missed the obvious point I had already mentioned many posts ago. A ton of these 2016 MacBook Pros ALREADY have AMD GPUs capable of hardware HEVC decode.
I remember when people here were complaining about the few people who demanded Kaby Lake on the Mac for hardware decoding of 10-bit HEVC. People said Skylake CPUs on the Mac were enough for them and that they couldn't care less about the coming Kaby Lake, as it was nothing but a higher-clocked Skylake...
And now those people who jumped to Skylake last year seem to be whining about their Macs having no 10-bit HEVC hardware support. ._.
Here are the specs for Polaris GPUs. They have an updated UVD that does HEVC decode at the highest specs.
http://www.amd.com/en-gb/innovations/software-technologies/radeon-polaris
https://en.m.wikipedia.org/wiki/AMD_Radeon_400_series
This series is based on the fourth generation GCN architecture. It includes new hardware schedulers,[3] a new primitive discard accelerator,[4] a new display controller,[5] and an updated UVD that can decode HEVC at 4K resolutions at 60 frames per second with 10 bits per color channel.[5]
The Ars Technica review mentions hardware support too:
https://arstechnica.co.uk/apple/2016/11/macbook-pro-touch-bar-13-15-inch-touch-bar-review/3/
'And the good news is that for those non-gaming applications, these GPUs still give you some neat stuff. You definitely get 10-bit 4K HEVC decoding support, which will be good for 4K HDR content. '
We have a couple of people on this forum who are trying to shut down anyone who says that Polaris GPU encode/decode should exist in macOS graphics drivers when it does exist on other platforms. Let's show them that those of us who spend a lot of money on our Macs will not be silenced. We pay a premium price for those hardware features.
You seem quite confused. Every single 2017 iMac contains an Intel GPU. ALL of them. It just so happens some of them have AMD GPUs as well.
Btw, there are 2 obvious reasons why it won't be limited to Intel. One is called "iMac" (including the iMac Pro) and the other is called "Mac Pro". These machines do not come with Intel graphics at all due to the CPUs Apple is using for them. Since these machines are also aimed at video people, Apple is going to piss off a lot of them if they don't include it at some point (and that would be the exact opposite of what they said they were doing: listening to the pros).
Ah then this is a change with Skylake. Previous models did not come with an integrated GPU.
You seem quite confused. Every single 2017 iMac contains an Intel GPU. ALL of them. It just so happens some of them have AMD GPUs as well.
Ah, you are making the classic mistake of "my Intel CPU contains one, thus all Intel CPUs contain one". That's not true at all, especially for the Xeon line. Many of the Intel desktop CPUs never had an integrated GPU because they are going to end up in a desktop with a separate GPU anyway (no need to save battery here). The first iMac 5K in 2014 is an example of that. The only GPU in those is coming from AMD.
In fact, it would be impossible not to include an Intel GPU, because it's essentially built into the CPU. The same is also true for the iMac Pro, since Apple has told us it will be Xeon-based, and Xeons include GPUs as well.
CPU usage says nothing about the GPU. What this says is that the offloading to the GPU works.
Remember, I already told you I can play back 10-bit 4K 60 fps HEVC just fine on my 2017 iMac i5-7600, with less than 7% CPU usage. This is in High Sierra, with native QuickTime based hardware playback.
No, it's not.
Ah then this is a change with Kaby Lake.
Yes, they did. All 2015 iMac models have CPUs with integrated GPUs. ALL of them.
Previous models did not come with an integrated GPU.
Wrong again. All 2014 iMac models (5K or not) have a CPU with an integrated GPU. ALL of them.
Ah, you are making the classic mistake of "my Intel CPU contains one, thus all Intel CPUs contain one". That's not true at all, especially for the Xeon line. Many of the Intel desktop CPUs never had an integrated GPU because they are going to end up in a desktop with a separate GPU anyway (no need to save battery here). The first iMac 5K in 2014 is an example of that. The only GPU in those is coming from AMD.
Not rare at all. Most of the Skylake Xeons have integrated GPUs. Most of the Kaby Lake Xeons have integrated GPUs.
Xeon processors are the same thing. They are used in machines that mostly require far more powerful graphics or a certain certification. In the case of servers it may not even be necessary to have a GPU, due to remote management (which requires a different chip) or the nature of the appliance the CPU will be used for (Xeon-D in network appliances... who needs a power-consuming GPU? No one, and thus none of them come with a GPU). There are only a handful of Xeon models that come with a built-in GPU, though; they are quite rare.
I suggest you take your own advice. Look up any 2014 or 2015 iMac CPU. Every single one of them has an integrated GPU. It's all nicely documented at that link you provided.
Just take a look at the different options at ark.intel.com.
Guess it depends on what you find rare. Looking at the entire Xeon line-up, I think it is rare.
Not rare at all. Most of the Skylake Xeons have integrated GPUs. Most of the Kaby Lake Xeons have integrated GPUs.
I did, and when I did there was no such info regarding the GPUs. Unfortunately Intel's ark isn't always correct or complete (you can find those complaints on the Intel forum), so I checked again and you are right: now they do list the GPU. Thank you, Intel.
I suggest you take your own advice. Look up any 2014 or 2015 iMac CPU. Every single one of them has an integrated GPU. It's all nicely documented at that link you provided.
Indeed, and in this case they opted not to use the integrated GPU, so the initial point still stands. Since they opted to only use the AMD GPU, they'll have to support them (switching to the Intel GPU would also mean going to a less powerful GPU, which is not what you'd want to do).
Now, whether or not Apple chooses to utilize those integrated GPUs is a different question, but nonetheless they are there.