Intel doesn't have anything to do with it; they use AMD cards, which support it as far as I know.
You made an incorrect assumption here. The 13" notebooks use the GPU in their Intel CPU, so Intel has everything to do with it and AMD has nothing to do with it. The 15" version uses both: it uses the GPU of its Intel CPU by default, but when a more powerful GPU is needed it will switch to AMD. With this notebook, both Intel and AMD have everything to do with it.

Apparently the complaint is about the Intel chips (read: the GPU in the Intel CPU) not being able to output 10-bit colour. As others have stated, that isn't true, since it drives the internal P3 display (which is 10-bit). Further down this thread the OP provided more information, and it seems this is about the GPU being able to drive an external display at 10-bit and HDR (or UHD Premium). No matter how you put it, the OP really needs to review his posts because his wording does not make much sense to most people.

To the OP: I solely use this machine for virtualisation and UNIX/Linux administration on the go, so do tell me why I'd need to buy a device that includes these things. Your turn again.
 
Radeon Pro supports HDR, does it not?


Of course not, and even if it did, the CPU as the limiting factor prevents it. Don't take my word for it: hook up your MacBook Pro to a TV that supports HDR and you will see the option to enable HDR is either greyed out or absent. End of story.
 
The simple answer is that even if you hook up your $3,000-ish MBP to an HDR TV set or monitor, it cannot feed that display with enough data or color to produce HDR. And that is a problem.

No, it's actually not a problem. It just means it doesn't have the features you want.
 
You made an incorrect assumption here. The 13" notebooks use the GPU in their Intel CPU, so Intel has everything to do with it and AMD has nothing to do with it. The 15" version uses both: it uses the GPU of its Intel CPU by default, but when a more powerful GPU is needed it will switch to AMD. With this notebook, both Intel and AMD have everything to do with it.

Apparently the complaint is about the Intel chips (read: the GPU in the Intel CPU) not being able to output 10-bit colour. As others have stated, that isn't true, since it drives the internal P3 display (which is 10-bit). Further down this thread the OP provided more information, and it seems this is about the GPU being able to drive an external display at 10-bit and HDR (or UHD Premium). No matter how you put it, the OP really needs to review his posts because his wording does not make much sense to most people.

To the OP: I solely use this machine for virtualisation and UNIX/Linux administration on the go, so do tell me why I'd need to buy a device that includes these things. Your turn again.

I assumed, as the OP was talking about a "$3,000-ish" computer, that he was referring to a 15" model. Obviously the 13" doesn't have an AMD card... I've tried to explain that the monitor is 10-bit, but to no avail. We're all doomed.

Of course not, and even if it did, the CPU as the limiting factor prevents it. Don't take my word for it: hook up your MacBook Pro to a TV that supports HDR and you will see the option to enable HDR is either greyed out or absent. End of story.

Wait, you're hooking it up to a TV? What cable are you using?
 
Of course not, and even if it did, the CPU as the limiting factor prevents it. Don't take my word for it: hook up your MacBook Pro to a TV that supports HDR and you will see the option to enable HDR is either greyed out or absent. End of story.

Only over HDMI 2.0. I can't speak for the drivers; just reading up on it.
 
What does this even mean? I've been playing around with 16-bit color and HDR for years now. Or do you mean built-in video decoding or something like that?

That 16-bit colour is a graphics mode for images, not a video mode for the screen (although there used to be a 16-bit video mode years ago; that wasn't the same thing).

8-bit colour is 24-bit colour in the world of graphics and video. This supports up to about 16.7 million colours. The new MBP graphics drivers support this.

10-bit colour is 30-bit colour in the world of graphics and video. This supports a palette of about 1.07 billion colours.

Professional monitors such as Eizo's have built-in ASICs with up to 14-bit internal look-up tables.

Your eyes, if you are a regular trichromat, can only distinguish about 10 million colours. A 4K monitor has only about 8.3 million pixels. A typical colour photo contains only several thousand distinct colours due to repeated colours.
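
If you want to sanity-check those palette numbers, here's a quick Swift sketch; it's pure arithmetic, nothing platform-specific:

[CODE]
// Palette size implied by a per-channel bit depth: total bits are
// 3 x bits-per-channel (R, G, B), and the palette is 2^totalBits.
for bitsPerChannel in [8, 10] {
    let totalBits = bitsPerChannel * 3
    let colours = 1 << totalBits
    print("\(bitsPerChannel)-bit/channel = \(totalBits)-bit colour = \(colours) colours")
}
// 8-bit/channel  = 24-bit colour = 16,777,216 colours    (~16.7 million)
// 10-bit/channel = 30-bit colour = 1,073,741,824 colours (~1.07 billion)
[/CODE]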

Confused yet? We can go on for hours and get even more confused. Such details aren't necessary for 99.9999% of people.
 
The 2016 MacBook Pro is overpriced. Period. Why would I buy it? Because I am not willing to buy a Windows laptop. And the only alternative, then, is the 2015 MacBook Pro.

But HDR support? Way too niche.
 
But was the 2015 with the M370X overpriced? And is the 2016 with the Pro 450 overpriced compared to the 2015 with the M370X?

No, the 2015 MacBook Pros were not overpriced on release. They are overpriced now.

The 2016 MacBook Pros are overpriced.

Hope I answered your question.
 
That 16-bit colour is a graphics mode for images, not a video mode for the screen (although there used to be a 16-bit video mode years ago; that wasn't the same thing).

Again, I am perfectly aware of what 10-bit color depth is. What I still don't understand is what this has to do with the CPU.

Also, do we know for sure that OS X does not support 10-bit external monitors? Did anyone try?

P.S. The LG UltraFine 5K is a 10-bit monitor, so I guess that means the MBP supports the ARGB2101010 pixel format for video output. Now I am even more confused by the OP's post.

P.P.S. I also checked on my 2015 MBP and it supports the ARGB2101010 pixel format. So I assume that the 2015 model also supports wide-color video output.
 
Again, I am perfectly aware of what 10-bit color depth is. What I still don't understand is what this has to do with the CPU.

Also, do we know for sure that OS X does not support 10-bit external monitors? Did anyone try?

We have been using 10-bit monitors on Macs for years without true support. As long as you know what you are doing, you can get around the limitations; only people doing colour-precise work really need it. For print, 10-bit output hasn't been essential because CMYK has a lower gamut than sRGB or AdobeRGB, due to the fact that it can't reproduce native reds, greens and blues.

El Capitan was the first Mac OS to support 10-bit output, but ONLY if the graphics driver supports it.

You have to go to the System Information app and look at the Graphics/Displays section. Under Pixel Depth, if it says:

32-Bit Color (ARGB8888)

Then that is 8-bit per channel. If it says:

30-Bit Color (RGB101010)

Then that is 10-bit per channel. 30-bit actually carries more colour precision than 32-bit, because the latter counts the alpha channel too.

Only two Macs have this 30-bit Color driver: the iMac 5K and the Mac Pro 6,1.
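
If you'd rather script that check than open System Information, a Swift sketch along these lines should print the current encoding of each display. Note that CGDisplayModeCopyPixelEncoding has been deprecated since OS X 10.11, so treat this as a best-effort probe rather than a supported API:

[CODE]
import CoreGraphics

// Print the pixel encoding of the current mode of every active display.
// An ARGB8888-style string means 8-bit per channel; a 2101010-style
// string means 10-bit per channel (30-bit colour).
var count: UInt32 = 0
CGGetActiveDisplayList(0, nil, &count)
var displays = [CGDirectDisplayID](repeating: 0, count: Int(count))
CGGetActiveDisplayList(count, &displays, &count)

for id in displays {
    guard let mode = CGDisplayCopyDisplayMode(id),
          let encoding = CGDisplayModeCopyPixelEncoding(mode) as String? else { continue }
    print("Display \(id): \(encoding)")
}
[/CODE]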
 
Ey? The displays are P3; by definition this means they are 10-bit... Is HDR an industry standard? Can you give me an example of another laptop with this feature, so I can see what a real 'best in class' laptop display looks like?

Do you feel misinformed?
First explain to me what pro-level computer would not have HDR support. Sounds like you're going through buyer's remorse because you spent $2,800 on a mid-low to medium-range laptop when it comes to specs.

Do you feel misinformed?

Professional is anyone who is making money in his field, not the elitist elite you are trying to imply.
Also, Intel hasn't yet released a Kaby Lake processor that would suit MacBooks; only one 15W part has been released. So your post is basically useless.

You can make money in your field with a low-powered, portable, low-end laptop too. "Pro" has nothing to do with professional, and everything to do with performance. Something the MacBook "Pros" completely lack.
 
I would rather not pay more for the 10-bit color screen. I do software development, so I use multiple external monitors 90% of the time.
Same here. I bought the cheapest 4K monitor I could find (28" Samsung U28E590D) and that was that. I'm usually building business-related apps where look and feel is secondary (i.e. fugly). I want screen real estate, and colors are really not that important.
 
"Pro" has nothing to do with professional, and everything to do with performance. Something the Macbook "Pros" completely lack.

Well, if the anonymous agent says so, it must be true :rolleyes: Still, that doesn't change the fact that the MBP offers as much performance as, or more than, any other laptop in the same category (Dell XPS 15, MS Surface Book, thin-and-light workstation laptops).
 
First explain to me what pro-level computer would not have HDR support. Sounds like you're going through buyer's remorse because you spent $2,800 on a mid-low to medium-range laptop when it comes to specs.

Do you feel misinformed?

Nope, quite happy with my purchase, thank you. I'm with you on the specs, shockingly bad; you can get IDENTICAL specs on a Windows machine for like $500, if you're an utter moron who can't read, right? Looking forward to having smooth gradients on the non-existent P3, non-HDR, crappy consumer-level screen that ships with it.
 
Nope, quite happy with my purchase, thank you. I'm with you on the specs, shockingly bad; you can get IDENTICAL specs on a Windows machine for like $500, if you're an utter moron who can't read, right? Looking forward to having smooth gradients on the non-existent P3, non-HDR, crappy consumer-level screen that ships with it.

This went off the rails a little. I appreciate the detailed 10-bit versus 8-bit scientific arguments; they're all valid and have their place. So do the professionals who just do code work and don't care about colors.

The point is simpler than that: even a $249 Xbox One or PS4 Pro can output proper 4K and High Dynamic Range (HDR) color.

Not sure if it's because the image out is being fed through a USB-C port and then through an HDMI dongle.

But the simple answer is that, for such an expensive Pro device, it cannot output 4K video with HDR support and all the brilliant imagery that delivers.

Just try connecting it to an LG OLED TV or a Samsung KS8000.

And the benefits of HDR cannot be overstated. Even if I just want to hook up an MBP to my HDR-capable TV and watch a movie, it's not going to deliver HDR. And that's my point. I'd be pretty mad.
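
For what it's worth, the dongle/HDMI angle is at least plausible on raw bandwidth alone. Here's a back-of-the-envelope Swift sketch; the ~14.4 Gbit/s usable figure for HDMI 2.0 is the commonly published number and taken as an assumption here:

[CODE]
// Uncompressed 4K60 RGB bandwidth, ignoring blanking intervals and audio.
let pixelsPerFrame = 3840.0 * 2160.0
let framesPerSecond = 60.0
for bitsPerPixel in [24.0, 30.0] {   // 8-bit vs 10-bit per channel
    let gbps = pixelsPerFrame * framesPerSecond * bitsPerPixel / 1e9
    print("4K60 at \(Int(bitsPerPixel)) bits/pixel ≈ \(gbps) Gbit/s")
}
// 24 bits/pixel ≈ 11.9 Gbit/s -> fits HDMI 2.0's ~14.4 Gbit/s
// 30 bits/pixel ≈ 14.9 Gbit/s -> doesn't fit at full 4:4:4, which is why
// HDR TVs usually take 4K60 10-bit with 4:2:0 chroma subsampling
[/CODE]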
 
This went off the rails a little. I appreciate the detailed 10-bit versus 8-bit scientific arguments; they're all valid and have their place. So do the professionals who just do code work and don't care about colors.

The point is simpler than that: even a $249 Xbox One or PS4 Pro can output proper 4K and High Dynamic Range (HDR) color.

Not sure if it's because the image out is being fed through a USB-C port and then through an HDMI dongle.

But the simple answer is that, for such an expensive Pro device, it cannot output 4K video with HDR support and all the brilliant imagery that delivers.

Just try connecting it to an LG OLED TV or a Samsung KS8000.

And the benefits of HDR cannot be overstated. Even if I just want to hook up an MBP to my HDR-capable TV and watch a movie, it's not going to deliver HDR. And that's my point. I'd be pretty mad.

So "Professional" = hooking up laptop to watch movies with?
 
Apple MacBook Pros are for pros; only a small fraction of users are what can be called professionals in their fields.

I am not, however. If I purchase a Pro-level device, I expect the best in class.

And the last-gen Intel chips in the 2016 Touch Bar MacBook Pros cannot, or do not, support 10-bit color and HDR.

Now, the displays are quality, no doubt, but even when connected to an OLED TV or external monitor they don't support the industry standards that are so important for enjoying a wide range of colors.

Do you guys feel limited or misinformed by Apple?

If Intel doesn't support it yet, it's not an industry standard. Probably 95%+ of computers sold worldwide have only Intel GPUs.
 
Only two Macs have this 30-bit Color driver: the iMac 5K and the Mac Pro 6,1.

So, from my system info:

Thunderbolt Display:
Display Type: LCD
Resolution: 2560 x 1440
Pixel Depth: 30-Bit Color (ARGB2101010)

does that mean the Thunderbolt Display, connected via Apple's Thunderbolt 2 to USB-C adapter, is also 10-bit?
 
No, IIRC it has to say RGB plus some numbers instead of ARGB plus some numbers. However, on my iMac 5K I have the same string, so there is some inconsistency in the 30-bit stories here.
 
No, IIRC it has to say RGB plus some numbers instead of ARGB plus some numbers. However, on my iMac 5K I have the same string, so there is some incorrectness in SoyCapitanSoyCapitan's story.

ARGB2101010 is certainly a 10-bit-per-channel pixel format. It just uses an additional 2 bits for alpha.
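
To make the alpha point concrete, here's a minimal Swift sketch of how such a pixel unpacks. The field order (alpha in the top two bits, then R, G, B) follows the format name and is my assumption about the exact layout:

[CODE]
// ARGB2101010: 2 bits of alpha + 10 bits each of R, G and B = 32 bits,
// of which only 30 carry colour. Hence "30-Bit Color" in System
// Information even though the pixel itself is 32 bits wide.
func unpackARGB2101010(_ pixel: UInt32) -> (a: UInt32, r: UInt32, g: UInt32, b: UInt32) {
    let a = (pixel >> 30) & 0x3     // 2-bit alpha  (0...3)
    let r = (pixel >> 20) & 0x3FF   // 10-bit red   (0...1023)
    let g = (pixel >> 10) & 0x3FF   // 10-bit green
    let b = pixel & 0x3FF           // 10-bit blue
    return (a, r, g, b)
}

print(unpackARGB2101010(0xFFFF_FFFF))  // (a: 3, r: 1023, g: 1023, b: 1023)
[/CODE]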
 
That's what I meant: if it has it, then there are more Macs than the iMac 5K and Mac Pro 6,1 supporting 30-bit. If it doesn't have it, then the string isn't correct.
 