So, from my system info:

Thunderbolt Display:
Display Type: LCD
Resolution: 2560 x 1440
Pixel Depth: 30-Bit Color (ARGB2101010)

means the Thunderbolt Display connected via Apple's Thunderbolt 3 (USB-C) to Thunderbolt 2 adapter is also 10-bit?

No, this might be inaccurate, because that display has never been renowned for colour. So the next step would be to measure with a spectrometer to check whether Apple's reporting is actually correct.
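
For what it's worth, the same Pixel Depth field that System Information shows can also be read from the command line with system_profiler, which makes it easy to re-check after swapping cables or adapters. A minimal sketch, assuming macOS with Python 3 installed (the parsing is only illustrative):

```python
# Minimal sketch: print the "Pixel Depth" that macOS reports for each display,
# using the same data source the System Information app reads (system_profiler).
import subprocess

def report_pixel_depths() -> None:
    out = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    ).stdout
    section = None
    for raw in out.splitlines():
        line = raw.strip()
        if line.endswith(":") and ": " not in line:
            section = line.rstrip(":")       # nearest header, e.g. the display name
        elif line.startswith("Pixel Depth:"):
            depth = line.split(":", 1)[1].strip()
            print(f"{section}: {depth}")     # e.g. "Thunderbolt Display: 30-Bit Color (ARGB2101010)"

if __name__ == "__main__":
    report_pixel_depths()
```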
 
I seriously do not understand any argument that starts with connecting your laptop up to a TV to watch a film... I would assume a TV has Netflix and other means of playing movies. I've always streamed on the very rare occasions I've needed to do that. So it isn't really an argument about a 'pro' device; it's a computer, not a media player!

Complete troll thread.
 
The laptop screen is supposed to be renowned for colour, though, and yet it is only showing 32-Bit Color (ARGB8888), which seems odd to my non-expert brain.
 
Some pros do hook a laptop up to a TV, to check whether their content works on such an ordinary display. The hint in this is "SOME", which means "NOT ALL". As can be seen in this thread, not everybody seems to understand that, and so it does indeed have the potential to become a complete troll thread. Luckily it has stayed mostly on track with technical details. Let's keep it that way.
 
Yes, however the OP has asked why ANYONE would buy a device that doesn't support HDR, since he likes to watch movies. I think you're assuming that they are creating media and testing it on a screen (in which case you would normally burn it to a disc to test it anyway; and if you want to test how it works on an ordinary device, hooking up a laptop via HDMI is not an accurate test).

There is technical information in this thread, yes, but the OP keeps bringing it back to "I can't watch movies in HDR on my $3,000 laptop" without specifying anything, hence troll thread. I asked what cable they were using to hook the laptop up to a TV and got no response, only "Go test it". So I have no idea whether they used an HDMI 2.0a cable, what kind of adapter they are using, or anything.
 
That's what I meant: if it has it, then there are more Macs than the iMac 5K and Mac Pro 6,1 supporting 30-bit. If it doesn't have it, then the string isn't correct.
That's why Apple's spec reporting looks inaccurate.

I didn't expect this thread to go this deep, and I'm just looking for an honest answer to a simple question, no disrespect intended. At this point it seems to me like we have uncovered an HDR-gate.
 
https://macperformanceguide.com/blog/2015/20151105_2300-OSX_ElCapitan-10bit-dualNEC.html

OS X has supported 10-bit display output since El Capitan. I think the Radeon Pro 450/455/460 in the new rMBP 2016 also support it. To take advantage of it, you need a very good quality professional screen that's actually capable of displaying 10 bits per color.

As far as Intel iGPU goes – ask Intel why.

And don't call 10-bit color HDR; they are not the same thing.

I looked at a 455 machine in the Apple Store. System Profiler reported 8 bits per channel.
 
Which screen did you connect it to?

(the built-in screen is an 8-bit panel, like most laptops on the market, so it obviously reports 8-8-8-8 for it)

Maybe that's the reason, but if the MBP screen is supposed to support P3 then it is supposed to be a wide-gamut monitor.

If someone has connected the new MBP to an Eizo or NEC, they could confirm with a screenshot whether the GPU drivers output 10-bit to a professional monitor.
 
Maybe that's the reason, but if the MBP screen is supposed to support P3 then it is supposed to be a wide-gamut monitor.
Gamut and bit depth are not the same thing.
If someone has connected the new MBP to an Eizo or NEC, they could confirm with a screenshot whether the GPU drivers output 10-bit to a professional monitor.
C'mon... it's MacRumors; most people here only whine about not being able to connect their iPhone with the included cable.
 
That's why I prefer the term colour space and like to refer to the CIE 1931 chart. But that's too much for marketing departments to explain to consumers, so they invent nonsense such as 'HDR colour'.

I respect everyone's point of view, and being technical is great and all, but thanks to this thread at least we now know the MBP 2016 display is only 8-bit. My question is simple: the 4K signal coming out of the MBP 2016 goes through a USB-C port, which crosses into Intel's hands since it's on the chipset, so can the Radeon Pro 460 still push out 10-bit color, which is what the HDR specs call for? Someone else asked what cable I was using: definitely an Apple dongle to an HDMI 2.0 cable.

The cold fact is, unless I'm doing something wrong, that the MBP 2016 with Touch Bar does NOT support HDR in either of its specs, HDR10 or Dolby Vision, which goes all the way up to 12-bit color.

It's a shame it's confusing, and disheartening, that a $249 Xbox One S or Sony PS4 Pro has no problem while a nearly $3,000 laptop lacks this simple ability should I decide to watch a 4K video source on a big-screen 4K TV.

http://www.howtogeek.com/260962/hdr-format-wars-whats-the-difference-between-hdr10-and-dolby-vision/
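
For some context on those numbers: HDR10 is built around 10-bit video and Dolby Vision allows up to 12-bit, versus the 8-bit most panels use, and the practical difference is how finely each colour channel can be graded. A quick back-of-the-envelope sketch (pure arithmetic, nothing device-specific):

```python
# Rough numbers behind "8-bit vs 10-bit vs 12-bit": how many steps each depth
# gives per colour channel. More steps means finer gradients and less visible
# banding, which is why the HDR formats specify 10-bit (HDR10) or up to 12-bit
# (Dolby Vision) rather than the usual 8-bit.
for bits in (8, 10, 12):
    steps = 2 ** bits
    print(f"{bits:>2}-bit: {steps:>4} steps per channel "
          f"(smallest step = 1/{steps - 1} of full range)")
```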
 
We have been using 10-bit monitors on Macs for years without true support. As long as you know what you are doing you can get around the limitations - only people doing specifically colour-critical work need it. For print, 10-bit output hasn't been essential because CMYK has a smaller gamut than sRGB or Adobe RGB, due to the fact that it can't produce native reds, greens and blues.

El Capitan was the first Mac OS to support 10-bit output, but ONLY if the graphics driver supports it.

You have to go to the System Information app and look at the Graphics/Displays section. Under Pixel Depth, if it says:

32-Bit Color (ARGB8888)

then that is 8-bit per channel. If it says:

30-Bit Color (RGB101010)

then that is 10-bit per channel. Note that 30-bit means more colour precision than 32-bit, not a wider gamut, because the 32-bit figure counts the alpha channel too (so it's really only 24 bits of colour).

Only two Macs have this 30-bit Color driver - the iMac 5K and the Mac Pro 6,1.

The new MacBook Pros output 10-bit colour to external displays, even the 13-inch. It's right on Apple's tech specs page (billions of colours instead of millions).
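
To connect Apple's "millions vs billions" wording back to the strings System Information shows earlier in the thread, here is a small hand-written breakdown of the two pixel formats (nothing queried from the system; the per-channel split is just spelled out for illustration):

```python
# Illustrative breakdown of the two Pixel Depth strings quoted in this thread.
# "32-Bit Color (ARGB8888)" is 8 bits of alpha + 8 bits each of R, G, B,
# so only 24 bits actually describe colour; "30-Bit Color (ARGB2101010)"
# is 2 bits of alpha + 10 bits each of R, G, B.
FORMATS = {
    "ARGB8888":    (8, 8, 8, 8),      # alpha, red, green, blue
    "ARGB2101010": (2, 10, 10, 10),
}

for name, (a, r, g, b) in FORMATS.items():
    colour_bits = r + g + b
    colours = 2 ** colour_bits
    label = "billions" if colours >= 10 ** 9 else "millions"
    print(f"{name}: {colour_bits} colour bits + {a} alpha -> {colours:,} colours ({label})")

# ARGB8888:    24 colour bits + 8 alpha ->    16,777,216 colours (millions)
# ARGB2101010: 30 colour bits + 2 alpha -> 1,073,741,824 colours (billions)
```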
 
Buy an Xbox One S then, problem solved. A dedicated multimedia device is probably a better device to be using to watch films on.
 
You've linked to two competing television standards, HDR10 and Dolby Vision. No professional display or computer should ever come preconfigured for these entertainment standards, otherwise our colour work for other media will be screwed.

If you want to create content for either of these standards, you would use a separate preview monitor connected to your editing suite. If you want to play video games and movies in either of these standards, you buy a console or a video player.

Can we please get something straight on this forum: a PRO laptop doesn't mean an expensive entertainment laptop for gamers and film fans.
 
being technical is great and all
What's the point of wanting something you don't even understand?

But my questions is simple the 4K signal coming out of the MBP 2016 is coming out of a USB-C port that crosses into Intels hands since it's on the chipset the Radeon Pro 460 even if it's capable of pushing out 10-bit color because thats what the HDR Specs callss For someone else asked what cable I was using and definitely a Apple dongle to An HDMI 2.0 cable.
WUT?
 
It was rather grammatically confusing. You don't often see questions that don't end in a question mark either, nice touch.

Umm, I think it was something along the lines of: even though the AMD cards support it, does the fact that the signal goes through the USB-C port block HDR? Or something like that; I have no idea anymore. I'm going to go ahead and bash my head on the keyboard for a bit, if that's alright with everyone.

sadfh fs duf 230 had asfa dia fdfsa-adsfu0dafdsa u aduf2 u[aaf asfagh bt 8fet t yt4y8a dfsds adya sday9 8dh8 8. we 89dfahdfdsafhu dsafyfsddsds sddfsa8 j afdsyyduha2sdfapo dsa fdas
 
I am a pro software developer and Macs are fantastic. As for the screen... it's a lot better than the ones at the office (Dell), and the same goes for the trackpad, portability, battery life, build quality... everything. And besides, OS X! :)

I guess other types of artists (music, video) may have other needs...
 
Doesn't Samsung have the only other laptop with an "HDR" mode? That's it, and it only came out a few months ago.

And I believe Kaby Lake will bring 10-bit HDR encoding, so the question is: why wasn't Apple first to market with it?
 
The simple answer is that even if you hook up your $3,000-ish MBP to an HDR TV set or monitor, it cannot feed that display with enough data or color to produce HDR. And that is a problem.
By that statement it's pretty clear you haven't a clue what you are talking about. Thanks for the laugh, though...
 
Also, HDR10 requires HDMI 2.0a or DisplayPort 1.4, and those aren't part of the Thunderbolt 3 standard. 10-bit support isn't enough for HDR; you also need hardware compatibility with the Rec. 2020 color space and the Perceptual Quantizer transfer function.

I don't think any of the current laptops support it yet. And the big question is whether we care.
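
For anyone wondering what the Perceptual Quantizer actually is: it's a fixed transfer curve (SMPTE ST 2084) that maps absolute luminance, up to a 10,000-nit peak, onto the signal range, rather than the relative gamma curve SDR uses. A minimal sketch of the encode direction using the constants from the standard (illustrative only):

```python
# Minimal sketch of the SMPTE ST 2084 "PQ" curve used by HDR10 and Dolby Vision:
# map absolute luminance in nits (up to a 10,000-nit peak) to a normalised
# 0..1 signal, which HDR10 then quantises to 10-bit code values.
M1 = 2610 / 16384          # 0.1593...
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance (cd/m^2) -> normalised PQ signal."""
    y = max(nits, 0.0) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

for nits in (0.1, 100, 1000, 10000):
    print(f"{nits:>7g} nits -> PQ signal {pq_encode(nits):.3f}")

# 100 nits (a typical SDR peak) lands around 0.51 of the PQ range;
# 1,000 nits around 0.75, so most code values go to the dark and mid tones.
```

The Rec. 2020 requirement is separate from this curve: it specifies which primaries the RGB values refer to, i.e. the gamut, not the tone curve.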
 
This thread... holy cow, do people not understand what they're talking about, nor, apparently, how to read the specs on Apple's own website. It says right on the spec page that the built-in display of the 2016 MacBook Pros can only show millions (8-bit), not billions (10-bit), of colors.

If you're disappointed now, I'll tell you why you'll be disappointed next year: next year's Macs using Kaby Lake processors will not support hardware-accelerated playback of the AV1 video codec, which Google and others are pushing as the standard video codec for UHD Premium content over the less efficient and royalty-fee-ridden H.265. Blame Intel, the video standards bodies, ISPs, and the HDMI group for getting us into this mess.
 
Maybe because HDR10 is an inferior spec compared to, say, Dolby Vision, HLG, or the as-yet-unnamed system from Philips/Technicolor. The only advantage HDR10 has is that it's an open standard being pushed by, you guessed it, TV makers. I have seen Dolby Vision screens at 2,000 nits brightness and it's amazing. Way too early in the HDR standards wars to choose a side yet. But hey, don't all "true professionals" use TVs as monitors? :)
 