
TheRealAlex (macrumors 68040, Original poster) · Sep 2, 2015
Apple MacBook Pros are for Pros; only a small fraction of users are what can be called professionals in their fields.

I am not, however. If I purchase a Pro-level device, I expect best in class.

And the last-gen Intel chips in the 2016 Touch Bar MacBook Pros cannot, or do not, support 10-bit color and HDR.

Now, the displays are quality, no doubt, but even when connected to an OLED TV or external monitor they don't support industry standards that are so important for enjoying a wide range of colors.

Do you guys feel limited or misinformed by Apple?
 

It can play it; it just has to decode it on the software side rather than with native hardware decoding.
 

"Pro" to me is someone who writes and sells their own music.

Why would I care about the display?
 
Ey? The displays are P3; by definition this means they are 10-bit... Is HDR an industry standard? Can you give me an example of another laptop with this feature, so I can see what a real 'best in class' laptop display looks like?

Do you feel misinformed?

He means laptops have the ability to play 10-bit via the Kaby Lake processors, and there are Kaby Lake laptops out there with 100%+ Adobe RGB screens, I believe?

Also, it isn't so much about the laptop screen, because you can connect them to HDR-capable displays.
 
There is no newer CPU the 15-inch could have, and that's the real Pro device.
 
A professional is anyone who is making money in his field, not the elitist elite you are trying to imply.
Also, Intel hasn't yet released Kaby Lake processors that would suit the MacBook Pro; only one 15W part has been released. So your post is basically useless.
 

I am an engineer who needed CPU power with portability and doesn't care about 10-bit or HDR. Before anyone tells me I could have gone with another laptop or a previous-gen MBP: yes, I could have, but I wanted something more future-proof with better resale value. Not every "professional" does graphics or photography.
 
What's your point here? You say you don't care about it, and yet you wanted something with better resale value? These seem to contradict each other.
 
I don't see how a newer version of the MBP having better resale value and me not caring about HDR contradict each other. Please elaborate.
 
In order for something to have a better resale value it needs to support newer features, such as a P3 display. I appreciate you don't care about this display, but realistically it hasn't added any price onto the product, and you have to move forward one day.
 
And the last-gen Intel chips in the 2016 Touch Bar MacBook Pros cannot, or do not, support 10-bit color and HDR.

What does this even mean? I've been playing around with 16-bit color and HDR for years now. Or do you mean built-in video decoding or something like that?
 
Essentially, a 10-bit display can show substantially more colours than an 8-bit one. It's most noticeable on things like gradients, where there are really subtle transitions.
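To see where that banding comes from, here's a quick Python sketch (my own illustration, not from the thread): it quantises a smooth grey ramp at each bit depth and counts how many distinct steps survive. Fewer steps means coarser transitions, which the eye picks up as bands in a gradient.

```python
# Quantise a smooth 0..1 grey ramp at a given bit depth and count
# the distinct output levels -- fewer levels means visible banding.
def distinct_levels(samples: int, bits: int) -> int:
    maxval = (1 << bits) - 1  # 255 for 8-bit, 1023 for 10-bit
    levels = {round(i / (samples - 1) * maxval) for i in range(samples)}
    return len(levels)

print(distinct_levels(4096, 8))   # 256 distinct grey steps
print(distinct_levels(4096, 10))  # 1024 distinct grey steps
```

A 10-bit panel gets four times as many steps per channel across the same gradient, which is why the subtle transitions look smooth instead of stepped.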
 
Ey? The displays are P3; by definition this means they are 10-bit... Is HDR an industry standard? Can you give me an example of another laptop with this feature, so I can see what a real 'best in class' laptop display looks like?

Do you feel misinformed?

The standard is called Ultra HD Premium.

http://www.trustedreviews.com/opinions/ultra-hd-premium

Minimum resolution of 3,840 x 2,160 – This is the simple part as this is the resolution – the number of pixels that make up the TV's screen – of 4K/Ultra HD TVs. There can be no confusion here.

10-bit colour depth – This means that the TV must be able to receive and process a 10-bit colour signal, which refers to the number of colours a video signal contains. Blu-rays use 8-bit colour, which equates to just over 16 million individual colours.

10-bit colour, often called 'deep colour', contains over a billion colours. This doesn't mean the TV has to be able to display all those colours, only that it can process the signal. Most decent ones can, so there's no problem here.
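Those colour counts are just exponent arithmetic: three channels times N bits per channel. A quick Python check (my own, not part of the article):

```python
# Total representable colours for an RGB signal at a given bit depth:
# each of the three channels gets 2**bits levels, so the total is the cube.
def total_colours(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"{total_colours(8):,}")   # 16,777,216  (~16.7 million, 8-bit)
print(f"{total_colours(10):,}")  # 1,073,741,824  (~1.07 billion, 10-bit)
```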

Minimum of 90% of P3 colours – 'P3' is what's known as a 'colour space', a standard that defines the colour information in a video stream. Colour spaces exist to ensure that the picture you see at home looks right. Think of it as the language of colour in the same way English is a language with rules people agree on.

To qualify as an Ultra HD Premium TV, a TV must be able to display 90% of the colours defined by the P3 colour space. This number is what's referred to as the colour gamut, or the number of colours a display can actually handle. So, a TV that can show '90% of P3 colours' would be said to have a 90% colour gamut.

The higher the number, the richer and more accurate the colours on a TV.

[Image: rgb-color-space-gamut comparison chart] This is a comparison of different colour spaces. sRGB / Rec. 709 is the standard for current TVs and it covers only 80% of the colours available using the DCI-P3 colour space. (Image credit: Noteloop)

Minimum dynamic range – If your head is hurting now then things are only getting worse from here on in. Sorry. To qualify, TVs have to meet a minimum standard for the maximum brightness they can reach and the lowest brightness – known as black level – they can achieve.

Sounds simple right? Wrong. That's because there are two different standards. They are:

OPTION 1: More than 1,000 nits peak brightness and less than 0.05 nits black level

OPTION 2: More than 540 nits brightness and less than 0.0005 nits black level

The observant among you will notice that one demands higher peak brightness and accepts a higher (and therefore inferior) black level, while the other accepts a lower peak brightness but demands much lower (and therefore better) black level.

This is to accommodate the pros and cons of different TV technologies. LED TVs, which form the majority of TVs sold, support higher brightness but inferior black levels. OLED, meanwhile, can produce stunningly deep blacks, but aren't as bright.

In other words, the alliance has found a way to make everyone happy. Hurrah!
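Those two qualification options are easy to express as a check. A small Python sketch (my own, using only the nit figures quoted in the article above):

```python
# Does a display meet the Ultra HD Premium dynamic-range requirement?
# It qualifies if it satisfies EITHER of the two options described above.
def uhd_premium_dynamic_range(peak_nits: float, black_nits: float) -> bool:
    option1 = peak_nits > 1000 and black_nits < 0.05    # LED-style: brighter, greyer blacks
    option2 = peak_nits > 540 and black_nits < 0.0005   # OLED-style: dimmer, near-black
    return option1 or option2

print(uhd_premium_dynamic_range(1100, 0.04))    # bright LED panel: True
print(uhd_premium_dynamic_range(600, 0.0001))   # OLED panel: True
print(uhd_premium_dynamic_range(600, 0.04))     # meets neither option: False
```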
 
Essentially, a 10-bit display can show substantially more colours than an 8-bit one. It's most noticeable on things like gradients, where there are really subtle transitions.

I know what a 10-bit display is (btw, do we know for sure whether these displays are true 10-bit or not?). I don't understand, though, what the OP means by 'Intel chips' not supporting 10-bit.
 
The simple answer is that even if you hook up your $3,000-ish MBP to an HDR TV set or monitor, it cannot feed that display with enough data or colour to produce HDR. And that is a problem.
 
I seriously don't understand. Pros dock to a Thunderbolt Display. On the road you can't control the room's ambient light and other things.

The display rocks, as do the SSD and other aspects. Sheesh, it's a laptop. Get an iMac.
 
The simple answer is that even if you hook up your $3,000-ish MBP to an HDR TV set or monitor, it cannot feed that display with enough data or colour to produce HDR. And that is a problem.

Intel doesn't have anything to do with it; they use AMD cards, which support it as far as I know.
 