With the impending debut of AppleTV+, methinks any focus on maintaining 220 ppi is, quite simply, sniffing down the wrong trail. I've got to believe that Apple does not want to showcase its new TV+ service on its own computers with their now-dated 500-nit-max displays. The upcoming hardware releases, IMHO, are going to be all about resolution, brightness, HDR and, of course, speed. All certainly worthy of a full-scale rollout event at some point, I should think, especially with the holiday season soon to be upon us.
While the Pro Display XDR will be providing 1,000 nits sustained / 1,600 nits peak brightness at only 218 ppi, methinks Apple tipped its hand a bit with that announcement, letting the world know that its "next big thing" is HDR display (research and) technology.
I feel that they further tipped that hand with the new iPhone 11 Pros delivering 800 nits (max typical) / 1,200 nits (max HDR) at 458 ppi, so it should not be any stretch to imagine a full DCI 4K (4096x2160) 16" MBP display coming in at around 290 ppi (rough estimate by my math, ha!) with similar or better nit performance. The way I'm seeing things, the next hardware releases are all going to be about enjoying the latest AppleTV+, on the go, at resolutions and brightness levels well beyond what competitors can deliver on their hardware. "That's where the puck is going," IMHO.
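For anyone curious about the ppi math, here's a quick back-of-the-envelope check. The 16-inch diagonal and DCI 4K resolution are my speculation, not anything Apple has announced:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixel-diagonal length divided by the physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Hypothetical 16" MacBook Pro panel at DCI 4K (4096x2160)
print(round(ppi(4096, 2160, 16.0)))  # → 289
```

So a DCI 4K 16" panel would land just shy of 290 ppi, comfortably above the current 220 ppi Retina target.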
Add to that...
My new Panasonic Lumix S1 can shoot both stills and video with a wide color gamut and 10-bit HDR, but I have no way to view, work with, or enjoy that content at its intended brightness and color space short of plugging the camera into an HDR HLG-compliant TV. It's a real-life problem that needs immediate solving, both for content creators (read: video/photo editors) and for the average consumer who is now getting their hands on this level of image/video capture technology. That HDR "puck" is already in the net, with no way to accurately work with it or enjoy it properly on any Mac.
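For context on why that 10-bit capture matters: bit depth sets how many tonal steps per channel a file can hold, and the jump from 8-bit to 10-bit is what lets HDR gradients avoid visible banding. A trivial illustration:

```python
# Tonal levels per channel at common bit depths.
# 10-bit gives 4x the steps of 8-bit, which is why HDR formats
# like HLG and HDR10 call for at least 10 bits per channel.
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} levels per channel")
# 8-bit: 256 levels per channel
# 10-bit: 1024 levels per channel
# 12-bit: 4096 levels per channel
```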
Watching Apple's latest releases (AppleTV+, Apple Arcade, the Metal API, FCPX, the Mac Pro w/ Pro Display XDR, the iPhone 11 Pro), I'm feeling confident that the folks in Cupertino have been R&Ding "the HDR content creation and delivery problem" for some time now, and that we're beginning to see the products that solve it roll out. A 16", 4K, 1000-nit HDR-display MBP is my thinking... hoping the same for iMacs and iPad Pros soon to follow!
Well, that's how I'm calling it.
The main content delivery devices for Apple to showcase the HDR content they have with AppleTV+ are going to be the iPhone and the iPad. Those are their main drivers right now and for some time to come. Even an Apple TV 4K is subject to the whims of the ultra-expensive or ultra-cheap 4K UltraHD "HDR" TVs it is currently connected to.
HDR content is coming, but it is coming slowly, because a 10-bit DCI 4K HDR workflow is still in the nascent stages of life and is much more resource-intensive than 2K content. The Mac Pro and the Pro Display XDR are changing that, but I would say 99% of consumers who own 2K or 4K TVs don't even have the ability to view HDR content on them, and that is unlikely to change for quite some time. Sure, building your workflow to deal with 10-bit DCI 4K HDR content is a great thing to do, but it's not cheap and it's not common yet. I suspect that 2020 is going to be a breakout year for this, but the majority of video editors/content creators are going to have to continue releasing SDR content for the majority of their paying projects, even as they are asked to provide HDR content that they may or may not have the workflow to deal with now.
The Panasonic S1 is a great camera and I would love to have an S1H myself, but HDR content is going to take quite a bit of time to mature and filter into the mainstream, and it's going to be mostly limited to niche distribution and streaming services where you pay an upcharge for that content.
As for the idea that the 16” MacBook Pro must have a display capable of showing UHD or DCI 4K HDR content at 1:1, I say “NUTS”. Plenty of people deal with lower-res proxies on a day-in/day-out basis. Having a decent mastering monitor (read: one you can actually afford) that is color managed, and someone who can actually grade color for HDR and SDR content, is probably your weakest link in the whole chain from a cost and talent perspective. All the technology in the world can only get you so far. Color management from end to end is not for the faint of heart or the weekend wedding photographer. EDIT: https://blog.frame.io/2019/10/14/grading-mixed-delivery/
The “average” consumer has had 10-bit and HDR for a few years now and they aren’t any closer to knowing what the hell they are doing than when they first discovered and paid for LOG upgrades. Color grading and knowing how to make decent SDR content from HDR content is not something most people just suddenly gain the skills for overnight with that shiny new camera they bought.
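To illustrate why the HDR-to-SDR step takes real judgment: even the simplest global tone-mapping operator (Reinhard's, sketched below; the normalization convention is my assumption) makes creative trade-offs about how highlights get compressed, and that's before any per-shot grading:

```python
def reinhard_tonemap(luminance):
    """Reinhard global operator: compresses [0, inf) linear scene
    luminance into [0, 1) for an SDR display. Mid-tones pass through
    nearly unchanged; bright highlights are heavily crushed."""
    return luminance / (1.0 + luminance)

# Reference white (1.0) lands at 0.5 on the SDR display, while a
# highlight 10x brighter in the scene ends up under 2x brighter on
# screen (0.5 -> ~0.91), flattening the HDR look.
print(reinhard_tonemap(1.0))
print(reinhard_tonemap(10.0))
```

Whether that flattening looks "right" is a grading decision, not something a formula can settle for you.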
Besides, a 16” 4K, 1000-nit HDR-display MacBook Pro would easily start at around $5K and only last around 4-5 hours on battery at those sustained brightness levels, not to mention the display itself is going to get very hot until the technology catches up to the DisplayHDR specifications - https://displayhdr.org/
The tech will get there. The question is how many consumers will actually see the benefit and will content creators bother to take the time and money necessary to provide widespread, mainstream content in HDR. Or rather, will their budget and their clients allow for it?
That’s how I’m calling it.