Wouldn't make a difference because there isn't any 8k content available.

----------



No, because again there isn't any content being produced in this format.

Why does everyone think you need native resolution content to fully use a display? 4K or even 8K resolution for a computer monitor offers much more than just the ability to view 4K and 8K content.
 
I love that it is super fast, but can I really ask them to handle ten times the content so that I can watch a slightly sharper version of The Walking Dead?


The Walking Dead is shot on 16mm film for that grainy look we all enjoy and appreciate, which tops out at roughly the equivalent of 2K resolution anyway.
 
And here I am, still contemplating getting a 2160 x 1440 external monitor for my MBPr. Technology is moving way too fast!!!!
 
8K displays. Finally room to edit with two 4K source/program monitors side by side on one display. Just because 8K doesn't work for you doesn't mean we should stop progressing forward with display tech! By the logic I've seen so far in this thread we should all be using 200 MHz laptops with 14-day battery life and 800x600 displays.

I get it, you want better batteries. Well, I spend my life plugged in at a desk, so I'd prefer an ultra-high-res display and many high-speed cores in my CPU.

Let's try and meet in the middle. You can get better battery life, and I'll hold off on the 16K display and settle for 8K, and everyone wins.
 
I have the same question. Retina is awesome, and 5K is too, but the difference between Retina and 5K is very small to the average user. And what will 8K do? I am all for better technology, but it has to be useful for something other than boasting that I have more pixels than you do. So my question is: what is the use case for 8K?

5K is Retina (2560x1440 HiDPI); you talk as if they're different things. As for 8K, one can only speculate as to whether it will make a huge difference, but the difference certainly won't be as big as between Retina and non-Retina.

Edit: Unless you're talking about 5K on a rMBP. If so, I doubt there would be a hugely noticeable difference between that and 8K, and none at all from a normal viewing distance.
 
8K displays. Finally room to edit with two 4K source/program monitors side by side on one display.

This is the case if the displays are not-Retina'd.

A 40" 8K non-Retina display will have very small icons...

A 27" 8K non-Retina will be impossible to use. A 27" 5K non-Retina is too small.
The current 27" 5K Retina'd looks great.

Maybe 8K will give the option between Retina 2x and Retina 1x?
Screens > 50" could use "native" 8K...
 
Consumers took a huge step backwards quality-wise by opting for convenience instead. It's part of the lazy culture we have become. Why get off your ass and rent a movie on Blu-ray when you can rent something at twice the price on your TV or set-top device instead? The same thing happened with music, when consumers opted first for CD "quality" and then settled for highly compressed MP3s. The average consumer is oblivious to how much better things can look and sound.

I agree with you on principle, and I certainly do make the effort to get Blu-Ray for all the things I truly want the best experience from.

OTOH, I grew up listening to music I recorded on a cassette tape off of FM radio played back on my walkman, and I loved every bit of those songs. It's a tough call to say that my life or feelings about the tunes at that time would have been made better by higher quality recordings.

Sure, it matters... I just think there is a tipping point. When I rent a movie off of iTunes, I assure you midway through I'm thinking "cool film... like the heroine... Ooo! Nifty plot twist!" I've never found myself thinking, "Damn, I'd be enjoying this so much more if I could hear more low end on the sub."
 
This is the case if the displays are not-Retina'd.

A 40" 8K non-Retina display will have very small icons...

A 27" 8K non-Retina will be impossible to use. A 27" 5K non-Retina is too small.
The current 27" 5K Retina'd looks great.

Maybe 8K will give the option between Retina 2x and Retina 1x?
Screens > 50" could use "native" 8K...

2x or even 4x GUI elements, similar to what they use now on the 5K display. Getting nearly all of my preview/program monitor on one display would be super.


Edit:

You all want more 4K footage? Help us video people out and speed up the development of 4K-or-larger monitors (and especially GPUs). It's a bit of a pain to edit 8K video on a 1080p screen.
I'm not asking to game at 120 fps on 8K, but I'd love a consistent 60 Hz at any size for video editing.
 
I mean, I get it that 8k is better than 4k, but is this something realistic for a laptop? Put another way, given the distance at which laptops are used and the realistic constraints on their size, does a 15" 8k display even make sense?

Maybe not.. But an 8K MBP 17"

:eek::eek::eek:
 
What's the point? How about something useful, like FINALLY updating the Cinema Display with Thunderbolt 2 and a 4K panel, and adding GPU-over-Thunderbolt support to OS X.
 
8K displays. Finally room to edit with two 4K source/program monitors side by side on one display. Just because 8K doesn't work for you doesn't mean we should stop progressing forward with display tech! By the logic I've seen so far in this thread we should all be using 200 MHz laptops with 14-day battery life and 800x600 displays.

I get it, you want better batteries. Well, I spend my life plugged in at a desk, so I'd prefer an ultra-high-res display and many high-speed cores in my CPU.

Let's try and meet in the middle. You can get better battery life, and I'll hold off on the 16K display and settle for 8K, and everyone wins.

99% of current computers can't even get a decent frame rate on one 4K screen, let alone two. Maybe we need to sort that out before moving on to 8K (though I'm all for forward-looking standards). Not many computers need gigabit internet either, but we got that too.
 
Is it just me, or is this pointless?

8K resolution displays? Why?

I'm not saying you can't find some niche case where it's really useful -- but this reminds me a lot of the race to have more megapixels of resolution with digital cameras. Diminishing returns past a certain point, where it becomes "bragging rights" and marketing rather than a useful improvement for most uses.

We've still got the problem where at 4K resolution, most video chipsets aren't capable of pushing that many pixels around at speeds fast enough to please people doing 3D animation work or playing 3D games. Now we're racing to double those requirements?
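For a sense of scale, here's a rough pixel-count comparison (a quick Python sketch; standard UHD dimensions assumed for 4K and 8K):

Code:
# pixel counts for common resolutions (UHD dimensions assumed)
resolutions = {"1080p": (1920, 1080), "4K UHD": (3840, 2160), "8K UHD": (7680, 4320)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP ({px / base:.0f}x 1080p)")
# 8K is 4x the pixels of 4K and 16x the pixels of 1080p,
# so the GPU workload grows at least that fast.

In other words, going from 4K to 8K doubles the resolution in each direction, which means four times the pixels to push.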
 
8K displays. Finally room to edit with two 4K source/program monitors side by side on one display. Just because 8K doesn't work for you doesn't mean we should stop progressing forward with display tech! By the logic I've seen so far in this thread we should all be using 200 MHz laptops with 14-day battery life and 800x600 displays.

I get it, you want better batteries. Well, I spend my life plugged in at a desk, so I'd prefer an ultra-high-res display and many high-speed cores in my CPU.

Let's try and meet in the middle. You can get better battery life, and I'll hold off on the 16K display and settle for 8K, and everyone wins.

What you want to do can be better achieved with two 4K displays anyway; I don't see why you would need an 8K one.
 
OK, YOU'RE the niche use case... but...

As soon as 8K rolls out, we'll hear folks like you demanding 16K screens so you can "edit two 8K displays side by side on one screen"! :)

I'm not suggesting we should halt progress, but I'm suggesting some of these "advancements" come about because it's the easiest part of the equation to advance - without regard for what's really practical.

(E.g. it's relatively easy to create a new standard for cabling, to say "This connector and cable is now officially certified as carrying THIS high a resolution of a signal." Boom ... new standard defined. Except now you're left figuring out how to build all the surrounding hardware to actually use that capability, and THAT'S where the bottlenecks and incompatibility issues come from that frustrate real users every day.)


8K displays. Finally room to edit with two 4K source/program monitors side by side on one display. Just because 8K doesn't work for you doesn't mean we should stop progressing forward with display tech! By the logic I've seen so far in this thread we should all be using 200 MHz laptops with 14-day battery life and 800x600 displays.

I get it, you want better batteries. Well, I spend my life plugged in at a desk, so I'd prefer an ultra-high-res display and many high-speed cores in my CPU.

Let's try and meet in the middle. You can get better battery life, and I'll hold off on the 16K display and settle for 8K, and everyone wins.
 
Since Apple can make their own chips, I suppose they don't have to wait for Intel to ramp a general-purpose chip for 1.4a. I do wonder, therefore, if Apple can make an external display connector that can support 8K, or, with a splitter, 8x1K or 2x4K. Now that 4K content has gone "to consumer" (Netflix...), I suspect theatrical movie display resolution will migrate to 8K+, and with Red Cine offering up to 24K cameras, and the ability to natively edit that in Final Cut Pro already, we may see a high-resolution renaissance.

Rocketman
 
Again: the short-term need for this is to be able to drive multiple 4K and 5K displays over a single DisplayPort cable. This is already a limitation, as most of the laptops currently on the market can drive only a single 4K display. People are already running into the limits of DisplayPort bandwidth.
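A rough bandwidth estimate shows why. This is a quick Python sketch of the uncompressed video data rate, ignoring blanking overhead; the link rates in the comments are the published DisplayPort payload figures:

Code:
# uncompressed video bandwidth in Gbit/s, ignoring blanking overhead
def gbps(w, h, hz, bits_per_pixel=24):
    return w * h * hz * bits_per_pixel / 1e9

print(f"4K @ 60 Hz: {gbps(3840, 2160, 60):.1f} Gbit/s")  # ~11.9
print(f"5K @ 60 Hz: {gbps(5120, 2880, 60):.1f} Gbit/s")  # ~21.2
print(f"8K @ 60 Hz: {gbps(7680, 4320, 60):.1f} Gbit/s")  # ~47.8
# DisplayPort 1.2 carries roughly 17.3 Gbit/s of payload and DP 1.3/1.4 roughly 25.9 Gbit/s,
# so 5K already overflows a single DP 1.2 link, and 8K at 60 Hz needs compression
# or multiple links even on DP 1.4.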
 
This is the case if the displays are not-Retina'd.

A 40" 8K non-Retina display will have very small icons...

A 27" 8K non-Retina will be impossible to use. A 27" 5K non-Retina is too small.
The current 27" 5K Retina'd looks great.

Maybe 8K will give the option between Retina 2x and Retina 1x?
Screens > 50" could use "native" 8K...

Retina is just a marketing term used by Apple when a device gets to a certain level of pixel density. Past a certain resolution everything becomes 'Retina'; it's not a technology, it's a marketing term!
 
The problem is that consumers are still happily and obliviously consuming content of poorer quality than their TVs are capable of, and Apple is to blame...

Don't know why, but this just came to mind.

audiophiles.png


Though we have a fairly 'above average' setup (2014 "high end" TV and NAD + B&W sound), I'll have to admit that we settled for convenience (iTunes + Netflix). We do watch the occasional Blu-ray movie from time to time, but on a day-to-day basis, less will certainly do.

I'm having a hard time understanding this urge for more resolution - be it visual or audio. My personal gripe is more about compressed sound having less definition and "oomph" than about whether New Spock is in HD, Full HD or 8K.

It doesn't - at least for me - make the movie better or worse. The story is the same.
 
Retina is just a marketing term used by Apple when a device gets to a certain level of pixel density. Past a certain resolution everything becomes 'Retina'; it's not a technology, it's a marketing term!

Not true.
A "Retina'd" screen has very high pixel density but keeps the screen real estate that's considered "normal".

In the iMac's case:
The Retina 5K panel is 5120 x 2880 (= 2 x 2560 by 2 x 1440).
Without "Retina" scaling, the screen would have a "real estate" of 5120 x 2880 = very small icons.
It actually uses the same screen real estate as the normal 27" (2560 x 1440), but 4x sharper.

That is what the term "Retina" means.
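If it helps, here's the same idea as a tiny Python sketch (the "points vs. pixels" framing is how HiDPI rendering is usually described; nothing here is Apple's actual code):

Code:
# Retina / HiDPI in a nutshell: the UI is laid out in "points",
# and each point is backed by a 2x2 block of physical pixels.
scale = 2
points = (2560, 1440)                            # logical desktop size (what icons are sized against)
pixels = (points[0] * scale, points[1] * scale)  # physical panel: 5120 x 2880

print(f"logical:  {points[0]} x {points[1]} points")
print(f"physical: {pixels[0]} x {pixels[1]} pixels ({scale * scale}x the pixels of a non-Retina 27-inch)")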
 