I hope you’re right, but I fear the industries in play have other plans...

You literally won’t be able to tell the difference between 4K and 8K if you’re watching content on a 65” or smaller set from around 10 feet or more. It’s a really hard sell because the jump in quality is nothing like the move from SD to HD, or from HD to UHD. Both of those transitions were markedly noticeable on standard-sized TV sets. 77” sets and larger will primarily benefit from 8K, but then again, how many households would even want to go that big?

But you’re right about the industry having other plans. I’m sure they’re working up “the next big thing” in TV technology to sell, whether it’s a higher form of HDR or newer types of display panels. I just don’t believe it’s 8K, as we’ve pretty much reached the theoretical limit of what the eye can perceive in terms of resolution.
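For anyone who wants to sanity-check the viewing-distance claim, here's a rough back-of-the-envelope calculation. It assumes the commonly cited ~60 pixels-per-degree figure for 20/20 acuity, which is a simplification of how vision actually works:

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in):
    """Approximate pixels per degree of visual angle for a 16:9 screen."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # screen width from diagonal
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_px / fov_deg

# 65" screen viewed from 10 feet (120 inches)
ppd_4k = pixels_per_degree(65, 3840, 120)   # ~145 ppd
ppd_8k = pixels_per_degree(65, 7680, 120)   # ~289 ppd

# 20/20 vision resolves roughly 60 ppd, so 4K already exceeds it here
print(round(ppd_4k), round(ppd_8k))
```

By this simple model, a 65" 4K set at 10 feet already delivers more than double the detail 20/20 vision can resolve, so the extra pixels of 8K go unseen at that distance.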
According to proponents, 8K offers more than just more pixels: less aliasing, higher perceived brightness and contrast, better depth of field and tonality, and fewer noise and compression artifacts. An 8K HEVC stream is 84 Mbps; VVC, when released, may reduce that requirement.

https://www.soundandvision.com/content/8k-it-s-about-hyper-realism-not-just-more-pixels

Interesting read. Thanks for posting this!
 
Stop watching broadcast TV! 99% of the content on streaming services is 1080 or above. Any film made or remastered in the last 4-5 years is going to be available in 4K. Similar situation with video games, 4K consoles came out in 2017.

It's not a content issue for broadcast TV. It's that the signal has to be compatible with over the air broadcasts, and the ATSC 1.0 standard required by law only allows 720p or 1080i. So it's a question of which is worse between lower resolution or interlacing.
Eh, the truth is somewhere in between.
The problem is that we are having issues driving the sheer number of pixels for 4K. Remember, it’s 4x 1080p.

4K films on Blu-ray are legitimate, but streaming services are only kinda-sorta 4K. They deliver 4K resolution (4x the number of pixels), but only through massive compression: Netflix’s 4K content is only twice the bitrate of their 1080p, so it’s technically a lower bitrate per pixel. It’s like exporting a higher-resolution JPG but lowering the quality setting.

Same deal with consoles. They are capable of outputting 4K, but not great at driving it. The PS4 Pro has only roughly twice the performance of the PS4, but is expected to draw the same games at 4x the resolution. The vast majority of games (excepting low-poly games, remakes, or 2D) actually run at 2K (1440p) and upscale to 4K in real time, “pretending” to be 4K.

Still looks much better than 1080p, but our consoles are mostly driving upscaled 2K content because the performance requirements are too high.

8K is four times 4K. It will literally eat every performance increase afforded by the next generation of consoles just to support it. Netflix simply CAN’T push the bandwidth without cutting bitrate per pixel further, and I’d rather see higher-bitrate 4K than streamed 8K.
Movies on physical media are the only place true 8K might be possible for at least a few years and a full console generation.
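The resolution multiples in that post are easy to verify. The Netflix figures below (roughly 8 Mbps for 1080p and 16 Mbps for 4K) are illustrative assumptions rather than published specs, chosen to match the "twice the bitrate" claim:

```python
# Pixel counts for the three common resolutions
res = {"1080p": 1920 * 1080, "4K": 3840 * 2160, "8K": 7680 * 4320}

print(res["4K"] / res["1080p"])   # 4.0 -> 4K is 4x the pixels of 1080p
print(res["8K"] / res["4K"])      # 4.0 -> 8K is 4x the pixels of 4K

# Hypothetical stream bitrates matching the "twice the bitrate" claim
mbps = {"1080p": 8, "4K": 16}

bits_per_pixel = {k: mbps[k] * 1e6 / res[k] for k in mbps}
# 4K ends up with half the bits per pixel of 1080p despite the higher bitrate
print(bits_per_pixel["4K"] / bits_per_pixel["1080p"])   # 0.5
```

Whatever the exact bitrates, as long as the 4K stream is only double the 1080p stream, each 4K pixel gets half the data.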
 
I wonder what HomeKit support means. If they got Apple to agree to let them use the TV as a HomeKit hub, does that mean they had to remove all the advertising from the menus and the data collection?

It means you can control it via Siri. I love my LG!
 
Last year, when Samsung revealed their 8K TVs, they framed it as AI being used to improve the upscaling of 720p/1080i content to 8K, not to alter 4K/native content.
Improving upscaling is probably an appropriate use of AI. I get so annoyed when companies overuse technology to the point that it degrades the experience.
 
Netflix’s 4K content is only twice the bitrate of their 1080p, so it’s technically a lower bitrate per pixel. It’s like exporting a higher-resolution JPG but lowering the quality setting.

You can't directly compare bitrate to bitrate, because 4K is compressed with HEVC (or VP9 for Google) while 1080p has traditionally been H.264. On top of that, 4K comes with HDR in 10- and 12-bit formats and more complex sound like Dolby Atmos.

Similarly, we're going to see 8K adopt newer compression like VVC.
 
Stop watching broadcast TV! 99% of the content on streaming services is 1080 or above. Any film made or remastered in the last 4-5 years is going to be available in 4K. Similar situation with video games, 4K consoles came out in 2017.

It's not a content issue for broadcast TV. It's that the signal has to be compatible with over the air broadcasts, and the ATSC 1.0 standard required by law only allows 720p or 1080i. So it's a question of which is worse between lower resolution or interlacing.
Most films shot on film can be converted to 4K too. It’s just up to the studio.
 
Stop watching broadcast TV! 99% of the content on streaming services is 1080 or above. Any film made or remastered in the last 4-5 years is going to be available in 4K. Similar situation with video games, 4K consoles came out in 2017.

It's not a content issue for broadcast TV. It's that the signal has to be compatible with over the air broadcasts, and the ATSC 1.0 standard required by law only allows 720p or 1080i. So it's a question of which is worse between lower resolution or interlacing.
Streamed 1080p or 4K? Did you see the difference between that and a non-streamed (i.e. non-compressed, non-internet) version?

Unless you’re watching an uncompressed FHD/4K/8K Blu-ray, those screens won’t be utilised to even 50% of their potential.

I’m personally waiting for an 8K TV good enough to replace my 3 × 4K PC screens.
 
Someone posted on another site that their recording studio was going to replace one of their 43” 4K displays with an 8K TV on a trial basis, and replace all their 4K displays if they like the results. They’re looking for more room and better clarity.
 
Stop watching broadcast TV! 99% of the content on streaming services is 1080 or above. Any film made or remastered in the last 4-5 years is going to be available in 4K. Similar situation with video games, 4K consoles came out in 2017.

It's not a content issue for broadcast TV. It's that the signal has to be compatible with over the air broadcasts, and the ATSC 1.0 standard required by law only allows 720p or 1080i. So it's a question of which is worse between lower resolution or interlacing.

Ironically, it's actually most of the newer movies (shot digitally) that are not in 4K, while the older ones shot on film are. Some of the best-looking 4K movies out there are decades old. The original Blade Runner, for example, looks sensational in 4K, and it's real 4K at that. Your favorite new blockbusters? Likely not real 4K but upscales (the Avengers series, etc.). Kind of sad.
 
Stop watching broadcast TV! 99% of the content on streaming services is 1080 or above. Any film made or remastered in the last 4-5 years is going to be available in 4K. Similar situation with video games, 4K consoles came out in 2017.

It's not a content issue for broadcast TV. It's that the signal has to be compatible with over the air broadcasts, and the ATSC 1.0 standard required by law only allows 720p or 1080i. So it's a question of which is worse between lower resolution or interlacing.

All true. BUT:
ATSC 3.0 (HEVC, 4K, 120 fps, HDR) has been defined, has been tested, is in use in South Korea, and CAN be used in the US as long as a standard ATSC signal is simulcast.
There is no legislation mandating a transition (not yet, anyway). But I could see that, once the installed base of 4K-capable TVs becomes high enough, stations will start broadcasting in 4K, with a lousy ultra-compressed SD signal (you know what I mean, the stuff you get when 8 SD channels are crammed into one HD channel’s bandwidth) as their mandatory ATSC 1.0 support. That's not a bad outcome: the high end gets better, but the low end still works for people who simply refuse to upgrade.

So when is that “enough 4K TVs” point expected to arrive, at least in major metros? Not a clue.
Apparently there are test transmissions in Phoenix and Dallas.
The main takeaway is: whatever new TV you buy from now on, make sure it is ATSC 3.0 compliant...
 
There’s no change regarding AirPlay and HomeKit compared to last year's models, right?

Now, time for someone to sell me eye upgrades! 😆 I don’t even notice a benefit from 4K moving pictures at comfortable viewing distances. I think we might be reaching “videophile” territory here…
 
88” OLED cost will be over $10k. Unaffordable for 95% of the population.
Well duh.
The same was true of flatscreens 20 yrs ago, and 4K 5 yrs ago...
Eh, the truth is somewhere in between.
The problem is that we are having issues driving the sheer number of pixels for 4K. Remember, it’s 4x 1080p.

4K films on Blu-ray are legitimate, but streaming services are only kinda-sorta 4K. They deliver 4K resolution (4x the number of pixels), but only through massive compression: Netflix’s 4K content is only twice the bitrate of their 1080p, so it’s technically a lower bitrate per pixel. It’s like exporting a higher-resolution JPG but lowering the quality setting.

Same deal with consoles. They are capable of outputting 4K, but not great at driving it. The PS4 Pro has only roughly twice the performance of the PS4, but is expected to draw the same games at 4x the resolution. The vast majority of games (excepting low-poly games, remakes, or 2D) actually run at 2K (1440p) and upscale to 4K in real time, “pretending” to be 4K.

Still looks much better than 1080p, but our consoles are mostly driving upscaled 2K content because the performance requirements are too high.

8K is four times 4K. It will literally eat every performance increase afforded by the next generation of consoles just to support it. Netflix simply CAN’T push the bandwidth without cutting bitrate per pixel further, and I’d rather see higher-bitrate 4K than streamed 8K.
Movies on physical media are the only place true 8K might be possible for at least a few years and a full console generation.

Broadcast TV, remember, is still using MPEG-2.
H.264 is (close enough) 2x as efficient as MPEG-2, and HEVC is 2x as efficient again.
In other words, if you move from 1080p in H.264 to 4K in HEVC, you only need 2x the bandwidth for a totally legitimate 4K “experience”.
Obviously any Apple kit from the past few years supports HEVC. But generic Netflix streaming to some random 4K TV or streaming box? I have no idea how widespread their HEVC support is.
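The arithmetic in that post can be sketched as follows, taking the rule-of-thumb 2x efficiency gains at face value (real-world codec gains vary a lot by content):

```python
# Relative bits needed per pixel for similar quality, normalized to MPEG-2 = 1.0
# (rule-of-thumb figures, not measured values)
codec_efficiency = {"mpeg2": 1.0, "h264": 0.5, "hevc": 0.25}

def relative_bandwidth(pixel_factor, codec):
    """Bandwidth relative to a 1080p MPEG-2 stream, given a pixel-count multiple."""
    return pixel_factor * codec_efficiency[codec]

bw_1080p_h264 = relative_bandwidth(1, "h264")   # 0.5
bw_4k_hevc    = relative_bandwidth(4, "hevc")   # 1.0

# Moving from 1080p/H.264 to 4K/HEVC needs 2x the bandwidth
print(bw_4k_hevc / bw_1080p_h264)   # 2.0
```

The 4x pixel increase and the 2x codec gain multiply out to the 2x bandwidth figure claimed above.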
 
8K??? Well, Australian TV is useless, as we still broadcast in 576i. It's pathetic.

Not really sure what you mean.

All Australian primary free-to-air channels broadcast in 720/1080. Cable channels are 1080, along with 4K for movies and sports, and you can stream most of the free-to-air and cable channels at 720/1080 as well.
 
About half the movies labeled 4K are upscaled from 2K masters. Even most of the recent Marvel (MCU) movies were only mastered at 2K. I believe Black Panther is the only one mastered in 4K, and the camera used to film it was 3.4K.

And even though recent consoles support 4K, most AAA games aren't actually rendered at that resolution.
Movies mastered at 2K but released on 4K discs often still look better than 1080p discs upscaled by the player. One big reason is HDR.
 


Broadcast TV is not the source of content for these TVs.
What else? Blu-ray (LOL)?

An 8K HEVC stream is 84 Mbps. If this becomes the standard, it will lead to massive internet bandwidth problems in urban areas.

Currently, people are happy to have a reasonably stable intercontinental connection at, e.g., 1080p for FaceTime. Globally, that is roughly the maximum resolution that can be transmitted to individual users without major problems.
Nobody will like buggy 8K connections, full of interruptions, with no significant visual improvement.

Of course, 8K only makes sense for large viewing surfaces, whether projector or TV. For that, you have to buy the premises first ;-)

Cost: larger displays have the disadvantage that production costs grow steeply with screen diagonal.
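To put the 84 Mbps figure in perspective, here's what it implies for data consumption (the 16 Mbps 4K comparison figure is an illustrative assumption):

```python
def gb_per_hour(mbps):
    """Convert a stream bitrate in megabits per second to gigabytes per hour."""
    return mbps * 1e6 / 8 * 3600 / 1e9

print(round(gb_per_hour(84), 1))   # 37.8 GB/hour for an 84 Mbps 8K stream
print(round(gb_per_hour(16), 1))   # 7.2 GB/hour for a ~16 Mbps 4K stream
```

At nearly 38 GB per hour, a household streaming 8K for a few hours a day would burn through a typical data cap in days, which is part of why ISPs and urban networks would feel the pressure.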
 
Broadcast TV, remember, is still using MPEG-2.
H.264 is (close enough) 2x as efficient as MPEG-2, and HEVC is 2x as efficient again.
In other words, if you move from 1080p in H.264 to 4K in HEVC, you only need 2x the bandwidth for a totally legitimate 4K “experience”.
Obviously any Apple kit from the past few years supports HEVC. But generic Netflix streaming to some random 4K TV or streaming box? I have no idea how widespread their HEVC support is.
ALL 4K TVs support HEVC. It’s a basic requirement for 4K TVs.

(That doesn’t necessarily mean it can play HEVC off a USB drive though.)


What else? Blu-ray (LOL)?
Already posted in this thread. Streaming services support 4K, with HDR. And it looks totally amazing. For example, check out Lost In Space on Netflix in 4K. The show itself gets decent but not great reviews, but the image quality is simply stellar. On an OLED, it’s jaw dropping.


To put this in perspective, I saw Rise of Skywalker in one of the top IMAX theatres in North America (Scotiabank IMAX Toronto). In terms of pure image quality, my OLED at home playing Lost In Space easily beat it.
 
ALL 4K TVs support HEVC. It’s a basic requirement for 4K TVs.

(That doesn’t necessarily mean it can play HEVC off a USB drive though.)



Already posted in this thread. Streaming services support 4K, with HDR. And it looks totally amazing. For example, check out Lost In Space on Netflix in 4K. The show itself gets mixed reviews, but the image quality is simply stellar.
I know; I have a 4K projector and an Apple TV 4K HDR. Nevertheless, 8K is not comparable.
It is currently not possible to stream 8K in good quality, without interruptions, and thus in a consumer-friendly manner.

Anything else is cheating the customer. LG knows that.
 
This is great news: with the introduction of 8K TVs, 4K TVs will get cheaper, and that's what most consumers will be buying for the next decade.
 