Let's figure that the real compression difference maxes out at 30 percent for H.265. The scientific paper says it's almost double that:

IEEE Xplore said:
It was found for the investigated test cases that the HEVC Main profile can achieve the same subjective quality as the AVC High profile while requiring on average approximately 59% fewer bits. The PSNR-based BD-rate average over the same sequences was calculated to be 44%. This confirms that the subjective quality improvements of HEVC are typically greater than the objective quality improvements measured by the method that was primarily used during the standardization process of HEVC.

It can therefore be concluded that the HEVC standard is able to deliver the same subjective quality as AVC, while on average (and in the vast majority of typical sequences) requiring only half or even less than half of the bit rate used by AVC. This means that the initial objective of the HEVC development (substantial improvement in compression compared with the previous state of the art) has been successfully achieved.
Source: IEEE Transactions on Circuits and Systems for Video Technology, Vol. 26, No. 1, January 2016
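For anyone curious how that 44% figure is derived: the paper's PSNR-based number is a Bjøntegaard delta rate (BD-rate). Here's a minimal Python sketch of the standard calculation; the rate/PSNR operating points below are made-up placeholders for illustration, not the paper's data:

```python
# Minimal BD-rate sketch: fit log(bitrate) as a cubic in PSNR for each codec,
# then average the gap between the two fits over the overlapping PSNR range.
import numpy as np

def bd_rate(rates_ref, psnr_ref, rates_test, psnr_test):
    """Average % bitrate change of test vs. ref at equal PSNR (negative = savings)."""
    p_ref = np.polyfit(psnr_ref, np.log(rates_ref), 3)
    p_test = np.polyfit(psnr_test, np.log(rates_test), 3)
    lo = max(min(psnr_ref), min(psnr_test))   # overlapping PSNR range
    hi = min(max(psnr_ref), max(psnr_test))
    F_ref = np.polyval(np.polyint(p_ref), [lo, hi])
    F_test = np.polyval(np.polyint(p_test), [lo, hi])
    # Mean gap between the two log-rate curves over [lo, hi]:
    diff = ((F_test[1] - F_test[0]) - (F_ref[1] - F_ref[0])) / (hi - lo)
    return (np.exp(diff) - 1) * 100

# Hypothetical (rate kbps, PSNR dB) operating points for AVC vs. HEVC encodes:
avc  = ([2000, 4000, 8000, 16000], [34.0, 36.5, 39.0, 41.5])
hevc = ([1100, 2200, 4400,  8800], [34.2, 36.8, 39.3, 41.8])
print(f"BD-rate: {bd_rate(*avc, *hevc):.1f}%")  # ~ -48%: HEVC needs about half the bits
```

With these toy numbers the result comes out around -48%, i.e., the kind of "half the bit rate at equal quality" figure the paper reports.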
 


How soon is this ready for deployment and adoption, though? IEEE delves into both the theoretical and the practical. Let's not forget we are in a TCP/IP-based world, which edged out the OSI model (in the IEEE standards as well, IIRC), since the OSI model had the slight issue of being too complex to implement. TCP/IP is a simpler structure that mashes things together to get it done, quick and dirty, so to speak.

Sadly, someone never let that go. So IT people learn a theoretical model of how it's supposed to be (OSI, loved on cert tests) and then translate that into the model actually, you know, used (TCP/IP).

The takeaway is that when theory is good but impractical (for enough people's tastes), something else can win that race regardless.
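To make that OSI-to-TCP/IP translation concrete, here's a quick sketch of the conventional textbook mapping (the grouping is the common correspondence taught in networking courses, not an official standard):

```python
# Conventional mapping from the 7-layer OSI model (loved on cert tests)
# to the 4-layer TCP/IP model that's actually used.
OSI_TO_TCPIP = {
    "Application":  "Application",
    "Presentation": "Application",
    "Session":      "Application",
    "Transport":    "Transport",
    "Network":      "Internet",
    "Data Link":    "Link",
    "Physical":     "Link",
}

for osi_layer, tcpip_layer in OSI_TO_TCPIP.items():
    print(f"OSI {osi_layer:<12} -> TCP/IP {tcpip_layer}")
```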
 

Alas, I wish I could find the two articles that compared the "on paper" formal compression scheme with reality. One of the articles shows some real footage and measurements, which places it where I mentioned. Incidentally, H.264 can compress further than what we see today, but being lossy, you get what you pay for with more compression. I would be quite happy if greater compression could be used without significant loss, but the real measure is both viewing such files and comparing them against the uncompressed source. Thanks for the input; always worth the read.
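If anyone wants to run that compare-to-uncompressed test themselves, here's a rough sketch using ffmpeg (it assumes an ffmpeg build with libx264/libx265 on the PATH; ref.y4m is a hypothetical uncompressed reference clip):

```python
# Encode the same uncompressed source at increasing compression levels,
# then measure each encode's PSNR against the original.
import subprocess

REF = "ref.y4m"  # hypothetical uncompressed reference clip

for codec, tag in (("libx264", "h264"), ("libx265", "h265")):
    for crf in (18, 23, 28, 33):  # higher CRF = smaller file, more loss
        out = f"test_{tag}_crf{crf}.mp4"
        subprocess.run(["ffmpeg", "-y", "-i", REF, "-c:v", codec,
                        "-crf", str(crf), out], check=True)
        # ffmpeg's psnr filter compares the encode (first input) to the
        # reference (second input) and prints the average PSNR on stderr.
        subprocess.run(["ffmpeg", "-i", out, "-i", REF,
                        "-lavfi", "psnr", "-f", "null", "-"], check=True)
```

Watching the encodes next to the source, and noting where the PSNR numbers fall off, is essentially the "real measure" described above.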
 
[Image: 4K/1080p viewing-distance chart]


Just to set the record straight, 7 feet from a 65-inch screen is well within the range that benefits from 4K. I don't agree with this chart, as I think it understates the real-world differences, but nonetheless I wanted to set the record straight.
 
[Image: 4K/1080p viewing-distance chart]


Just to set the record straight, 7 feet from a 65-inch screen is well within the range that benefits from 4K. I don't agree with this chart, as I think it understates the real-world differences, but nonetheless I wanted to set the record straight.
Don't know where you got that chart, so I can't see their explanation, but you lose the ability to see the full resolution as you go from the bottom of the UHD band to the top of the UHD band. You may have a very slight gain over 1080p at 7 feet (8 feet is ideal for a 65-inch TV at 1080p), but that difference shrinks as you move from the bottom line (4 feet) toward the top line (8 feet). You have to be at or inside the bottom line to see the full difference. In other words, you are right near the edge of being able to see the full resolution of 1080p, but nowhere near the distance needed to see the full resolution of 4K (assuming 20/20 vision).

To put it another way, you are 3 feet from the ideal 4K position and 1 foot from the ideal 1080p position at that screen size.
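For what it's worth, the math behind those distance lines is straightforward 20/20-acuity geometry (one pixel subtending one arcminute). A quick sketch:

```python
# Distance at which one pixel subtends 1 arcminute (~20/20 acuity limit);
# farther than this, individual pixels can no longer be resolved.
import math

def full_resolution_distance_ft(diag_in, horiz_px, aspect=16 / 9):
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # screen width
    pixel_in = width_in / horiz_px                       # pixel pitch
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12          # inches -> feet

for name, px in (("1080p", 1920), ("4K", 3840)):
    print(f'65" {name}: ~{full_resolution_distance_ft(65, px):.1f} ft')
# Prints roughly 8.5 ft for 1080p and 4.2 ft for 4K, in line with the
# 8-foot and 4-foot lines described above.
```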
 
While there's probably some merit and some science behind this chart stuff, the same kinds of charts were being slung around here to argue why 720p was "good enough" before Apple rolled out the "3," now with 1080p. Here's a link back to 2008 where much of the very same rationale against 4K can be found (with 1080p subbed in back then): https://forums.macrumors.com/threads/apple-tv-1080i.584041/#post-6462686 See the chart in post #14 and a very similar-sounding argument against 1080p.

There are TONS of pre-:apple:TV 3 threads full of anti-1080p arguments that are pretty much the very same anti-4K arguments (lots of them inevitably offering up that same chart to "prove" the point)... until Apple rolled out the "3," and then all that just stopped. I guess suddenly we could see the difference. Or suddenly the whole Internet was upgraded to be able to serve 1080p to everyone. Or our eyes evolved just in time to be able to see the difference from the same seating distances in our homes. Etc. :rolleyes:

2 things:
  • Head out to a store with 4K on display, ideally running the same demo next to a 1080p set, and look at them from various distances. I bet you'll see a difference and won't even have doubts. Even if you are so locked into an anti-4K mentality that you'd rather pretend you can't see it than admit the truth, take someone who doesn't give a hoot about this topic and ask them which screen looks best. I bet they'll easily see it too... even at distances beyond what "the chart" says they should. It's dazzling when you go see it, just like 1080p screens were when we were still arguing that 720p was "good enough" while Apple clung to 720p as the HD max in the pre-"3" years.
  • As soon as Apple rolls out the "5", the chart won't be slung around to imply how stupid Apple is for embracing a gimmick that "no one can see". Stuff like "the chart" only flies while some of us are trying to imply or fool others of us into believing that what Apple has for sale now is perfection. Of course, once Apple shifts, the new thing will be the new perfection. In other words, all this pile of anti-4K rationale won't persist and be directed against Apple once they roll out the "5". It just dies... just as it did for 1080p when Apple rolled out the "3".
We are the same crowd who argued passionately for the original Retina display, which was spun to us as "pixels are not visible to the naked eye"... and then later embraced "Retina HD," which, having even denser pixels, is even further beyond what our eyes can supposedly see. And yet the same people will cling to 1080p, because apparently "retina" for the biggest screen in our homes makes little-to-no sense, and here's a chart or two to "prove it" (resolution numbers changed to fit today's argument). :rolleyes:

Soon Apple will embrace 4K in this one product where they have not already embraced it (note how you don't see the anti-4K crowd in threads bashing the 4K recording capability of the iPhone, or the 4K editing capability of the iPad Pro/iMovie/FCPx, or the Retina 5K iMacs; just here, with this ONE Apple product). In short, Apple is not (and never was) wrong; only those of us wishing for anything not included in an Apple product for sale now are wrong. After Apple launches the 5, "the chart" will be retired again, hiding on the sidelines until 8K starts picking up. Then someone will scratch out some older resolutions, write in 8K, and we'll do all this again. The evidence is in the history. Search pre-:apple:TV3 threads and see for yourself.
 
Just to be clear, I think Apple should have included 4K, but you have to move closer or move to a bigger screen to see the full resolution.

The big box stores have their TVs in torch mode and can very easily make the 4K look better under bright lights than the 1080p, no matter how far you are from the screen. That has nothing to do with how they would perform side by side with both TVs properly calibrated.

If you are saying you can see the 1080p TV's pixels from any distance, I have to disagree.

Back before HD, my TV was bigger than most: a ~200-pound Sony 36-inch set. Most people I knew had 27-inch TVs. I now have a 65-inch TV that does 1080p. I adjusted my screen size to take advantage of the pixels; I would have to do the same for 4K, or at least move closer.

As I said, I will eventually move all my displays to 4K, so I do think it is the future. I think we agree on that point. We just disagree about what tighter pixels can do for picture quality from any distance.
 
If anyone is familiar with the edge-adjacency effect, then they know 4K at 7' will look "sharper" than 1080p. Though the differences might appear small, in some instances the two can differ quite a bit, particularly in certain scenes and with dithering.
 
If you are saying you can see the 1080p TV's pixels from any distance, I have to disagree.

I didn't say I can see the pixels. Best I know, nobody interested in this topic cares about seeing the pixels. What (I think) they care about is "better picture." Whether they can see a better picture at 4K or whether they only think they can see one is what matters (to them). Those that can or think they can wish for a 4K :apple:TV.

As mentioned earlier, had Apple gone there with this "4," or should they go there with the "5," anyone who knows better (or thinks they know better) won't be affected at all. Those people can still download only 1080p and smugly enjoy that they are seeing every bit of picture detail, from their "average seating distances" on their particular choice of television, that any of the 4K crowd is seeing from theirs. Those still clinging to 720p as "good enough," per those old threads I just referenced, can do the same: download 720p and smugly laugh at the fools who embraced 1080p where "no one can see the difference," and even more so at the 4K dummies who "definitely can't see the difference," because only 720p is perfection (or it was, until Apple went 1080p). And should any guest reference 4K (or 1080p, for the latter segment), they can still whip out a printout of "the chart" and try to convince their guest that 4K (or 1080p) could look no better than the 1080p (or 720p) showing right now on the set in their home.

Furthermore, I make no such argument about "at any distance." Get far enough away from any size TV screen and you can't even see the screen at all. What I did offer up as a challenge to the "proof" of "the chart" was to head for a local store that is currently selling both. And no, they are not showing only the 4K models in "torch mode"; both models will be in "torch mode," because they want to sell ANY TVs. Else, break out the remote and be sure both are either in or out of "torch mode." Then do the test. See for yourself with your own eyes. Or, if bias is too much and you can't acknowledge what you see with your own eyes, take an unbiased person not locked into a side of this discussion and ask them which picture looks better. Try it from beyond the distance at which "the chart" says no one should be able to see any difference, then at the limit, then inside the limit. I bet all 3 distances will yield the same vote from the unbiased judge.

Again, if we believe what we are saying, we should feel the same when Apple rolls out the 5 "now with 4K." And that means criticizing Apple for embracing the gimmick, offering up "the chart" as proof of Apple's ignorance, etc. I happen to know from before and after the launch of the "3" that THAT doesn't happen. Instead, we transform from spinning anti-4K sentiment to "shut up and take my money." And the chart gets retired... to be resurrected with the numbers changed when the 8K thing starts getting some legs... ahead of Apple going there.
 
People tend to think of the detail seen from a farther distance, but another byproduct of higher resolution is the ability to fit a larger screen in a more confined room. I actually had to downsize one of my TVs in my old house, which I wouldn't have had to do if I'd had a 4K TV at the time.
 
People here forget that Apple is never at the forefront of technology; Apple is at the forefront of user experience. The Apple TV was 720p when all the other streamers were 1080p. Eventually Apple moved to 1080p when there was more content available. Now we have the same situation with 4K. There is very little programming available in 4K, so it makes little sense to develop for it until we reach a tipping point.

The Apple TV 4 hardware is fully capable of streaming 4K video.
 
I find Google searches invaluable when researching products: no digging, just clicking, 5 minutes tops. As an alternative, maybe just take a look at some next time you're in a shop and research the ones that interest you. Consumer Reports (online and in print) has a recent well-done and informative article. Good luck! :)


Ah, now your reply makes more sense. You thought I was shopping for a TV. I'm not.

I really was asking what you personally preferred about a curved TV screen.

And for the record, I do know how to do a Google search.
 
There would be no compromise of "user experience" if this "4" had 4K hardware. It would still play 1080p and 720p and SD to its maximum. It would not force anyone to ONLY utilize 4K content, buy a new TV, or throw out their current TV, but it would give an easy Apple solution for playing 4K shot on iPhones on the 4K TVs of those who already own them. iPhones now shoot and play 4K, but that has no effect on the iPhone "user experience" either; it's just something else they can do beyond the "status quo."

I could write a bunch about "chicken & egg" here: what motivates the content creators to make 4K versions of their content available until there is a reasonable chance of profit, i.e., millions of hardware boxes in homes ready to play that 4K? Or, shorter: the hardware MUST LEAD. We did not have BD discs before there were BD players. We did not have DVD discs before there were DVD players. We did not have S-VHS, VHS, or Beta tapes before there were S-VHS, VHS, or Beta players. Why? Because software before there is hardware that can play it makes no sense at all; it just sounds good when trying to rationalize why Apple still clings to 1080p while pretty much all the other players are already offering 4K. The hardware must come first. The software then either accompanies it or follows. EVERY SINGLE TIME; no exceptions.

How many apps that run exclusively on iOS 10 on the iPhone 7 are already for sale in the App Store? None. Well, why not? Shouldn't Apple have waited to develop the iPhone 7 and iOS 10 until there was a multitude of apps ready to take advantage of that unique hardware and software? Of course not. And same here. Hardware first, then software. Hardware sets the bar for maximum potential; then software can exploit that maximum potential. It is impossible for that to work the other way.

Even simpler: wave our magic wand and add 4K versions of every single video in the iTunes Store right now for :apple:TV. How much money can be made on those videos? Not a cent. Why not? See above.

And while the "4" does have a chip inside that is known to be able to play 4K, 4K commercial video requires an HDMI standard that is not built into this "4"... and that apparently cannot be addressed with any kind of software update. Thus, 4K for commercial video will require an :apple:TV5.
 
There would be no compromise of "user experience" if this "4" had 4K hardware. It would still play 1080p and 720p and SD to its maximum. It would not force anyone to ONLY utilize 4K content, buy a new TV, or throw out their current TV, but it would give an easy Apple solution for playing 4K shot on iPhones on the 4K TVs of those who already own them. iPhones now shoot and play 4K, but that has no effect on the iPhone "user experience" either; it's just something else they can do beyond the "status quo."

I could write a bunch about "chicken & egg" here: what motivates the content creators to make 4K versions of their content available until there is a reasonable chance of profit, i.e., millions of hardware boxes in homes ready to play that 4K? Or, shorter: the hardware MUST LEAD. We did not have BD discs before there were BD players. We did not have DVD discs before there were DVD players. We did not have S-VHS, VHS, or Beta tapes before there were S-VHS, VHS, or Beta players. Why? Because software before there is hardware that can play it makes no sense at all; it just sounds good when trying to rationalize why Apple still clings to 1080p while pretty much all the other players are already offering 4K. The hardware must come first. The software then either accompanies it or follows. EVERY SINGLE TIME; no exceptions.

How many apps that run exclusively on iOS 10 on the iPhone 7 are already for sale in the App Store? None. Well, why not? Shouldn't Apple have waited to develop the iPhone 7 and iOS 10 until there was a multitude of apps ready to take advantage of that unique hardware and software? Of course not. And same here. Hardware first, then software. Hardware sets the bar for maximum potential; then software can exploit that maximum potential. It is impossible for that to work the other way.

Even simpler: wave our magic wand and add 4K versions of every single video in the iTunes Store right now for :apple:TV. How much money can be made on those videos? Not a cent. Why not? See above.

And while the "4" does have a chip inside that is known to be able to play 4K, 4K commercial video requires an HDMI standard that is not built into this "4"... and that apparently cannot be addressed with any kind of software update. Thus, 4K for commercial video will require an :apple:TV5.

Wrong. The companies that made BD players also made BDs. There was no chicken-or-egg situation there.

The companies making 4K, HDR, and 1080p hardware are not the same ones creating the 4K, HDR, or 1080p content delivery systems.

The most efficient 4K UHD delivery system will be ATSC 3.0, which is still probably 2 years away. It will transmit both terrestrially and over the Internet at the same time, thereby allowing those without sufficient broadband to watch OTA.
 
Believe what you wish. Apple has already chosen a 4K standard for Apple. The new iPhone shoots it. Is it the ultimate incarnation of 4K? No. But Apple has already chosen a 4K standard. We can use our iPhones to shoot it, and we can use our Macs or iPads to edit it, render it into an Apple-chosen QuickTime container, and database it right into Apple's iTunes. Of course, you find no fault with Apple for all that.

Had Apple rolled out 4K hardware with this "4," millions of boxes capable of Apple's chosen 4K standard would have moved into homes. Some studio would have been tempted by those millions and tested some 4K content. If they made a buck, more 4K content would have been released... and more studios would have piled on.

Instead, everybody else has 4K hardware, and it looks like the discs will lead the way into mainstream 4K, not 2 years from now but ASAP. Apple can just come in about last again... when it would have been easy for them to step ahead of the curve, which would have even made sense given that just about everything else announced in the same launch session could shoot and/or edit 4K... and Apple proudly touted those abilities.

But I know where you are on this topic... even though it would have had no effect on you whatsoever had Apple embraced 4K in this "4." Again, a 4K :apple:TV would have no effect whatsoever on those who believe 1080p (or 720p) is "good enough" for them. Just as the 1080p "3" worked fine with 720p sets, a 4K :apple:TV would work fine with 1080p or 720p sets. Users could still pick whichever video file format they deem best from the iTunes Store, and it would have let everyone IN, instead of excluding a segment... for nothing.
 
There would be no compromise of "user experience" if this "4" had 4K hardware. It would still play 1080p and 720p and SD to its maximum. It would not force anyone to ONLY utilize 4K content, buy a new TV, or throw out their current TV, but it would give an easy Apple solution for playing 4K shot on iPhones on the 4K TVs of those who already own them. iPhones now shoot and play 4K, but that has no effect on the iPhone "user experience" either; it's just something else they can do beyond the "status quo."

I could write a bunch about "chicken & egg" here: what motivates the content creators to make 4K versions of their content available until there is a reasonable chance of profit, i.e., millions of hardware boxes in homes ready to play that 4K? Or, shorter: the hardware MUST LEAD. We did not have BD discs before there were BD players. We did not have DVD discs before there were DVD players. We did not have S-VHS, VHS, or Beta tapes before there were S-VHS, VHS, or Beta players. Why? Because software before there is hardware that can play it makes no sense at all; it just sounds good when trying to rationalize why Apple still clings to 1080p while pretty much all the other players are already offering 4K. The hardware must come first. The software then either accompanies it or follows. EVERY SINGLE TIME; no exceptions.

...

This is wrong on several levels. I will deal with the user experience. If I insert a commercial DVD into a DVD player, that DVD will play. No ifs, no ands, no buts. My experience with Blu-ray is somewhat different. I have a very nice Sharp BD player. It plays every commercial DVD title without issue.

The same cannot be said of Blu-ray. What many may not know is that Blu-ray players are Java-based computers that play Blu-rays, DVDs, and, in the case of my Sharp, Netflix streaming video. I had no issues with my first few Blu-rays. Then I purchased the Avatar Blu-ray disc. I managed to play it through, but the experience was awful.

A few weeks later, I found a firmware upgrade on Sharp's website. I downloaded and installed the upgrade and, as a test, replayed Avatar. This time it worked great. However, that is not the point. My Blu-ray player could not properly handle Avatar out of the box, and the firmware upgrade needed to handle it was not available when the Blu-ray disc was released.

Would you knowingly buy a consumer entertainment product for your parents or grandparents if it required upgrades to play their favorite content? At the time Avatar was released on Blu-ray, 1080p HD video was a well-defined standard. Yet the nerds responsible for producing the Avatar Blu-ray continued to tinker with their commercial product. The result was a bad user experience. It is my understanding that 4K is still a work in progress. As such, it is almost certain to have some gotchas if it is deployed outside the nerd world.
 
I don't get it, man. Why the attitude? I've given you none.

Not cool.


Seems to be a tough room here...

At least we all got a fresh lesson on how to do a Google search.

And personally, I like the curved screen because I sit fairly close to my TV and feel the screen coming "around" me more than a regular flat screen would...
 
This is wrong on several levels. I will deal with the user experience. If I insert a commercial DVD into a DVD player, that DVD will play. No ifs, no ands, no buts. My experience with Blu-ray is somewhat different. I have a very nice Sharp BD player. It plays every commercial DVD title without issue.

The same cannot be said of Blu-ray. What many may not know is that Blu-ray players are Java-based computers that play Blu-rays, DVDs, and, in the case of my Sharp, Netflix streaming video. I had no issues with my first few Blu-rays. Then I purchased the Avatar Blu-ray disc. I managed to play it through, but the experience was awful.

A few weeks later, I found a firmware upgrade on Sharp's website. I downloaded and installed the upgrade and, as a test, replayed Avatar. This time it worked great. However, that is not the point. My Blu-ray player could not properly handle Avatar out of the box, and the firmware upgrade needed to handle it was not available when the Blu-ray disc was released.

Would you knowingly buy a consumer entertainment product for your parents or grandparents if it required upgrades to play their favorite content? At the time Avatar was released on Blu-ray, 1080p HD video was a well-defined standard. Yet the nerds responsible for producing the Avatar Blu-ray continued to tinker with their commercial product. The result was a bad user experience. It is my understanding that 4K is still a work in progress. As such, it is almost certain to have some gotchas if it is deployed outside the nerd world.
Perhaps I'm not understanding your point, but this is an odd argument in the context of 4K on an Apple TV, a consumer product that frequently receives software updates. There aren't many consumer electronics today that don't require periodic software updates, and most updates are far, far easier to accomplish than on the early Blu-ray players, where you had to load the update via USB stick.
 