No reasonable person would consider these to be the exact same resolution just because they are both 1440p.

Well, no, but only because most people would confuse the aspect ratio with the resolution. If you cut the one on the right down to 2560 pixels wide, a reasonable person would say they were the same resolution. If you said that was because they essentially function at the same resolution, but one is wider and has more pixels per device, then you would be correct.
 
Uh....no.

Think about it. 4K itself is a number that refers only to the number of vertical lines in the image. There are twice as many of those lines in a 4K image as there are in a 1080p image. It is confusing because 1080p refers to the number of horizontal lines. You get four times as many pixels because twice the resolution in each direction gives you four times the area.

Here you go


Here is more on why it is only 2x the resolution:

https://www.redsharknews.com/technology/item/1650-is-4k-twice-or-four-times-as-good-as-hd


We could avoid a lot of confusion by keeping the same units/naming convention when discussing this stuff.

4K is a horizontal resolution (yes, the number of vertical lines). 2K is a horizontal resolution.

1080 is a vertical resolution (the number of horizontal lines). 2160 is a vertical resolution.

1080 is almost equal to 2K, and 2160 is almost equal to 4K. Those are the equivalences. In the broadcast business, things are referred to by their vertical resolution, scan scheme, and frame rate, so:

1080p/24 (or 23.98/25/30/60) = standard HD resolution. There are usually 1920 horizontal pixels, or almost 2K. There are 2.07M pixels in this raster.

2160p/24 (or 23.98/25/30/60) = standard UHD resolution. There are usually 3840 horizontal pixels, or almost 4K. There are 8.29M pixels in this raster, 4x as many pixels as 2K/1080.
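For anyone who wants to check those raster numbers, here is a quick sketch of the arithmetic (nothing broadcast-specific, just the usual 1920x1080 and 3840x2160 grids):

```python
# Pixel counts for the standard HD and UHD rasters mentioned above.
hd_w, hd_h = 1920, 1080      # 1080p
uhd_w, uhd_h = 3840, 2160    # 2160p / UHD

hd_pixels = hd_w * hd_h       # 2,073,600  (~2.07M)
uhd_pixels = uhd_w * uhd_h    # 8,294,400  (~8.29M)

print(hd_pixels, uhd_pixels, uhd_pixels / hd_pixels)  # ratio = 4.0
```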
 
The definition of a display resolution is the number of pixels in each dimension that can be displayed. You're throwing out one of the dimensions. Throwing out one dimension to come up with "1080P" or "4K" is okay for marketing terms, but once you start doing things like math (doubling or quadrupling), you need to take both dimensions into account.

Let's say the standard TV display was 1000x500 pixels and it was called "1K". By your definition of resolution doubling, 2000x1 pixels would be "2K" and hence double the resolution. "1K" is correct and "2K" is correct, but nobody in their right mind would call that second screen double the resolution. What happens to the horizontal lines cannot be ignored.
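A minimal sketch of that hypothetical, just to put numbers on it (the screen sizes are the invented "1K" example above, not real formats):

```python
# The hypothetical "1K" standard screen vs. a screen that doubled only
# the horizontal pixel count (these sizes are invented for the argument).
standard = (1000, 500)   # "1K" by horizontal count: 500,000 pixels
stretched = (2000, 1)    # "2K" by horizontal count alone: 2,000 pixels

for width, height in (standard, stretched):
    print(f"{width}x{height} -> {width * height} pixels")
```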

If that's too silly of an example, here is a real life example:

[attached image: two 1440p displays of different widths]

No reasonable person would consider these to be the exact same resolution just because they are both 1440p.
This is it. When talking about LCD screens, resolution == number of pixels.
 
4K... and a fast connection (25 Mbps minimum). Unfortunately that would leave out most FTTN in Australia, if you wanna believe the media about slow speeds.
 
Oh, I can't wait for a 4K "Ghost and Mr. Chicken" /s (Actually, I bought that movie on iTunes to watch when I'm traveling and it happens to be a rainy night. Fun movie.)

 
What they really need is an affordable subscription model to actually view the content. They need to compete with Netflix the same way Apple Music competes with Spotify.
 
You forgot the required adapter charges...
Yup. It only has Lightning output. A Lightning-to-HDMI adapter will be supplied.

That will be the true courage!
I hope it supports HDR10, Dolby Vision and HLG formats.
That's what Tim Cook's ATV pipeline looks like: ATV 5, 6, and 7.

Where are those people who say Apple will turn on all these features with a mere software update?
 
Read some of the other comments. It's twice the resolution, four times the pixels.

Most sources, and the math, conflict with Shapton's blog post. The implicit math is that "resolution" is the total number of pixels and their aspect ratio, as represented by an xy or xyz equation (the z being for actual three-dimensional displays, not to be confused with two-dimensional stereoscopic displays). After all, an equation is merely a representation of a value.

Maybe visualizing will help:
(1920x1080)x2 ≠ (1920x2)x(1080x2)
and
(1920x2)x(1080x2) = (1920x1080)x4

With UHD, and as the blog post by the contradictory David Shapton even states, we are doubling the resolution IN TWO DIMENSIONS. This makes it 4 times the total resolution, or 2x the horizontal resolution times 2x the vertical resolution. As this is a two-dimensional display, the 3rd dimension's value is 1 and remains unchanged.

Think of it this way. Let's say my 2D screen has a total resolution of 100 pixels, written as the equation 10x10. That's not only cleaner when you get into high pixel counts, it also communicates that the aspect ratio is 1:1, which "megapixels" would not. We could say 10x10x1 since it's a 2D screen, but that's redundant.

Another 2D screen has 400 pixels in a 1:4 aspect ratio; we write this as 10x40. In this instance we have 4 times the resolution of my original screen; it's exactly the same as having four 10x10 screens lined up next to each other. Another screen has the same number of pixels at a resolution of 20x20. Is this twice the resolution of my original screen, or four times?

A screen with pixels at two depths has a resolution of 40x40x2. Is this double the resolution of the 20x20x1 screen? Four times? Eight times?
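If it helps, here are those toy screens worked out numerically (a sketch of the same thought experiment, nothing more):

```python
# Total pixel counts for the toy screens in the post above.
from math import prod

screens = {
    "10x10 (2D, 1:1)":      (10, 10),
    "10x40 (2D, 1:4)":      (10, 40),
    "20x20 (2D, 1:1)":      (20, 20),
    "40x40x2 (two depths)": (40, 40, 2),
}

for name, dims in screens.items():
    print(name, "->", prod(dims), "pixels")
# 10x10 -> 100, 10x40 -> 400, 20x20 -> 400, 40x40x2 -> 3200
```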
 
Does your TV have optical out on the back? Most do, I think. Since the HDMI connection will be feeding the TV audio, the optical can then just re-output it to whatever you need.

I don't know about now, but within the last few years, that was a highly YMMV prospect. TVs varied quite a bit in what they would output over optical for HDMI inputs.
 
I doubt that any streaming service is going to look better than a well-matched Blu-ray and HDTV. I'd be surprised if the stream providers are not recompressing content. Amazon Prime streaming often has a visible grid (like faded graph paper) that is most noticeable when you're close to the screen. So, if I want reduced judder and reduced frame-skipping, I just download whatever I want to see. My internet speed is fine.
You didn't get my point. The Blu-rays are perfect, and that's the problem. Perfect 1080p and perfect 4K look virtually identical. Imperfect 1080p and imperfect 4K look drastically different, with the 4K being noticeably superior. This is why 1080p vs. 4K streamed content will produce a massively enhanced experience.
 
Uh... 4K is 4x the resolution of 1080p, not 2x.

1920x1080 x2 = 3840x2160, which is 4K... Twice the resolution is correct. It's 4x the pixels, since resolution is a combination of vertical and horizontal pixels.
 
So in theory I can play all of my VUDU 4K Dolby Vision movies with no problem on this new Apple TV. It's such a pain because my TV is a Samsung, which doesn't support Dolby Vision but instead supports HDR10. Such a pain.

No. You will need to buy a new TV that supports Dolby Vision. The ATV will likely output it, but your TV won't know what to do with it, and will fall back to HDR10, which all DV video is required to support.
 
So in theory I can play all of my VUDU 4K Dolby Vision movies with no problem on this new Apple TV. It's such a pain because my TV is a Samsung, which doesn't support Dolby Vision but instead supports HDR10. Such a pain.

In the Blu-ray format, HDR10 is the standard for 4K, with Dolby Vision being optional, and I think Netflix follows this too by dropping back to HDR10 if it doesn't detect Dolby Vision support on Dolby Vision titles. My understanding is Vudu will be doing the same later this year. And who knows, they could even add HDR10+ for your Samsung down the road.
 
I personally watched the 4K and 1080p Blu-ray discs of multiple movies, including Deadpool, Oblivion, and Star Trek (2009). By watched, I mean we switched back and forth, and I can tell you unequivocally that the difference is extremely minute. It's hardly noticeable, but colors did seem ever so slightly better. That is from Blu-rays, so they are the uncompressed maximum quality you can get.

Now, the caveat is that 4K video is significantly better when you're talking about streaming movies or TV shows. The reason, I believe, is that when you are streaming, unlike watching the uncompressed Blu-rays, you are getting a heavily compressed stream. The 1080p streams are always effectively far below 1080p, and the 4K streams are always far below 4K, but when you stream the 4K version you are still getting well over 1080p, so it looks significantly better, and the difference is actually noticeable.

Not surprising you didn't see much of a difference, because none of those movies were mastered in 4K except for Deadpool. Even though Deadpool was mastered in 4K, most of the footage was captured in 3.4K.

Blu-rays are still compressed, just way, way less than streaming, so you get a near visually lossless image. I wish Hollywood would start shelling out for 4K masters and stop releasing stuff on 4K Blu-ray if it's just an upconvert.
 
A little too late. I am now quite comfortable using my Amazon Fire for 4K content. I will hold on to my money.
 
There are calculators inside my eyeballs that show that 4k does in fact look much better on my 55 inch TV from more than 8 feet away. Don't fall for the hype - they just want to trick you into spending more money on a bigger TV ;)
Believe me, it's not the TV makers. They want you to waste money on smaller 4K TVs all day, because it's just marketing hype. It's just science and the fact that your retinas, even with perfect vision, cannot discern the pixels from that far away at that size. What can they discern? Better color, contrast, and brightness. But not better sharpness. As a matter of fact, from 9 ft away, a 55” TV isn't even full 1080p to your eye.

If those other features are important to you, then that's fine. But most people would be better off getting a 1080p TV and saving the money and data usage (many home internet connections have caps now in the U.S.). In a few years, large 4K TVs will be much more affordable.

People like to argue with me a lot about this on the forums, often because they don't want to admit that they wasted money on something irrelevant. We can only grow as humans when we learn from our mistakes and move on. You very likely do not have superhuman vision, and science has established the limits of our vision, so it's quite easy to calculate.

Try out the calculator at the bottom of this page to see for yourself: https://referencehometheater.com/2013/commentary/4k-calculator/
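For the curious, the arithmetic behind that claim goes roughly like this (a sketch assuming about one arcminute of 20/20 visual acuity and a 16:9 panel; the linked calculator does a similar computation):

```python
# Rough visual-acuity check: what can 20/20 vision resolve on a 55" 16:9
# TV from 9 feet away? Assumes ~1 arcminute of acuity per pixel.
import math

diagonal_in = 55.0
distance_in = 9 * 12                               # 9 feet = 108 inches
width_in = diagonal_in * 16 / math.hypot(16, 9)    # ~47.9" wide panel

one_arcmin = math.radians(1 / 60)
smallest_visible_in = distance_in * one_arcmin     # ~0.031"

pitch_1080p = width_in / 1920                      # ~0.025" per pixel
pitch_4k = width_in / 3840                         # ~0.012" per pixel

print(smallest_visible_in, pitch_1080p, pitch_4k)
# Both pixel pitches are already below what the eye resolves at this distance.
```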
 
1920x1080 x2 = 3840x2160, which is 4K... Twice the resolution is correct. It's 4x the pixels, since resolution is a combination of vertical and horizontal pixels.
Please check your math. It's not a combination (addition), it's multiplication. See my examples above. You are multiplying each dimension by two, which is the same as two times two (aka times 4).
(1920x1080)x2 = 4,147,200, but 4k has 8,294,400 pixels

What you are saying is (1920x2)x(1080x2) which is the same as (1920x1080)x4 or 3840×2160, but not the same as 1920x1080 times 2.
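Put numerically (just the same arithmetic spelled out):

```python
w, h = 1920, 1080
print((w * h) * 2)        # 4,147,200 -- merely doubling the pixel count
print((w * 2) * (h * 2))  # 8,294,400 -- doubling both dimensions (3840x2160)
print((w * h) * 4)        # 8,294,400 -- i.e. 4x the pixels
```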
 
Believe me, it's not the TV makers. They want you to waste money on smaller 4K TVs all day, because it's just marketing hype. It's just science and the fact that your retinas, even with perfect vision, cannot discern the pixels from that far away at that size. What can they discern? Better color, contrast, and brightness. But not better sharpness. As a matter of fact, from 9 ft away, a 55” TV isn't even full 1080p to your eye.

If those other features are important to you, then that's fine. But most people would be better off getting a 1080p TV and saving the money and data usage (many home internet connections have caps now in the U.S.). In a few years, large 4K TVs will be much more affordable.

People like to argue with me a lot about this on the forums, often because they don't want to admit that they wasted money on something irrelevant. We can only grow as humans when we learn from our mistakes and move on. You very likely do not have superhuman vision, and science has established the limits of our vision, so it's quite easy to calculate.

Try out the calculator at the bottom of this page to see for yourself: https://referencehometheater.com/2013/commentary/4k-calculator/

The only TVs worth buying anymore are 4K models. Don't get me wrong, someone shouldn't necessarily go out and buy a new TV just for 4K. But if someone is already in the market for a new TV, it's either going to be 4K or a POS.
 
Not surprising you didn't see much of a difference, because none of those movies were mastered in 4K except for Deadpool. Even though Deadpool was mastered in 4K, most of the footage was captured in 3.4K.

Blu-rays are still compressed, just way, way less than streaming, so you get a near visually lossless image. I wish Hollywood would start shelling out for 4K masters and stop releasing stuff on 4K Blu-ray if it's just an upconvert.
I know for a fact Oblivion was shot on 4K and 6K (maybe 8K, I forget) cameras. I watched an interview with the director and he stated that. So if that movie isn't native 4K then nothing is.
 