Anyone tested on a 4th gen Apple TV to see whether auto framerate switching is introduced on that model?
 
That's what upscaling is.

A 4k TV displaying a 1080p image that is not upconverted will not fit the screen. You just get a shrunk down image in the middle of the screen.

https://www.cnet.com/news/can-4k-tvs-make-1080p-look-better/

No, that's not what upscaling is. If a 4K TV is just using 4 pixels to display 1 pixel of data, it isn't upscaling. It's just mimicking a 1080p TV.

The article you linked explains this well. It also says that upscaling involves not just making the image fit, but also an attempt to “add detail”.

In addition to upconverting, upscaling also applies various antialiasing, sharpening, and other algorithms to fill in the data and make it look like a 1080p image is actually a 4K image.
 
No, that's not what upscaling is. If a 4K TV is just using 4 pixels to display 1 pixel of data, it isn't upscaling. It's just mimicking a 1080p TV.

The article you linked explains this well. It also says that upscaling involves not just making the image fit, but also an attempt to “add detail”.

In addition to upconverting, upscaling also applies various antialiasing, sharpening, and other algorithms to fill in the data and make it look like a 1080p image is actually a 4K image.

Hahahahaha, ahhhh no.
It is definitely upscaling.
The upscaling process can add antialiasing, and sharpening, and whatever else is programmed. But none of those things make it any more or less upscaled.
ANY image that is displayed on a 4K TV that is not natively 4K is upscaled.
 
Hahahahaha, ahhhh no.
It is definitely upscaling.
The upscaling process can add antialiasing, and sharpening, and whatever else is programmed. But none of those things make it any more or less upscaled.
ANY image that is displayed on a 4K TV that is not natively 4K is upscaled.
Alright, there is no point to arguing semantics. Especially since it sounds like we're saying the same thing.

If the image is displayed with no real-time on-tv post-processing, then it's fine by me. If there is any antialiasing, sharpening, de-noising, etc, I don't want it. In other words, I want the TV to display the data in the file, and nothing else.
 
The article you linked explains this well. It also says that upscaling involves not just making the image fit, but also an attempt to “add detail”.
This is some of the modern marketing BS. You can see it everywhere. Some HiRes audio upscalers, for instance, are supposed to "add detail" to otherwise heavily compressed MP3 audio.
I just don't understand how it could be possible to later reinvent missing information that was once discarded during lossy compression. It sounds like making something out of nothing.
 
This is some of the modern marketing BS. You can see it everywhere. Some HiRes audio upscalers, for instance, are supposed to "add detail" to otherwise heavily compressed MP3 audio.
I just don't understand how it could be possible to later reinvent missing information that was once discarded during lossy compression. It sounds like making something out of nothing.

Well, for images it seems to work:
https://techxplore.com/news/2017-10-small-pixel-perfect-large.html
 
Alright, there is no point to arguing semantics. Especially since it sounds like we're saying the same thing.

If the image is displayed with no real-time on-tv post-processing, then it's fine by me. If there is any antialiasing, sharpening, de-noising, etc, I don't want it. In other words, I want the TV to display the data in the file, and nothing else.
Most people turn all that stuff off anyways, but it's still being upscaled. And by adding pixels to fit it in a 4k screen it is adding data to the file. It's really a non issue. My LG makes 1080p look like 4k. You'd be hard pressed to tell the difference.
 
This is some of the modern marketing BS. You can see it everywhere. Some HiRes audio upscalers, for instance, are supposed to "add detail" to otherwise heavily compressed MP3 audio.
I just don't understand how it could be possible to later reinvent missing information that was once discarded during lossy compression. It sounds like making something out of nothing.

Isn't that the whole point of algorithms?
Figuring out what is missing using the information we do have is nothing new.
That's how compression works.
That's how CDs/DVDs still play with scratches.
That's how hard drives can still work with bad sectors.
2...4...6...8...?...12...14... What's missing?

I can understand the urge to be a 'purist', but sometimes it is taken a little too far - usually based on ignorance.
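
To make the "fill in the gap" idea concrete, here is a minimal Python sketch (purely illustrative, not anything a TV or player actually runs) that guesses a missing sample by averaging its neighbours:

samples = [2, 4, 6, 8, None, 12, 14]

filled = list(samples)
for i, value in enumerate(filled):
    if value is None:
        # Guess the missing sample from the two samples around it.
        filled[i] = (filled[i - 1] + filled[i + 1]) / 2

print(filled)  # [2, 4, 6, 8, 10.0, 12, 14]

The guess happens to be right here only because the original data followed a simple pattern; the algorithm is still inventing the value rather than recovering it.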
 
Most people turn all that stuff off anyways, but it's still being upscaled. And by adding pixels to fit it in a 4k screen it is adding data to the file. It's really a non issue. My LG makes 1080p look like 4k. You'd be hard pressed to tell the difference.

I understand what oneMadRssn is saying.
Basically there are two ways to do the upscaling:

1) The TV adds fake pixels to fill the 4K screen with the help of algorithms and/or post-processing. What you see is not true to the source; you don't know what color the algorithm will choose for the added pixels.

2) For each pixel in the 1080p source, the TV simply generates 4 perfectly identical pixels to fill the screen. Let's say there is a red pixel in the 1080p picture; now you have a perfect square of 2x2 red pixels. The picture is completely unaltered and so true to the source, like a magnified 1080p picture.

Like oneMadRssn, I much prefer the second method.
But unfortunately, last time I checked, most TV sets use the first technique even with all processing/upscaling options turned off.
At the time I found that only Panasonic's upscaler offered the choice of using the second technique.
This is why I wonder what the brand and model of his TV set is.
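
To make the two methods above concrete, here is a minimal Python/NumPy sketch. The numbers are made up and it works on a single row of pixels; this is not how any particular TV's scaler is implemented:

import numpy as np

# One row of a "1080p" image: four pixel brightness values.
row = np.array([10, 200, 50, 120])

# Method 2: pixel duplication. Every source pixel becomes two identical
# output pixels; no new values are invented.
duplicated = np.repeat(row, 2)
# -> [ 10  10 200 200  50  50 120 120]

# Method 1: interpolation. The scaler invents in-between values, so the
# output contains brightnesses that never existed in the source.
interpolated = np.interp(np.linspace(0, 3, 8), np.arange(4), row)
# -> approximately [10, 91, 173, 157, 93, 60, 90, 120]

Real scalers work in two dimensions and use far fancier filters than linear interpolation, but the distinction is the same: duplication preserves the source values exactly, interpolation invents new ones.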
 
This is some of the modern marketing BS. You can see it everywhere. Some HiRes audio upscalers, for instance, are supposed to "add detail" to otherwise heavily compressed MP3 audio.
I just don't understand how it could be possible to later reinvent missing information that was once discarded during lossy compression. It sounds like making something out of nothing.

Exactly! Yes.

Sometimes it is a necessary evil. For example, these small popular Bluetooth speakers don't have the physical power and drivers required to play music at the fidelity and range normally expected. So they use DSP to shift or spread certain tones and volumes away from where there would be distortion to where it sounds passable. That is how you get a small speaker, such as the UE speakers, to have good bass: they aren't actually playing bass-frequency notes, they're playing the bass somewhere closer to the mids.

But in a home theater setup, there aren't any such compromises - especially not with video.
Most people turn all that stuff off anyways, but it's still being upscaled. And by adding pixels to fit it in a 4k screen it is adding data to the file. It's really a non issue. My LG makes 1080p look like 4k. You'd be hard pressed to tell the difference.

That's the problem - I don't want it to "make" anything. I want it to just display the data it is given. I want 1080p to look like 1080p.
I understand what oneMadRssn is saying.
Basically there are two ways to do the upscaling:

1) The TV adds fake pixels to fill the 4K screen with the help of algorithms and/or post-processing. What you see is not true to the source; you don't know what color the algorithm will choose for the added pixels.

2) For each pixel in the 1080p source, the TV simply generates 4 perfectly identical pixels to fill the screen. Let's say there is a red pixel in the 1080p picture; now you have a perfect square of 2x2 red pixels. The picture is completely unaltered and so true to the source, like a magnified 1080p picture.

Like oneMadRssn, I much prefer the second method.
But unfortunately, last time I checked, most TV sets use the first technique even with all processing/upscaling options turned off.
At the time I found that only Panasonic's upscaler offered the choice of using the second technique.
This is why I wonder what the brand and model of his TV set is.

That's a good point. I haven't actually checked which technique is used. I just assumed that if all those features are turned off, then it would be #2.
 
That's the problem - I don't want it to "make" anything. I want it to just display the data it is given. I want 1080p to look like 1080p.

Well, you can't on a 4K TV, as it is adding pixels. Either a movie is shot in 1080p and upconverted for you when you buy the UHD disc, or it's shot in 4K and then downconverted when put on a 1080p disc. Either way the video is being altered, on disc or on the TV. The only way to display a true, unaltered 1080p feed on a 4K TV is for the image to be shrunk down to its original size with no pixels added.

I honestly just don't get it. I think you're taking the purist thing wayyyy too far.
 
Well, you can't on a 4K TV, as it is adding pixels. Either a movie is shot in 1080p and upconverted for you when you buy the UHD disc, or it's shot in 4K and then downconverted when put on a 1080p disc. Either way the video is being altered, on disc or on the TV. The only way to display a true, unaltered 1080p feed on a 4K TV is for the image to be shrunk down to its original size with no pixels added.

I honestly just don't get it. I think you're taking the purist thing wayyyy too far.

What is so difficult to understand about pixels?

If in 1080p, the data is a 4x2 grid like:

XOXO
OXOX

Then if the 4k tv displays it as:

XXOOXXOO
XXOOXXOO
OOXXOOXX
OOXXOOXX

No data is being added. It is still a 4x2 grid even in the 4k display.

I care only when the 4k TV tries to guess what data is missing to make it look more 4k-ish, such as:

XXOOXXOO
XOOXXOOX
OOXXOOXX
OXXOOXXO
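
For what it's worth, the "no data added" version above is trivial to express in code. A tiny Python sketch (hypothetical, just to mirror the grids) that doubles each pixel in both directions:

rows = ["XOXO",
        "OXOX"]

doubled = []
for row in rows:
    wide = "".join(ch * 2 for ch in row)  # duplicate each pixel horizontally
    doubled.extend([wide, wide])          # duplicate each line vertically

print("\n".join(doubled))
# XXOOXXOO
# XXOOXXOO
# OOXXOOXX
# OOXXOOXX

Every output pixel comes straight from a source pixel; nothing is guessed.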
 
What is so difficult to understand about pixels?

If in 1080p, the data is a 4x2 grid like:

XOXO
OXOX

Then if the 4k tv displays it as:

XXOOXXOO
XXOOXXOO
OOXXOOXX
OOXXOOXX

No data is being added. It is still a 4x2 grid even in the 4k display.

I care only when the 4k TV tries to guess what data is missing to make it look more 4k-ish, such as:

XXOOXXOO
XOOXXOOX
OOXXOOXX
OXXOOXXO

Both of these are up-scaling. As has been said you can't show a 1080p image on a 4k screen without some sort of up-scaling unless you want it to fill 1/4 of the screen. The former method you describe would produce the most pixelation. You really don't want your source device or TV doing that. The best upscalers use complex algorithms to attempt to fill in missing data and keep edges sharp. We went through all this with the transition from SD to HD. Same thing now with 4K. It's just the nature of these resolution transitions. In general high quality 1080p sources will look good and sometimes better on 4K. But low quality sources tend to look worse. No way around this issue. The only way to show an unaltered 1080p signal is on a 1080p display.
 
Both of these are up-scaling. As has been said you can't show a 1080p image on a 4k screen without some sort of up-scaling unless you want it to fill 1/4 of the screen. The former method you describe would produce the most pixelation. You really don't want your source device or TV doing that. The best upscalers use complex algorithms to attempt to fill in missing data and keep edges sharp. We went through all this with the transition from SD to HD. Same thing now with 4K. It's just the nature of these resolution transitions. In general high quality 1080p sources will look good and sometimes better on 4K. But low quality sources tend to look worse. No way around this issue. The only way to show an unaltered 1080p signal is on a 1080p display.

Agree to disagree then.

I do want my tv doing that. I don't want them to use complex algorithms to fill in missing data.

I disagree with your last sentence though. Pixels have dimensions, right? Is a single pixel that is 1mm x 1mm showing red any different from 4 pixels that are 0.5mm x 0.5mm each showing red? Either way, I get a 1mm x 1mm red dot, right?

In other words, a 1080p source file should look the same on a 50" 1080p TV as on a 50" 4K TV.

So we can argue semantics all day long about upscaling, upconverting, whatever. But I don't consider showing 1 pixel of a 1080p source file as 4 pixels on a 4k tv to be any kind of fancy upconverting.
 
What is so difficult to understand about pixels?
It's not about the pixels. It's about your obsession with this. That's what I don't get. Wanting something to look worse than it can simply because that's what's on the disc. When it can easily look better.
It's like buying a movie on UHD that was not shot in 4k. They process it the same way and put it on a disc. You certainly aren't getting the original as it was shot. But it looks better.

To each their own I guess.
 
Agree to disagree then.

I do want my tv doing that. I don't want them to use complex algorithms to fill in missing data.

I disagree with your last sentence though. Pixels have dimensions, right? Is a single pixel that is 1mm x 1mm showing red any different from 4 pixels that are 0.5mm x 0.5mm each showing red? Either way, I get a 1mm x 1mm red dot, right?

In other words, a 1080p source file should look the same on a 50" 1080p TV as on a 50" 4K TV.

So we can argue semantics all day long about upscaling, upconverting, whatever. But I don't consider showing 1 pixel of a 1080p source file as 4 pixels on a 4k tv to be any kind of fancy upconverting.

This is not semantics or opinion. When you try to display lower-res images on higher-res screens there are limitations. When we moved from SD to HD there was a whole market for external processors to try and upscale to get the best possible image. The simple method of just duplicating pixels doesn't look good. The simplest example is taking a lower-res photo and enlarging it to fill a higher-res screen, which is just duplicating pixels - it looks bad and pixelated. It can be somewhat improved upon by using image editing software to resize, but there are still limits.
 
Isn't that the whole point of algorithms?
Figuring out what is missing using the information we do have is nothing new.
That's how compression works.
Well, there are compressions and there are compressions. Some discard data (for whatever reason; for humans, mostly detail we can't perceive), some do not.
That's how CDs/DVDs still play with scratches.
That's how hard drives can still work with bad sectors.
2...4...6...8...?...12...14... What's missing?
These are bad examples, because for these use cases we have actually stored some special extra error-correction data (checksums and parity). So there is more information stored than was in the original payload. And it does not make the discarded frequencies reappear in MP3/ATRAC-compressed audio.
I have no obsession whatsoever with compression, but I consider this claim of quality restoration out of nowhere (in the case of digital images and music, out of parts of the source signal that were outright discarded) to be a marketing gimmick.
Easy test - try to pull the shadow areas from a JPEG file and a RAW file. In the JPEG we consciously discarded some of the information (as imperceptible subtlety), so it cannot be restored artificially, by algorithm.
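
A minimal Python sketch of that shadow test, with made-up numbers standing in for real RAW and JPEG files:

import numpy as np

# Subtle shadow detail as a high-bit-depth "RAW" capture might record it.
raw_shadows = np.array([3.2, 3.8, 2.9, 4.1, 3.5])

# Aggressive lossy encoding crushes that range into one quantized level.
jpeg_shadows = np.round(raw_shadows / 4) * 4   # -> [4. 4. 4. 4. 4.]

# "Pulling" the shadows (boosting them) afterwards:
print(raw_shadows * 8)    # the original variation is amplified and visible
print(jpeg_shadows * 8)   # [32. 32. 32. 32. 32.] - the detail is gone

Once the quantizer has collapsed those five different values into one, no amount of boosting brings the differences back.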
 
Anyone tested on a 4th gen Apple TV to see whether auto framerate switching is introduced on that model?

I did, and it doesn't. The non-4k ATVs don't have any option for 24p at all, which just plain sucks.

*changes your avatar to a "crying man" one* ;)
 
And I thought my explanation was pretty clear. What a fool I was :p
OK, both methods are "upscaling"; there is no choice, the screen has to be completely filled with the picture.
But one method involves alterations to the picture, with algorithms guessing what to do and trying to embellish the quality in an artificial way.
And the other method is a pixel-perfect representation of the 1080p picture. The equivalent of 1 pixel is now 4 identical pixels for the purpose of filling the screen, but the picture itself is not altered at all.

Let's say you have a 55" 1080p TV and a 55" 4K TV. With the latter method you will see the exact same picture on both TVs. Exactly the same!
And there is nothing wrong with a good quality 55" 1080p TV, so there is absolutely no reason for a 1080p picture to look bad on a 55" 4K TV set using the second method.
 
It's not about the pixels. It's about your obsession with this. That's what I don't get. Wanting something to look worse than it can simply because that's what's on the disc. When it can easily look better.
It's like buying a movie on UHD that was not shot in 4k. They process it the same way and put it on a disc. You certainly aren't getting the original as it was shot. But it looks better.

To each their own I guess.

No, I don't want it to look worse. I want it to look like it's supposed to. My issue with the real-time on-TV processing is that it often makes things look worse. Sometimes it's the soap-opera effect of frame interpolation, sometimes it removes film grain that should be there, sometimes it jacks up colors in scenes that were designed to look dull, sometimes anti-aliasing can make lines sort of vibrate or jitter, sometimes it adds artifacts or banding in gradients. All of that is worse than just accepting the fact that you only have a 1080p source and viewing it as is.

When professionals go through a frame-by-frame upgrade of a lower-quality source, that is different than on-tv real-time processing. At least there, a human with eyes and discretion is applying the algorithms.
This is not semantics or opinion. When you try to display lower-res images on higher-res screens there are limitations. When we moved from SD to HD there was a whole market for external processors to try and upscale to get the best possible image. The simple method of just duplicating pixels doesn't look good. The simplest example is taking a lower-res photo and enlarging it to fill a higher-res screen, which is just duplicating pixels - it looks bad and pixelated. It can be somewhat improved upon by using image editing software to resize, but there are still limits.

You're assuming a 4k tv is necessarily bigger than a 1080p tv. Using pixel duplication, a 1080p movie on a 50" 1080p tv should look the same on a 50" 4k tv. No blowing up anything - same size.

Of course if I take a 2MP photo and blow it up to be 10 feet wide it will look terrible. That's not what I'm talking about, at all. I mean that no filter, no algorithm, nothing will make a 2MP photo look better at 3x5 than it already does at 3x5.
You guys have ruined this thread, I was loving reading about the TvOS update and you guys have spoilt it with utter geek stupidity. Watch an image that looks great 'to your eyes'. You've all gone too far on this! You belong in some AV forum.

Problem is, what we learned yesterday about the update seems to be all there is to learn so far. I'm very happy with this change, but there aren't exactly daily developments, ya know?
 
No, I don't want it to look worse. I want it to look like it's supposed to. My issue with the real-time on-TV processing is that it often makes things look worse. Sometimes it's the soap-opera effect of frame interpolation, sometimes it removes film grain that should be there, sometimes it jacks up colors in scenes that were designed to look dull, sometimes anti-aliasing can make lines sort of vibrate or jitter, sometimes it adds artifacts or banding in gradients. All of that is worse than just accepting the fact that you only have a 1080p source and viewing it as is.

I've never seen it look worse. Not ever. Maybe the LG OLED just has better processing, but my 1080p content always looks better in every way. My rips streamed through my Oppo look great as well.
 
I've never seen it look worse. Not ever. Maybe the LG OLED just has better processing, but my 1080p content always looks better in every way. My rips streamed through my Oppo look great as well.

I'm glad you're happy with it. I prefer to keep source as is.
 
This is my last comment about this so we don't continue to derail this thread. In theory I do understand what some of you are saying: that using 4 smaller pixels to stand in for 1 pixel on a 4K TV should be equivalent to watching the same content on a 1080p display. I don't know what the technical reasons are, but in practice there seems to be a difference between a single larger pixel and a group of combined pixels. It just doesn't look the same. Probably has to do with the space in between the pixels.
 
Has anyone noticed if this Beta addresses the issues with the remote, especially on Apple TV 4k?
 