That's what upscaling is.
A 4k TV displaying a 1080p image that is not upconverted will not fill the screen. You just get a shrunk-down image in the middle of the screen.
https://www.cnet.com/news/can-4k-tvs-make-1080p-look-better/
No, that's not what upscaling is. If a 4k tv is just using 4 pixels to display 1 pixel of data, it isn't upscaling. It's just mimicking a 1080p tv.
The article you linked explains this well. It also says that upscaling involves not just making the image fit, but also an attempt to “add detail”.
In addition to upconverting, upscaling also applies various antialiasing, sharpening, and other algorithms to fill in the data and make it look like a 1080p image is actually a 4k image.
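To illustrate the "sharpening" part, here is a minimal Python sketch of a 3x3 sharpening convolution. It is a toy example, not what any actual TV chipset runs - real scalers use far more sophisticated, adaptive filtering.

def sharpen(image):
    # Basic 3x3 sharpen kernel: boosts the centre pixel against its neighbours.
    kernel = [[ 0, -1,  0],
              [-1,  5, -1],
              [ 0, -1,  0]]
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0
            for ky in range(3):
                for kx in range(3):
                    # clamp at the edges instead of reading outside the image
                    sy = min(max(y + ky - 1, 0), h - 1)
                    sx = min(max(x + kx - 1, 0), w - 1)
                    acc += kernel[ky][kx] * image[sy][sx]
            out[y][x] = min(max(acc, 0), 255)  # keep values in the 0-255 range
    return out

# A bright pixel surrounded by dark ones gets boosted and its neighbours get darkened,
# which is exactly the "extra detail" look this processing is after.
print(sharpen([[10, 10, 10], [10, 200, 10], [10, 10, 10]]))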
Alright, there is no point to arguing semantics. Especially since it sounds like we're saying the same thing.
Hahahahaha, ahhhh no.
It is definitely upscaling.
The upscaling process can add antialiasing, and sharpening, and whatever else is programmed. But none of those things make it any more or less upscaled.
ANY image that is displayed on a 4K TV that is not natively 4K is upscaled.
This is some of the modern marketing BS. You can see it everywhere. Some HiRes audio upscalers, for instance, are supposed to "add detail" to otherwise heavily compressed MP3 audio.
I just don't understand how it can be possible to later reinvent missing information that was once discarded during lossy compression. It sounds like making something out of nothing.
Most people turn all that stuff off anyways, but it's still being upscaled. And by adding pixels to fit it in a 4k screen it is adding data to the file. It's really a non issue. My LG makes 1080p look like 4k. You'd be hard pressed to tell the difference.
If the image is displayed with no real-time on-tv post-processing, then it's fine by me. If there is any antialiasing, sharpening, de-noising, etc, I don't want it. In other words, I want the TV to display the data in the file, and nothing else.
I understand what OneMadrssn is saying.
Basically there are two ways to do the upscaling:
1) The TV adds fake pixels to fill the 4k screen with the help of algorithms and/or post-processing. What you see is not true to the source; you don't know what color the algorithm will choose for the added pixels.
2) For each pixel in the 1080p source, the TV simply generates 4 perfectly identical pixels to fill the screen. Say there is a red pixel in the 1080p picture; now you have a perfect 2x2 square of red pixels. The picture is completely unaltered and true to the source, like a magnified 1080p picture.
Like OneMadrssn, I much prefer the second method.
But unfortunately, last time I checked, most TV sets use the first technique even with all processing/upscaling options turned off.
At the time I found that only Panasonic's upscaler offered the choice of using the second technique.
This is why I wonder what the brand and model of his TV set is.
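A rough Python sketch of the difference between the two methods. It is illustrative only - real TV scalers use far more elaborate filters than this simple bilinear example - but it shows why method 2 never invents values while method 1 does.

# Method 2: plain pixel replication - every source pixel becomes an identical 2x2 block.
def replicate_2x(image):
    out = []
    for row in image:
        doubled = [p for p in row for _ in range(2)]  # duplicate each pixel horizontally
        out.append(doubled)
        out.append(list(doubled))                     # and repeat the row vertically
    return out

# Method 1 (simplified): bilinear interpolation - new pixels are weighted averages
# of their neighbours, so values appear that were never in the source.
def bilinear_2x(image):
    h, w = len(image), len(image[0])
    out = [[0.0] * (w * 2) for _ in range(h * 2)]
    for y in range(h * 2):
        for x in range(w * 2):
            sy, sx = y / 2, x / 2
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            top = image[y0][x0] * (1 - fx) + image[y0][x1] * fx
            bottom = image[y1][x0] * (1 - fx) + image[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bottom * fy
    return out

# replicate_2x([[0, 100]]) only ever outputs 0s and 100s;
# bilinear_2x([[0, 100]]) produces 50s that were never in the source.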
That's the problem - I don't want it to "make" anything. I want it to just display the data it is given. I want 1080p to look like 1080p.
Well you can't on a 4k TV, as it is adding pixels. Either a movie is shot in 1080p and upconverted for you when you buy the UHD, or it's shot in 4k and then downconverted when put on a 1080p disc. Either way the video is being altered, on the disc or on the TV. The only way to display a true, unaltered 1080p feed on a 4k tv is for the image to be shown at its original size, filling only part of the screen, with no pixels added.
I honestly just don't get it. I think you're taking the purist thing wayyyy too far.
What is so difficult to understand about pixels?
If in 1080p, the data is a 4x2 grid like:
XOXO
OXOX
Then if the 4k tv displays it as:
XXOOXXOO
XXOOXXOO
OOXXOOXX
OOXXOOXX
No data is being added. It is still a 4x2 grid even in the 4k display.
I care only when the 4k TV tries to guess what data is missing to make it look more 4k-ish, such as:
XXOOXXOO
XOOXXOOX
OOXXOOXX
OXXOOXXO
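For what it's worth, the plain duplication case above is just a couple of lines of toy Python (obviously not how a TV implements it), and the output is character-for-character the same grid:

grid_1080p = ["XOXO",
              "OXOX"]

grid_4k = []
for row in grid_1080p:
    doubled = "".join(ch * 2 for ch in row)  # XOXO -> XXOOXXOO
    grid_4k.append(doubled)
    grid_4k.append(doubled)                  # repeat the row to double it vertically

for row in grid_4k:
    print(row)
# Prints:
# XXOOXXOO
# XXOOXXOO
# OOXXOOXX
# OOXXOOXX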
Both of these are up-scaling. As has been said you can't show a 1080p image on a 4k screen without some sort of up-scaling unless you want it to fill 1/4 of the screen. The former method you describe would produce the most pixelation. You really don't want your source device or TV doing that. The best upscalers use complex algorithms to attempt to fill in missing data and keep edges sharp. We went through all this with the transition from SD to HD. Same thing now with 4K. It's just the nature of these resolution transitions. In general high quality 1080p sources will look good and sometimes better on 4K. But low quality sources tend to look worse. No way around this issue. The only way to show an unaltered 1080p signal is on a 1080p display.
Agree to disagree then.
I do want my tv doing that. I don't want them to use complex algorithms to fill in missing data.
I disagree with your last sentence though. Pixels have dimensions, right? Is a single pixel that is 1mm x 1mm showing red any different than 4 pixels that are .5mm x .5mm each showing red? Either way, I get a 1mm x 1mm red dot, right?
In other words, a 1080p source file should look the same on a 50" 1080p tv as a 50" 4k tv.
So we can argue semantics all day long about upscaling, upconverting, whatever. But I don't consider showing 1 pixel of a 1080p source file as 4 pixels on a 4k tv to be any kind of fancy upconverting.
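The millimetre figures above are just round numbers, but the idea checks out for a real panel. A quick back-of-the-envelope calculation in Python for a hypothetical 50-inch 16:9 screen:

import math

diagonal_in = 50.0                        # hypothetical 50-inch 16:9 panel
width_in = diagonal_in * 16 / math.hypot(16, 9)

pitch_1080p = width_in / 1920 * 25.4      # pixel pitch in mm
pitch_4k    = width_in / 3840 * 25.4

print(round(pitch_1080p, 3))  # ~0.577 mm
print(round(pitch_4k, 3))     # ~0.288 mm - exactly half the pitch
# So a 2x2 block of 4K pixels covers the same physical area as one 1080p pixel.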
Well, there are compressions and there are compressions. Some discard data (for whatever reason - for humans, mostly data we can't perceive anyway), some do not.
Isn't that the whole point of algorithms?
Figuring out what is missing using the information we do have is nothing new.
That's how compression works.
These are bad examples, because for these use cases we have actually stored some special extra data - the checksum. So, there is more information stored than was in the original payload. And it does not make the discarded frequencies reappear in the MP3/ATRAC compressed audio.
That's how CD's/DVD's still play with scratches.
That's how hard drives can still work with bad sectors.
2...4...6...8...?...12...14... What's missing?
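Just to put both points in code - a toy Python example, nothing to do with how any real codec or scaler works:

samples = [2, 4, 6, 8, None, 12, 14]           # one value was discarded

i = samples.index(None)
guess = (samples[i - 1] + samples[i + 1]) / 2  # interpolate from the neighbours
print(guess)                                   # 10.0 - right only because the series is linear

# If the discarded value had actually been 9, the interpolator would still output 10:
# it produces a plausible value, not the one that was thrown away.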
Anyone tested on a 4th gen Apple TV to see whether auto framerate switching is introduced on that model?
It's not about the pixels. It's about your obsession with this. That's what I don't get. Wanting something to look worse than it can simply because that's what's on the disc. When it can easily look better.
It's like buying a movie on UHD that was not shot in 4k. They process it the same way and put it on a disc. You certainly aren't getting the original as it was shot. But it looks better.
To each their own I guess.
This is not semantics or opinion. When you try to display lower-res images on higher-res screens there are limitations. When we moved from SD to HD there was a whole market for external processors to try and upscale to get the best possible image. The simple method of just duplicating pixels doesn't look good. The simplest example is taking a lower-res photo and enlarging it to fill a higher-res screen by duplicating pixels - it looks bad and pixelated. It can be somewhat improved upon by using image editing software to resize, but there are still limits.
You guys have ruined this thread, I was loving reading about the TvOS update and you guys have spoilt it with utter geek stupidity. Watch an image that looks great 'to your eyes'. You've all gone too far on this! You belong in some AV forum.
No, I don't want it to look worse. I want it to look like it's supposed to. My issue with the real-time on-tv processing is it often makes things look worse. Sometimes it's the soap-opera effect of frame interpolation, sometimes it removes film grain that should be there, sometimes it jacks up colors in scenes that were designed to look dull, sometimes anti-aliasing can make lines sort of vibrate or jitter, sometimes it adds artifacts or bands in gradients. All of that is worse than just accepting the fact you only have a 1080p source and viewing it as is.
I've never seen it look worse. Not ever. Maybe the LG OLED just has better processing, but my 1080p content always looks better in every way. My rips streamed through my Oppo look great as well.