If I want to watch a 4K video, do I need an app to change the native resolution? Does it stay in that simulated 1440x900 to get the Retina effect, or does it scale the video to the panel's native resolution?
hmmm, not really the answer I'm looking for. If I play a 4K video in full screen, does it scale the video to the native 2880 x 1800, or is it using the 1440 x 900 "retina" 4-subpixel mode?
but then of course people will forget about pixels...and that might be a bad thing.
That's the same thing, as the retina subpixel in this case is the pixel. If you have HiDPI content (such as 4K), it is rendered using all available physical pixels. So yes, drawing 4K content on a Retina screen will give you the best possible image quality. Otherwise there wouldn't be much point to the Retina display, would there?
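To make the point concrete, here's a minimal sketch of the arithmetic (the numbers are the 15" Retina MacBook Pro's; the variable names are just illustrative):

```python
# Illustration of HiDPI rendering on a 2880x1800 Retina panel.
# The logical ("point") resolution is 1440x900; each point is backed
# by a 2x2 block of physical pixels, so the scale factor is 2.

LOGICAL = (1440, 900)
SCALE = 2
PHYSICAL = (LOGICAL[0] * SCALE, LOGICAL[1] * SCALE)  # (2880, 1800)

# A 4K (3840x2160) frame exceeds the panel in both dimensions, so
# full-screen playback downscales it to the physical resolution,
# using every one of the 2880x1800 pixels, not the 1440x900 points.
VIDEO = (3840, 2160)
uses_all_pixels = VIDEO[0] >= PHYSICAL[0] and VIDEO[1] >= PHYSICAL[1]
print(PHYSICAL, uses_all_pixels)  # (2880, 1800) True
```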
----------
I think that's not a bad thing at all; if anything, it's the next natural step. People are too fixated on pixels, and they usually forget that pixels are a dirty hack in the first place, something that was only introduced because of our limited technology for displaying image data. Once the resolution of displays (and the capability of processing hardware) is high enough, pixels and all the corresponding nonsense like multi-sampling will be forgotten. What is important for image data is its spatial resolution: how much information is encoded per unit area. We are well on our way to moving away from pixels and using natural units (such as cm) in the UI. That would make so much more sense!
P.S. Regarding computer games: I wonder why people are still so fixated on resolutions. I mean, a non-native resolution will be blurry anyway, so why not make it more flexible? A much better option would be a performance/blurriness slider. First, it would let the user select exactly the IQ/fps tradeoff they are willing to take; second, it would avoid all the mess of dealing with screen-resolution changes within the application, since it would always run at native resolution. In fact, this is the preferred way to run OpenGL apps on OS X, and Apple has been advocating it for a while.
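The slider idea is simple to sketch: the game always presents at native resolution, but renders the 3D scene to an off-screen buffer sized by the slider and upscales at the end. The names below are illustrative, not any real engine's API:

```python
# Sketch of a "performance/blurriness slider": the display mode never
# changes; only the off-screen render target shrinks.

NATIVE = (2880, 1800)  # panel's physical resolution (Retina MBP example)

def render_target_size(slider: float) -> tuple[int, int]:
    """slider in (0, 1]: 1.0 = full native quality, lower = faster/blurrier."""
    if not 0 < slider <= 1:
        raise ValueError("slider must be in (0, 1]")
    return (round(NATIVE[0] * slider), round(NATIVE[1] * slider))

# Fill-rate cost scales with the pixel count of the render target:
full = NATIVE[0] * NATIVE[1]
for s in (1.0, 0.75, 0.5):
    w, h = render_target_size(s)
    print(s, (w, h), round(w * h / full, 3))
```

At 0.5 the game shades a quarter of the pixels, but the OS and the scaler still see a native-resolution surface, so nothing outside the app has to change.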