Hypothetical situation for the sake of my understanding:
Two computers, identical in hardware except that one has a native display resolution of 1440x900 (A) and the other 2880x1800 (B).
Now, a series of thought questions assuming both are running at a resolution of 1440x900:
1. How would the display on computer A differ from that of computer B? Would one look blurrier than the other?
2. What would be the difference in performance between the two computers? This is obviously negligible for standard tasks, but in intensive 3D applications a 20% difference in performance can mean the difference between smooth and stuttery.
3. How would battery usage differ between the two computers? (again, they are on the same resolution)
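For context on the premise, here's a quick back-of-the-envelope sketch of how a 1440x900 image would map onto each panel. It assumes the 2880x1800 panel fills its screen by simple integer (2x) pixel duplication; real GPUs and panels may use other scaling filters, so treat it only as arithmetic, not a claim about any specific machine.

```python
# Rough arithmetic on how a 1440x900 rendered image maps onto each panel.
# Assumption: the 2880x1800 panel upscales by plain pixel duplication
# (integer scaling); actual hardware may interpolate differently.

native_a = (1440, 900)    # computer A's panel
native_b = (2880, 1800)   # computer B's panel
rendered = (1440, 900)    # resolution both machines are asked to run at

# Scale factor needed to fill each panel from the rendered image
scale_a = (native_a[0] / rendered[0], native_a[1] / rendered[1])  # (1.0, 1.0)
scale_b = (native_b[0] / rendered[0], native_b[1] / rendered[1])  # (2.0, 2.0)

print(f"A: scale {scale_a}, physical pixels {native_a[0] * native_a[1]:,}")
print(f"B: scale {scale_b}, physical pixels {native_b[0] * native_b[1]:,}")

# B's scale factor is an exact integer (2x on each axis), so every rendered
# pixel can map cleanly onto a 2x2 block of physical pixels. A non-integer
# ratio would force interpolation, which is what typically causes blur.
```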