KnightWRX - question to help me get my head round all this resolution stuff.

If I drop the resolution of the high-res 15" MacBook Pro (call it A) from 1680x1050 to 1440x900, and compare that to the standard-res 15" MacBook Pro at 1440x900 (call it B), then the clarity on 'A' won't be as sharp as 'B' even though they are the same screen size. I know the LCD isn't at its native resolution in 'A', but given that the elements are the same size and resolution as on 'B', why does 'A' not look as sharp? Or do images appear as sharp, and it is the font scaling that suffers for whatever reason?

That's where my confusion has come in - LCDs only look good at their native resolution, and dropping them below that impacts sharpness/clarity. That's what I based my previous argument on, because I assumed that elements would not look as sharp as they would on a 1440x900 display if the elements are the same resolution when upscaled to a 2880x1800 display of the same size.

It looks blurry because the pixels don't match up. 1680 : 1440 = 7 : 6. That means when you switch your display to 1440x900, it has to display six pixels in seven physical pixels on the screen which is impossible to do well. With double the resolution, it has to display one pixel in two physical pixels on the screen, which means it just displays that pixel twice.

On the other hand, let's say you have this hi res MBP, and you want to display 1680 x 1050. In that case the hardware has to map 7 pixels to 12 pixels, which cannot be done perfectly, but with _less_ blur than with your current hardware because the pixels are only about half the size, so the blur is much less visible.
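The two mappings described above can be sanity-checked with a few lines of arithmetic (a minimal illustration of the ratios, nothing platform-specific):

```python
# Compare how cleanly a 1440-wide logical image maps onto different
# physical panel widths: an integer ratio means clean pixel doubling,
# a fractional one means interpolation (and blur).
from fractions import Fraction

def scale_ratio(physical_px, logical_px):
    """Physical pixels available per logical pixel, as an exact fraction."""
    return Fraction(physical_px, logical_px)

for panel in (1680, 2880):
    r = scale_ratio(panel, 1440)
    verdict = "clean doubling" if r.denominator == 1 else "interpolated (blurry)"
    print(f"1440 px on a {panel} px panel: ratio {r} -> {verdict}")
# 1680/1440 reduces to 7/6 (blurry); 2880/1440 reduces to 2 (clean).
```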
 
To handle the same graphic fidelity (max settings, max res) the GPU will have to push 3x more pixels than before. What I'm hoping is that Apple sticks in an incredible graphics chip; what I'm dreading is that Apple doesn't, and games will have to be played at half the linear resolution (or a quarter of the pixels, whichever way you work it out).
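The pixel counts behind that "3x" figure are simple arithmetic:

```python
# Pixel counts for the rumored 2880x1800 panel versus the current
# 15" options, to sanity-check the "3x more pixels" claim.
panels = {
    "2880x1800 (rumored retina)": 2880 * 1800,
    "1680x1050 (high-res)": 1680 * 1050,
    "1440x900 (standard)": 1440 * 900,
}

retina = panels["2880x1800 (rumored retina)"]
for name, count in panels.items():
    print(f"{name}: {count:,} pixels ({retina / count:.2f}x)")
# 2880x1800 is ~2.94x the pixels of the high-res panel, and exactly
# 4x the standard 1440x900 panel.
```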

Gaming, high resolution video editing, graphics work (or anything that's GPU intensive) at 5120x2880 wouldn't be a fun thing on existing iMac GPUs.
1) This thread is about the MBP, and the current (or even a 3-4 year old) MBP can already drive the proposed retina display.

2) Depending on what kind of path Apple is following, if they take the same approach as the iPhone 4 and at first just render text and special UI elements at retina resolution, you wouldn't need to push 3x more pixels than before (3x wouldn't be the right number anyway).

3) Yeah, and if we move on to talking about games (it seems like a lot of people in this thread only focus on games when talking about GPUs), that's not up to Apple, and I doubt that NVIDIA or AMD could make a 4x faster GPU in time for the next iMac; no chance at all, I would say.

4) High resolution video editing? "high resolution video editing, graphics work (or anything that's GPU intensive) at 5120x2880" That works with the current iMac, but what does 5120x2880 have to do with high resolution video editing? Yeah, 5120x2880 would be the pixel count for your proposed retina iMac, but that has nothing to do with how much GPU power a high resolution video is using... also, graphics work isn't dependent on the pixels your screen is pushing...

So I really don't get why you are dragging the retina iMac display's pixel count into how hard it would be to use the current iMac's GPU for high resolution video editing. A 4K movie would require just as much horsepower no matter what resolution your screen has... the same goes for graphics work.
 
There is considerable concern about "pushing pixels" with a modern OS, because there are 3D graphics effects and pixel shaders being used all over the place. Apple has been utilizing the graphics card for several OS releases now. The integrated graphics in the i5/i7 chips already limit how many pixels you can output on an external display. The frame buffer is not large enough on anything less than the discrete card in the upgraded current-generation 15" to handle something like a 2.8K screen.

Uh? That is so false I don't know where you got it from. The MBP 13" can power two 2560x1440 monitors, for one.

About 30 MB of frame buffer RAM is required to power a 2560x1440 double-buffered screen at 32 bits per pixel (since the frame buffer is going to retain the alpha value of RGBA pixels).
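That 30 MB figure is back-of-the-envelope arithmetic, assuming 4-byte RGBA pixels and plain double buffering:

```python
# Memory for a double-buffered 2560x1440 framebuffer at 32 bits
# (4 bytes) per RGBA pixel.
width, height = 2560, 1440
bytes_per_pixel = 4  # 8 bits each for R, G, B, A
buffers = 2          # front buffer + back buffer

total_bytes = width * height * bytes_per_pixel * buffers
print(f"{total_bytes / 1e6:.1f} MB")  # prints "29.5 MB", i.e. roughly 30 MB
```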

No, frankly, the integrated graphics in the i5/i7 and even a lot of integrated graphics over the last few years have had no such limitation.
 
There is so much dumb in this thread that it makes my head hurt. Starting with the initial story.

There is no logical reason for Apple to double the resolution of a desktop display. The current "standard" 15-inch MBP resolution clocks in at 110 dpi. The high-res option is 128 dpi, which is about the same as the 13-inch MBA.
True, but there is a threshold where you have to consider if the dpi is too high to be usable, and then designate it "HiDPI" (like Retina on iOS)

This is the reason why I'd rather have true resolution independence support in the OS. Let Apple decide how large (in real units) UI elements should be and draw them the same size on all displays. Let the user adjust that size as they see fit. The only UI elements that wouldn't change, of course, are rulers, which would always show the correct length on all displays.
 
I like the concept of a retina display, but like another poster said, I will not buy a new MBP unless it has a 1680x1050 (3360x2100) retina option. I will still buy a redesign with a 1680x1050 regular option. But going to any variation on the 1440 resolution isn't an option.
 
Movies have nothing to do with Desktops. Why you are trying to equate both is beyond me.
It's exactly the same thing. The number of pixels in a given area determines the resolution of the displayed image, period. Doesn't matter if it's a picture, a movie, or a desktop image. Higher resolution is always better.

Plus, tons of people watch movies on their computers. It's a completely valid comparison.
 
Actually, games are some of the few pieces of software out there that already implement decent resolution independence, so they'll look fine.

He's talking about the fact that games often cannot run at native resolution on a laptop, due to a lack of GPU or CPU power, and are thus forced to output at a non-native LCD resolution in order to maintain frame rates.

If you open up something like Portal 2 on a 13 inch MacBook Air, it will default to something lower than the native pixel size of the display. On a CRT, that was not an issue, because CRTs could deliver any number of "native" resolutions.

A 720p screen cannot display a 620p signal cleanly, though, so you end up with fuzzy pixels. This happens all the time on games, and is one of the reasons people end up wanting faster graphics cards with larger frame buffers.

It is true that if you must select a lower resolution, an exact 1/2 of the native res is a good choice, because then each projected pixel can take up exactly 4 real pixels. Anything other than that would look fuzzy. Even the 1/2 ratio will probably look a little fuzzy, but nothing so bad as being just a couple hundred pixels less.
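That exact-half case amounts to nearest-neighbor pixel doubling; a toy sketch (the pixel_double helper is hypothetical, purely for illustration):

```python
# Exact 2x nearest-neighbor upscaling: each logical pixel becomes a
# 2x2 block of identical physical pixels, so no interpolation (and
# therefore no blur) is ever needed.
def pixel_double(image):
    """Upscale a 2D list of pixel values by exactly 2x in each dimension."""
    out = []
    for row in image:
        doubled = [p for p in row for _ in (0, 1)]  # repeat each pixel horizontally
        out.append(doubled)
        out.append(list(doubled))                   # repeat the row vertically
    return out

print(pixel_double([[1, 2], [3, 4]]))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```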

The main reason that this isn't happening is because Apple hasn't completely converted OS X to handle high dpi screens, yet. There are still low-res pixel-based elements all over the UI.
 
Could someone explain to the simpletons like me who don't quite get it? A 2880x1800 display will not give us the actual screen real estate (shame), it will just make things appear clearer, is that right?

So in theory you could open lots of windows on the screen that are really small, but with this resolution you could actually read the text, whereas if you did it currently the text would be too small and unreadable?

Is that the gist of it?
 
Cool option that I would never choose, because I will just use my HD Cinema Display for serious work anyway. I will soon get a 13-inch MacBook Pro.
 
Take a look at the image from the original news post. The Apple icon on the top left, for example: that is how all web images will look on a 2880x1800 15" display, not sharp at all. All web images will have to be blown up to keep the same physical real estate on screen as a 1440x900 display. Text will obviously scale beautifully, but all images will look nasty. The internet world is not ready for this. I have my doubts.
 
What I would like to know is:

what does this proposed retina display's pixels-per-inch size mean to someone who is near-sighted?

The 1900 x... resolution on my 2010 iMac is terrible, very tiny. I've played with the resolution and don't like running it at a lower setting. Zooming in or enlarging text is no better for me.

I want to be able to read and write at a comfortable distance without having to tweak settings.
 
Could someone explain to the simpletons like me who don't quite get it? A 2880x1800 display will not give us the actual screen real estate (shame), it will just make things appear clearer, is that right?

So in theory you could open lots of windows on the screen that are really small, but with this resolution you could actually read the text, whereas if you did it currently the text would be too small and unreadable?

Is that the gist of it?
It's however they want to develop it. With more pixels, you can do either: design icons and graphical elements at the same scale they were previously (which will appear sharper), or design them at a smaller scale (which nets more screen real estate). Of course, you can do a combination of both as well.

It's likely they will just double the pixel density (which keeps screen real estate the same) as they did with the iPhone 4 to keep things nice and simple. But one could develop it however they wish.
 
It's exactly the same thing. The number of pixels in a given area determines the resolution of the displayed image, period. Doesn't matter if it's a picture, a movie, or a desktop image. Higher resolution is always better.

Plus, tons of people watch movies on their computers. It's a completely valid comparison.

No, not when talking about screen real-estate it's not. Movies have a fixed resolution. They are either upscaled or downscaled. When talking about real-estate, I don't want "upscaling" of my fonts, I want more letters on screen.

So in reply to my original comment, your video example was atrocious. I'm saying there's not enough text showing up on my screen at 1440x900, I want more text visible without scrolling. 2880x1800 HiDPI doesn't fix that, it just makes the text that's already there sharper. Nice. Give us more real-estate instead. 1680x1050 by default on the 15" MBP. 1920x1200 option on the 15" MBP.
 
Uh ? That is so false I don't know where you got it from. The MBP 13" can power 2 2560x1440 monitors for 1.

30 MB of Frame buffer ram are required to power a 2560x1440 double buffered screen at 32 bits per pixel (since the frame buffer is going to retain the alpha value of RGBA pixels).

No, frankly, the integrated graphics in the i5/i7 and even a lot of integrated graphics over the last few years have had no such limitation.

Which 13 MBP allows two external monitors? There is no daisy chaining at all with the thunderbolt port for displays. None.

And while 30 MB of frame buffer might be all you need to fill 2560x1440 pixels with a series of static 2560x1440 images, that is not all that the frame buffer of a modern GPU on a modern OS is asked to do.

http://www.anandtech.com/show/2804
 
It's however they want to develop it. With more pixels, you can do either: design icons and graphical elements at the same scale they were previously (which will appear sharper), or design them at a smaller scale (which nets more screen real estate). Of course, you can do a combination of both as well.

It's likely they will just double the pixel density (which keeps screen real estate the same) as they did with the iPhone 4 to keep things nice and simple. But one could develop it however they wish.

It's all nice and dandy for Mac apps, but you people tend to forget about the all-powerful internet. Sites are not designed for retina screens that large. Every image on the web that was designed for 72 dpi will need to be upscaled, meaning it will not look great. Text can upscale fine, but images not so much.
 
Text will obviously scale beautifully but all images will look nasty. The internet world is not ready for this. I have my doubts.
The apple looks nasty because it has been scaled. If you bother to read the article, it states quite clearly "Apple has even added ultra-high resolution artwork in Lion with desktop images at 3200x2000 pixels and icons at 1024x1024 pixels." All that needs to happen is for designers and developers to create new elements at the new, higher resolution (just like they did for the iPhone 4). Sure, it took a few months before most apps took advantage of that benefit, but now nearly everything is available in glorious 320 ppi. A move to high pixel density on a laptop would be no different.
 
Which 13 MBP allows two external monitors? There is no daisy chaining at all with the thunderbolt port for displays. None.

Hum... you missed the part where it works if you put a device between the 2 TB displays. ;)

And while 30 MB of frame buffer might be all you need to fill 2560x1440 pixels with a series of static 2560x1440 images, that is not all that the frame buffer of a modern GPU on a modern OS is asked to do.

http://www.anandtech.com/show/2804

Sure it's not, but that's why we haven't had 32 MB of VRAM for quite a few years now. Don't worry your little head, the current GPUs aren't taxed at all.
 
Just as a thought, are people confusing the ability of graphics cards to push certain resolutions with the strain placed by anti-aliasing, etc.? Traditional gaming would struggle with high resolutions with the inclusion of AA, MSAA, and everything else using current hardware. Retina displays would remove the need for that in many cases, though (presumably). For anything else, even video, current hardware ought to be able to handle such resolutions just fine.

That being said, I would undoubtedly be severely tempted to buy a machine like this. The screen on my iPhone 4 is very nice, and a similar computing experience would be wonderful for reading text and looking at figures in journal articles.
 
It's all nice and dandy for Mac apps, but you people tend to forget about the all-powerful internet. Sites are not designed for retina screens that large. Every image on the web that was designed for 72 dpi will need to be upscaled, meaning it will not look great. Text can upscale fine, but images not so much.

The browser will simply blow up the images. It will look the same as it does currently. We had that discussion two pages ago.

A 1440x900 15" display showing a 100x100 image will look the same as a 2880x1800 15" display showing the same 100x100 image scaled to 200x200.
 
A 1440x900 15" display showing a 100x100 image will look the same as a 2880x1800 15" display showing the same 100x100 image scaled to 200x200.
But it will look vastly inferior to everything surrounding it that has been developed for a higher pixel density. I understand his point, but it assumes that no one is going to update their graphics to the new standard.

I disagree; would you want to be the ugly website in a sea of much better looking ones? People will get on the ball. I won't even download apps that aren't 320 ppi anymore (and obviously the number of those is shrinking quickly).
 
Sigh.

Games will be unplayable unless Apple puts some double-core new GPUs in it to compensate for the workload.

And don't come at me with a "lower the resolution" kind of ignorant response...
Downscaling to ANY NON-NATIVE resolution leads to a BLURRY image, because there is hardware-level, or worse, software-level interpolation involved.

Unless Apple cuts its margins to put a top-notch interpolation chip behind the screen (akin to those found in high-end flat screens and pro monitors: really expensive), you'll get a blurry result out of that screen at any resolution that's not the native one.
 
But it will look vastly inferior to everything surrounding it that has been developed for a higher pixel density. I understand his point, but it assumes that no one is going to update their graphics to the new standard.

It'll look the same as it does on the iPhone 4 right now compared to a 3GS. Do you find images "vastly" inferior to the fonts? I don't even notice, frankly.
 
But it will look vastly inferior to everything surrounding it that has been developed for a higher pixel density. I understand his point, but it assumes that no one is going to update their graphics to the new standard.

It won't look as bad as you think.

I disagree; would you want to be the ugly website in a sea of much better looking ones? People will get on the ball. I won't even download apps that aren't 320 ppi anymore (and obviously the number of those is shrinking quickly).

It isn't just Apple that's on the bandwagon for HiDPI displays. Once they start becoming more common, website developers will start creating HiDPI graphics. But they need a customer base worth doing that for, so it will probably be a slow process.
 
Sigh.

Games will be unplayable unless Apple puts some double-core new GPUs in it to compensate for the workload.

And don't come at me with a "lower the resolution" kind of ignorant response...
Downscaling to ANY NON-NATIVE resolution leads to a BLURRY image, because there is hardware-level, or worse, software-level interpolation involved.

Games are already "blurry" anyhow. I never actually use the LCD's native resolution in games because, frankly, the effect of using a non-native resolution is already offset by the intense filtering, blending and anti-aliasing that most games do.

Lowering the resolution on a game does not have as pronounced an effect as it does on the desktop.

Not to mention you could just run at 1440x900 on a 2880x1800 display and it'll look fine.
 
Do you find images "vastly" inferior to the fonts? I don't even notice, frankly.
Of course I do, it's like looking through a screen door. It's funny how once you get used to something superior, what once was acceptable now is completely off-putting.
 