Don't focus too much on the actual resolution; think about pixel density instead.

Current screen: one black pixel
Retina screen: 4 smaller black pixels, and in HiDPI mode on Lion it'll look like 1 black pixel.

It's about showing more detail at a smaller scale while presenting it the same way to the user. Because there is more detail, the text will be much sharper.

If you were to think purely in terms of resolution, it would be one black pixel on the current screen, and one black pixel plus three unused pixels on the Retina screen.

Does that help?
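
A minimal sketch of the same idea, assuming a 2x scale factor (which is what a 2880x1800 panel driven at the current 1440x900 "look" would imply):

Code:
    # One "point" of UI (what used to be one pixel) is drawn with
    # scale x scale physical pixels in HiDPI mode.
    def physical_pixels_per_point(scale):
        return scale * scale

    print(physical_pixels_per_point(1))  # current screen: 1 pixel per point
    print(physical_pixels_per_point(2))  # Retina/HiDPI: 4 smaller pixels per point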

So I should look at it as higher pixel density instead of thinking that the higher resolution will be showing more information on the same size screen? It's not going to double what fits on a 1440x900 now?
 
So I should look at it as higher pixel density instead of thinking that the higher resolution will be showing more information on the same size screen? It's not going to double what fits on a 1440x900 now?

That's right; this does not double the amount of usable area for you. It's exactly the same area, just 2x sharper.
 
Maybe I'm just not understanding resolution independence. A 2880x1800 screen would be insanely clear, as long as the text isn't downsized. Currently, if you lower the resolution of a high-res screen, you lose some of the clarity, and the lower the resolution, the less sharp it becomes. So are you saying that with this much higher resolution you will be able to lower the resolution to make the text larger, yet the screen will be much sharper than it is with today's panels? I hope that made sense.

I think you had it right the first time.

Here's the full concept.
Let's say you started with a 15" 1280x800 screen. Horrible, right?
The pixels are loosely packed. As a result, the text looks very big, but also blurry and pixelated. Let's say this hypothetical text is 1"x1".

Then we move up the chain to a 15" 1440x900 screen.
The pixels are more densely packed, and the text is displayed using the same number of pixels. This causes the text to shrink. Now the text is something like 0.8" x 0.8".

On a 2560x1600 15" screen, the pixels would be so dense that the text would appear nearly microscopic. 0.1" x 0.1".


But with resolution independence, the 15" 2560x1600 screen would raise the size of the font automatically to match the desired size. These sizes are defined by the resolution. So if you set the 2560x1600 screen to display text at the size of the 1440x900 screen, instead of the text being 0.1" x 0.1", it would be 0.8" x 0.8".

HOWEVER! The system also raises the font size and scales it correspondingly. So for text that is 10pt, which hypothetically would be 0.1" x 0.1", the computer recognizes this and raises the displayed size to match the physical size on the 1440x900 screen, 0.8" x 0.8", which works out to something like 36pt.

But this would cause discrepancies if the font size actually changed. Imagine editing a 10pt document, then opening it on a 2560x1600 15" MBP and the computer raising the font to 36pt automatically! To avoid this, the computer raises the font size on the DISPLAY only. The actual font size remains 10pt, but the displayed font size is 36pt.

Because the displayed font size is raised, the text appears crisp and clean while retaining the size of 0.8" x 0.8" on a 2560x1600 15" display.
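
A rough sketch of that last idea, with illustrative PPI figures rather than the 0.1"/0.8" numbers above: the stored font size stays 10pt; only the number of pixels used to draw it changes with the pixel density.

Code:
    # A 10pt glyph should span 10/72 of an inch no matter what panel it's on.
    def pixels_for_point_size(pt, ppi):
        inches = pt / 72.0      # 1 typographic point = 1/72 inch
        return inches * ppi     # pixels the glyph spans at this density

    for ppi in (110, 220):      # roughly 1440x900 vs 2880x1800 on a 15" panel
        print(ppi, round(pixels_for_point_size(10, ppi), 1))
    # Same physical size either way; the denser panel just uses ~2x the pixels,
    # and the document itself still says 10pt.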
 
That's right; this does not double the amount of usable area for you. It's exactly the same area, just 2x sharper.

So it would be like comparing three TVs, all the same size: one standard definition, one 720p and one 1080p. All would have the exact same size picture, text, etc., but the sharpness and clarity would get better with each step up. So this new 2880x1800 would have the same size text and on-screen real estate as the current 1440x900?
 
Depends on what technologies they decide to use; we just won't know until they release it. AMOLED has the capability of being brighter without requiring more power.

1. Using AMOLED would either get rid of the glowing apple or require a separate LED for the apple logo.
2. AMOLED needs to be as bright as the current LCD panels, which it is not.
3. Bright AMOLED displays tend to burn out faster, especially the blue OLEDs.
4. Technologies that solve the problem above have been developed by other companies, mainly Samsung.
5. AMOLED is expensive, TN panels are not.
 
I think you had it right the first time.

Here's the full concept.
Let's say you started with a 15" 1280x800 screen. Horrible, right?
The pixels are loosely packed. As a result, the text looks very big, but also blurry and pixelated. Let's say this hypothetical text is 1"x1".

Then we move up the chain to a 15" 1440x900 screen.
The pixels are more densely packed, and the text is displayed using the same number of pixels. This causes the text to shrink. Now the text is something like 0.8" x 0.8".

On a 2560x1600 15" screen, the pixels would be so dense that the text would appear nearly microscopic. 0.1" x 0.1".


But with resolution independence, the 15" 2560x1600 screen would raise the size of the font automatically to match the desired size. These sizes are defined by the resolution. So if you set the 2560x1600 screen to display text at the size of the 1440x900 screen, instead of the text being 0.1" x 0.1", it would be 0.8" x 0.8".

HOWEVER! The system also raises the font size and scales it correspondingly. So for text that is 10pt, which hypothetically would be 0.1" x 0.1", the computer recognizes this and raises the displayed size to match the physical size on the 1440x900 screen, 0.8" x 0.8", which works out to something like 36pt.

But this would cause discrepancies if the font size actually changed. Imagine editing a 10pt document, then opening it on a 2560x1600 15" MBP and the computer raising the font to 36pt automatically! To avoid this, the computer raises the font size on the DISPLAY only. The actual font size remains 10pt, but the displayed font size is 36pt.

Because the displayed font size is raised, the text appears crisp and clean while retaining the size of 0.8" x 0.8" on a 2560x1600 15" display.

I think I'm understanding this now. You and MikhailT have been a big help. I hate to admit it, but I am most comfortable working at 1440x900 on a 15". I was worried that the retina display would have text like the 17" MBA and I just cannot use something that small. The 1440x900 screen is ok for me on the 13" MBA, and even better on the 15" MBP. The 11" Air, the 15" high res and 17" MBP are really a no go with these tired eyes. I should have listened to my mother, it does make you go blind. :D:D
 
So it would be like comparing three TVs, all the same size: one standard definition, one 720p and one 1080p. All would have the exact same size picture, text, etc., but the sharpness and clarity would get better with each step up. So this new 2880x1800 would have the same size text and on-screen real estate as the current 1440x900?

That's comparing apples to oranges, because that's not how it works with TVs in reality, but if we take your word that the picture/text is the same size on each TV, then yes, that's the idea. It'll be the same size but much sharper, because there is more detail making up each piece of information.

This is a microscopic look at the same text on an iPhone 3G and an iPhone 4 with the Retina display. See how there are more pixels showing the same size text? Look at how sharp it is.
[Image: iphone-microscopy-01.jpg, the same text under a microscope on an iPhone 3G vs. an iPhone 4]
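
And if you want the numbers behind "more pixels, same size": pixel density follows directly from resolution and diagonal. A quick sketch (the 15.4" diagonal for the rumored panel is an assumption):

Code:
    import math

    def ppi(width_px, height_px, diagonal_in):
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(1440, 900, 15.4)))   # current 15" MBP: ~110 PPI
    print(round(ppi(2880, 1800, 15.4)))  # rumored panel: ~220 PPI, same physical size
    print(round(ppi(320, 480, 3.5)))     # iPhone 3G: ~165 PPI
    print(round(ppi(640, 960, 3.5)))     # iPhone 4: ~330 PPI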


----------

1. Using AMOLED would either get rid of the glowing apple or require a separate LED for the apple logo.
2. AMOLED needs to be as bright as the current LCD panels, which it is not.
3. Bright AMOLED displays tend to burn out faster, especially the blue OLEDs.
4. Technologies that solve the problem above have been developed by other companies, mainly Samsung.
5. AMOLED is expensive, TN panels are not.

My bad; I was thinking of OLED, which has several different branches, and because AMOLED is the most popular I used that term instead.

I didn't suggest that Apple would switch to OLED. I'm just saying that there are several technologies on the market that can show brighter pixels without consuming more power. OLED was one of them, and we don't know the full scale of all the research being done, much of which hasn't been shown to us yet.

As for your list of points, all are fixable with the next generation of AMOLED technologies. AMOLED is still young; everything starts out the same way.
 
I thank you both for your patience and explanations. I just had it stuck in my head that any resolution increase would make things smaller. Now the pixel density will be much higher, making things much clearer. I've seen this with the iPhone, but for some reason I never have resolution in mind when thinking about a phone like I do with a monitor.
Again, thank you for helping me understand this better.
 
They won't - they'll look pixelated.

If a graphic is 100x100 pixels on a standard resolution display, it will have to be stretched to 200x200 pixels on a retina display to remain the same size as other elements if Apple is doubling everything.

The high DPI might make the pixelation less noticeable, but website graphics will suffer as they are being stretched beyond their native resolution.
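
A minimal sketch of that stretching with Pillow (the file names here are made up for illustration):

Code:
    from PIL import Image

    # A 100x100 web graphic has to cover 200x200 physical pixels to keep its
    # apparent size at 2x; upscaling can't add detail, so it goes soft.
    icon = Image.open("web_button_100x100.png")   # hypothetical file
    stretched = icon.resize((200, 200))           # roughly what the browser would do
    stretched.save("web_button_200x200.png")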

Shouldn't all the iPhone 4 owners be able to answer that? I would imagine the pictures on websites, even scaled up, look the same or better because the physical size on the screen didn't change. It's like scaling up a picture to double its size in Photoshop and then looking at it while it keeps the same dimensions on the screen. Well, not exactly, but something like that?

In any case, a better screen is always welcome, and it would be one of those features that would make me think about getting a new MBP.
 
The practicality issue is not with the consumers, but with the manufacturers.
Apple would have to pay gigantic premiums for these custom panels to be made, and at a non-standard resolution too.
It's much simpler for Apple to buy 1080p or 1600p 15" panels that are already in production instead of using this Retina Display standard.

Apple tries to skate to where the puck will be and not where it is . . . ;)
 
That's right, this does not double the amount of usable area for you, it's exactly the same area but 2x sharper.

That isn't even close to true.

A better explanation is that there is double the usable space, but all the screen elements are blown up 200% so they appear to be the same relative size as before.
 
https://www.macrumors.com/2011/12/1...lution-retina-display-macbook-pro-in-q2-2012/

In a report sure to excite our readers, DigiTimes is saying that Apple may be readying an ultra high-resolution MacBook Pro for as early as the second quarter (Q2) of 2012. The publication cites supply chain partners as the source for the information, which would double the resolution of the MacBook Pro to 2880x1800 pixels.

Not to piss on anyone's parade here, but am I alone when I say that DigiTimes seems to consistently spew S%@*T about :apple:'s plans?
They guess, like everyone. It's just that since they have been putting out these rumors like 2-3 times a week, of course it's no surprise if one of them has similarities to the product being launched. But that's called :cool:

I for one am getting mad about it because they can't confirm any of these reports with any credible source. They're like the tabloids of Chinese technology manufacturing gossip.
 
1. Using AMOLED would either get rid of the glowing apple or require a separate LED for the apple logo.
2. AMOLED needs to be as bright as the current LCD panels, which it is not.
3. Bright AMOLED displays tend to burn out faster, especially the blue OLEDs.
4. Technologies that solve the problem above have been developed by other companies, mainly Samsung.

Well, not even Samsung's AMOLED screens are free of the problems listed under 1-3. Even their latest screens are known for burn-in.
 
Yes they have.



Let me quote someone who explains perfectly why you're wrong:



See, you're not talking about the GPU. I believe the topic was that "GPU power was not sufficient to push this kind of resolution". I say it is, you say I'm wrong based not on GPUs but on the type of display port they use.

I don't think anything more needs to be said; stick to the topic. GPU power is sufficient. The fact that, until dual-link DVI, HDMI and DP, we didn't have enough bandwidth to get these pixels to a monitor is not related to the discussion of future monitors and GPU power.

You bear your forum nickname well indeed.

DVI, HDMI or DP have nothing to do with the graphics card's rendering controller.
Let me show you some basics:

1. GPU operations: renders the 3D polygons. This is the first bottleneck, since rendering more polygons means more workload.
2. GPU image buffer: flattens the polygons into a 2D image (at 60-75 Hz). Here is the second bottleneck.
3. Output controller to monitor: converts the 2D image into a digital signal for the monitor, which is a grid of a precise number of pixels. Here is the third bottleneck: blurry interpolation of a smaller rendering onto a fixed larger grid if you don't run at the native output.

Increasing the resolution that high slows you down at step 1, is theoretically impossible with current and past hardware at step 2, and renders like poop (think crappy VGA cables) at step 3.
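
For a sense of scale on the 2D side, the raw buffer that has to be scanned out is easy to estimate. A back-of-the-envelope sketch, assuming 32-bit color and 60 Hz:

Code:
    # Rough scan-out cost of a 2880x1800 desktop framebuffer.
    width, height = 2880, 1800
    bytes_per_pixel = 4                       # 32-bit color (assumption)
    refresh_hz = 60

    frame_bytes = width * height * bytes_per_pixel
    print(frame_bytes / 2**20)                # ~19.8 MiB per frame
    print(frame_bytes * refresh_hz / 2**30)   # ~1.2 GiB/s of scan-out traffic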
 
DVI, HDMI or DP have nothing to do with the graphics card's rendering controller.

Removed the insults. Quoting this part because that's basically what I just told you. The output bus and bandwidth to the monitor have nothing to do with the actual processing unit's ability to push out the pixels. It's one bottleneck (which we solved around 2008 with DP on Macs).

The initial premise was about GPU power, not everything around the GPU. My point was that GPU power is here and has been here for a while. You have yet to contradict this.

BTW, I have reported your post for the insults, but feel free to edit it before a mod gets hold of it.
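
To put numbers on the "output bus" side of this, compare the pixel rate that 2880x1800 at 60 Hz needs against the usual link capacities. A sketch that ignores blanking overhead; the link figures are the commonly quoted round numbers, taken here as assumptions:

Code:
    # Pixel throughput needed for 2880x1800 at 60 Hz, ignoring blanking.
    needed = 2880 * 1800 * 60                 # ~311 Mpixels/s

    links = {                                 # approximate max pixel rates (assumed)
        "single-link DVI": 165e6,
        "dual-link DVI": 330e6,
        "DisplayPort 1.1 (24bpp)": 360e6,
    }
    for name, capacity in links.items():
        print(name, "enough" if capacity >= needed else "not enough")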
 
I second this

I thank you both for your patience and explanations. I just had it stuck in my head that any resolution increase would make things smaller. Now the pixel density will be much higher, making things much clearer. I've seen this with the iPhone, but for some reason I never have resolution in mind when thinking about a phone like I do with a monitor.
Again, thank you for helping me understand this better.

Appreciate the detailed responses.
 
Removed the insults. Quoting this part because that's basically what I just told you. The output bus and bandwidth to the monitor have nothing to do with the actual processing unit's ability to push out the pixels. It's one bottleneck (which we solved around 2008 with DP on Macs).

The initial premise was about GPU power, not everything around the GPU. My point was that GPU power is here and has been here for a while. You have yet to contradict this.

BTW, I have reported your post for the insults, but feel free to edit it before a mod gets hold of it.

Where your premise is wrong or dishonest is that you claim "GPU power has been here since ever... for 2D renderings"

And from there I proved to you, with spec numbers, that even now there is not enough GPU buffer bandwidth to push a 2D rendering (originating from the flattened 3D calculations) above 2480*1560 to the VGA/DVI/HDMI controller.
Secondly, your argument is even further flawed by the simple fact that resolution is a bottleneck of 3D processing (read: GPU power).

Your arguments are utterly flawed and idiotic in that you consider GPU power to be simply enough or not enough. That is not the case: for GPU power there is a continuum, namely frame rate, whereas the width of the GPU output buffer is an "it works" versus "it doesn't work" situation.
So unless you enjoy running your games or any 3D software at 10 frames per second as long as it comes out of Apple's ass, please stop insisting that such high resolutions are not, and have never been, a source of limitations.

Btw I am amused that you are still confusing DP/HDMI/VGA/DVI with the GPU buffer...

As for the screen real estate debate:
For a given physical screen size, say 15":
A higher resolution will shrink UI elements (icons, images, anything that exists as a fixed-size file somewhere on the hard drive) if the OS or app has none of these resources available at an equally higher resolution.
That means that if you want to keep icons the same size, for example, your icons have to be twice as big when you use a resolution twice as high; otherwise they'll be displayed at half the size, or upscaled (and thus "pixelated" and ugly). A rough sketch of that resource lookup follows below.
Dynamic UI elements (elements that are rendered by the CPU or GPU through calculations, like fonts, lines or windows) are not subject to this shrinking.
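
The sketch mentioned above: the "@2x" naming is borrowed from how iOS ships double-resolution art and is only an assumption here, not a statement of how OS X would do it.

Code:
    import os

    # Pick a bitmap for the current UI scale; fall back to upscaling
    # (and accept the pixelation) when no high-resolution version exists.
    def pick_icon(base_name, scale):
        if scale == 2:
            hi_res = base_name.replace(".png", "@2x.png")   # hypothetical naming
            if os.path.exists(hi_res):
                return hi_res      # drawn 1:1, stays crisp
        return base_name           # gets stretched 2x and looks soft

    print(pick_icon("trash.png", scale=2))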
 
Where your premise is wrong or dishonest is that you claim "GPU power has been here since ever... for 2D renderings"

It's neither wrong nor dishonest. I have never claimed to be talking about 3D games. I always stated I was talking purely about the frame buffer for a desktop, with compositing effects.

And from there I proved to you, with spec numbers,

Spec numbers that have nothing to do with the actual Graphics Processing Unit, but with all the controllers surrounding it.

that even now there is not enough GPU buffer bandwidth to push a 2D rendering (originating from the flattened 3D calculations) above 2480*1560 to the VGA/DVI/HDMI controller.

Again, not related to the capabilities of the GPU itself. Also, you're confusing rendering to a texture mapped onto polygons with the compositing effects that the desktop uses.

The GPU has plenty of internal bandwidth to push its fill rate to the display controller. You haven't provided any evidence that it doesn't, only your own pure "I don't want to be wrong" conjecture.
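
One way to sanity-check the "GPU power is there" claim for plain desktop compositing is to compare the pixels that must be filled per second against a typical fill rate. The fill-rate figure below is an assumed ballpark for a 2011-era notebook GPU, not a quoted spec:

Code:
    # Pixels to composite per second for a 2880x1800 desktop at 60 Hz,
    # assuming each pixel is touched ~3 times per frame (overdraw, assumption).
    needed = 2880 * 1800 * 60 * 3             # ~0.93 Gpixels/s

    assumed_fill_rate = 5e9                   # ~5 Gpixels/s, assumed ballpark figure
    print(needed / assumed_fill_rate)         # well under the available fill rate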
 
That isn't even close to true.

A better explanation is that there is double the usable space, but all the screen elements are blown up 200% so they appear to be the same relative size as before.

More like 400%; they're blown up 200% in each dimension.

That specific explanation is what's causing people to be confused about what we're talking about. The fact that it is blown up to 400% is going to make people think it'll look bad, because of their experience with changing away from the native resolution and adjusting the DPI.

I'm just explaining what HiDPI mode is and Apple's intention with Retina displays. This is confirmed by the Lion release; there are plans to support HiDPI. Nobody is going to be able to use 2880x1800 at 15" directly: it would be virtually impossible to read anything on the screen, increasing the DPI for the text is not going to help, and decreasing the resolution is going to make everything blurry until you go down to the HiDPI mode.
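
The "400%" is just the two 2x factors multiplying; a quick check:

Code:
    # 2x in width and 2x in height means 4x the pixels for the same screen area.
    standard = 1440 * 900     # 1,296,000 pixels
    hidpi = 2880 * 1800       # 5,184,000 pixels
    print(hidpi / standard)   # 4.0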
 
I think you had it right the first time.

Here's the full concept.
Let's say you started with a 15" 1280x800 screen. Horrible, right?
The pixels are loosely packed. As a result, the text looks very big, but also blurry and pixelated. Let's say this hypothetical text is 1"x1".

Then we move up the chain to a 15" 1440x900 screen.
The pixels are more densely packed, and the text is displayed using the same number of pixels. This causes the text to shrink. Now the text is something like 0.8" x 0.8".

On a 2560x1600 15" screen, the pixels would be so dense that the text would appear nearly microscopic. 0.1" x 0.1".


But with resolution independence, the 15" 2560x1600 screen would raise the size of the font automatically to match the desired size. These sizes are defined by the resolution. So if you set the 2560x1600 screen to display text at the size of the 1440x900 screen, instead of the text being 0.1" x 0.1", it would be 0.8" x 0.8".

HOWEVER! The system also raises the font size and scales it correspondingly. So for text that is 10pt, which hypothetically would be 0.1" x 0.1", the computer recognizes this and raises the displayed size to match the physical size on the 1440x900 screen, 0.8" x 0.8", which works out to something like 36pt.

But this would cause discrepancies if the font size actually changed. Imagine editing a 10pt document, then opening it on a 2560x1600 15" MBP and the computer raising the font to 36pt automatically! To avoid this, the computer raises the font size on the DISPLAY only. The actual font size remains 10pt, but the displayed font size is 36pt.

Because the displayed font size is raised, the text appears crisp and clean while retaining the size of 0.8" x 0.8" on a 2560x1600 15" display.

So in other words, say goodbye to WYSIWYG as we know it...which is not exactly a good thing. :rolleyes:
 
Anyway, there IS room for some innovation in the case of MacBook Pro displays. 1280x800 on a 13" panel is ludicrous these days, even though it's IPS. And some kind of "Retina display" on a MacBook would be a killer feature indeed.
 
Good news at last

For all of us pixel junkies, this is the first bit of good news we've heard in a long time. Finally, someone might be pushing back against the trend of dumbing down computer displays. The industry has convinced most of the public of two things. First, that flat-panel factories only know how to make displays in one aspect ratio, and that it has to be 16:9 because that's high definition. Second, that 1366x768 is the best resolution available because it can be called "high-def." I do have to give them credit for finding a way to give their customers so much less and still convince them that they're getting more. When the orange juice industry started putting 59 ounces into a 64-ounce container, they hoped no one would notice. But when the display manufacturers took away 10 percent of the pixels on their 16:10 displays (and often switched to generally lower resolutions while they were at it), they promoted it as a breakthrough. Who wants an old-school 1920x1200 display when you can have a shiny new high-def 1366x768 display?
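
The 10 percent figure checks out; a quick calculation for the common case of a 1920-wide panel:

Code:
    # Going from 16:10 to 16:9 at the same width drops exactly 10% of the pixels.
    px_16_10 = 1920 * 1200                    # 2,304,000
    px_16_9 = 1920 * 1080                     # 2,073,600
    print(round(1 - px_16_9 / px_16_10, 2))   # 0.1, i.e. 10%

    # And the "high-def" 1366x768 panel so many laptops ship with:
    print(1366 * 768)                         # 1,049,088 pixels, under half of 1920x1200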

I've never really been a Mac fan myself. I'm used to Windows and it does everything I need and I don't have time for a learning curve. I bought one of the last PCs with a 1920x1200 display when I didn't really need one because I expected we were in for a long dry spell in display resolution. I'm hoping that PC manufacturers will see fit to introduce similar high-res displays to keep up. But if Apple comes out with these and the PC makers don't, my next computer is going to be an MBP.

I can only assume that Steve Jobs sent back the specs for these from Heaven.
 