A screenshot will be shown at your screen's PPI. Unless you can mimic the proper PPI of the 15" MBP on your own screen, you won't be able to see it "at full quality" without simply seeing it on the device itself.

I think the important part is the UI element size in comparison to your screen size (which is really all that is changing here). If you posted a file of the screenshot, you could still look at it in photoshop (at a higher zoom level, or zoom out to see the whole thing on your lower res screen) and get the idea. How do you think anyone who takes nice dSLR photos looks at their giant resolution photos?

Question though: could I theoretically play a full-screen game (e.g. Diablo 3) and use the "full" 2880 pixels? Seems like yes; there should be a setting in D3 that will use the full resolution.
 
People complaining that you can't select the full resolution of this display by normal means are really really silly. Seriously.

Apple could release a 15" display with 5000x3000 pixel resolution and people would STILL complain when they didn't allow you to set your resolution to it. It's silly.

Apple has changed the paradigm for how they display the computer's graphical interface by making the GUI scale itself smoothly instead of actually changing the resolution that the display is outputting. In other words, they're putting the resolution changing in software instead of hardware. Resolution no longer matters now that the pixels are small enough that you can't see them. It's no longer about getting a high resolution, it's about how clear the image is. In other words, it's now about making an image so clear that it looks natural. Apple has done a wonderful, amazing thing. And people STILL complain.

They still list the full resolution on the Tech Specs page because it IS still available. It is available to any application that needs it. For instance, games. Any game can use that resolution with no problem. It can display its pixels at 1:1 and have a wonderful picture. But the OS itself doesn't need to utilize that resolution for the exact reasons stated. Things are just too small for most people.

Stop thinking with old testament resolution terms and embrace the future of computer displays.
 
So basically, the current OS is like the opposite of the upscaled DVD players when Blu-ray came out. It has a ton of pixels, but isn't showing the content.
 
So basically, the current OS is like the opposite of the upscaled DVD players when Blu-ray came out. It has a ton of pixels, but isn't showing the content.
No. That's NOTHING like what it is. It's showing MORE content. Your analogy is very incorrect.
 
The resolution "hack" to 2880 is absolutely great for me, as I can sort and adjust in a much higher resolution within Lightroom. I know 90% of the keyboard shortcuts of Lightroom, so the tiny buttons are no problem for me.
 

Attachments

  • 1440.jpg (335.9 KB)
  • 2880.jpg (367.4 KB)
No. That's NOTHING like what it is. It's showing MORE content. Your analogy is very incorrect.
Actually it shows more detail, but not necessarily more content. Most content, even in the near future, will not even approach 2880x1800.
 
People complaining that you can't select the full resolution of this display by normal means are really really silly. Seriously.

Apple could release a 15" display with 5000x3000 pixel resolution and people would STILL complain when they didn't allow you to set your resolution to it. It's silly.

Apple has changed the paradigm for how they display the computer's graphical interface by making the GUI scale itself smoothly instead of actually changing the resolution that the display is outputting. In other words, they're putting the resolution changing in software instead of hardware.

No—what you're describing actually happened many years ago when we switched from CRT monitors to LCD displays. An LCD screen has always had a fixed resolution (it can never change) whereas a CRT could change to a lower resolution. So really, the idea that you can change your resolution in system preferences is a very old concept and one that should have gone out many years before now.

Sadly, this article is only perpetuating these very outdated concepts, adding to people's confusion, and by MacRumors standards really is a very poorly written and misleading piece—as can be seen by all the confusion evident in people's comments. It was an article that never needed to be written, because as far as I can see, there's no logical requirement for anyone to scale down the entire UI by 50%, just because the resolution of the display happens to be 200% that of its predecessor. No one seemed to want such tiny text and UI elements before this display, so why the sudden need for them now?
 
I don't get it. The desktop is normally 1440x900, I guess doubled up? So why not just use a normal 1440x900 screen in it? :confused:

... well, the rMBP screen resolution is ALWAYS 2880x1800, but the OS is scaling the size of fonts, icons, etc., so they look similar in size to those on the former regular 15" screens.

For instance:
If icons appear on the screen with a size of 10 mm x 10 mm, then
on a regular 1440 x 900 15" screen they are made of approximately 45 x 45 pixels.
If you want them to look 10 mm x 10 mm (the same size) on a 2880 x 1800 15" screen, they have to be scaled to 90 x 90 pixels.

Having a higher screen resolution then allows you to put much more detail in the icon artwork, simply because for the same size (10 mm x 10 mm) you have a lot more pixels available. If Apple reworks the icon artwork accordingly (as they did), the icons will look much more detailed and sharper.

What Apple is doing, then, is giving you the choice of displaying big, normal, and small icons, fonts, etc., namely the standard sizes of the former 1440 x 900, 1680 x 1050, and 1920 x 1200 modes.

But don't worry, the screen is always at its max resolution of 2880 x 1800.
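The millimeter-to-pixel arithmetic in that post can be sketched in a few lines of Python. This is only an illustration: the `pixels_for_size` helper is hypothetical, and the PPI values are approximations for a 15.4" panel at 1440x900 (~110 PPI) and 2880x1800 (~220 PPI).

```python
# Sketch of the arithmetic above: how many pixels a 10 mm icon spans
# at each screen's approximate pixel density.

MM_PER_INCH = 25.4

def pixels_for_size(size_mm, ppi):
    """Physical size in mm -> number of pixels at the given density."""
    return round(size_mm / MM_PER_INCH * ppi)

ppi_regular = 110  # approx. density of a 1440x900 15" panel
ppi_retina = 220   # approx. density of the 2880x1800 15" panel

print(pixels_for_size(10, ppi_regular))  # 43 pixels (the post rounds to ~45)
print(pixels_for_size(10, ppi_retina))   # 87 pixels (the post rounds to ~90)
```

Doubling the linear pixel density roughly doubles the pixel count along each edge of an element of the same physical size, which is exactly why Apple ships 2x assets.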
 
Last edited:
People seem to not get that even with the high resolution, it's still just a 15" screen. Sure, you get more screen real estate, but the extra details are so small that it would actually hinder productivity. Even people with good eyesight are only human. The limit isn't the eyesight but the way our eyes work.

That's your opinion though about your capabilities. To me, a 15" 1920x1200 display is perfectly usable with the size of elements. 2880x1800 would be stretching it I agree.
 
Hope they will allow us to use that resolution natively… For some things, it could be very useful, such as Photoshop: you would have so much more space for palettes! And it would not slow down your computer since there is no scaling required.
 
Yes, your example is true. But your "Retina" example looks great on my 11.5 in MacBook Air screen, and no matter how many more pixels are added, it won't look any better. All this Retina stuff is doing is bloating and slowing apps, games, and so forth.

Now if I had a movie-theater-size screen, this whole ridiculously high-resolution Retina idea might make sense. But not on screens as small as your average "TV size" or monitor size. Or whatever people are using for their Macs.

If you can't see the difference, then count yourself lucky. Now that I have an MBP with a Retina screen, I don't see how I could go back to a regular one. I can't believe I used to use computers with 320x200 displays.

On the other hand, I count myself lucky that I never developed a taste for expensive wine. People tell me you have to "acquire" a taste for it. I've probably saved more than enough money over the years of not drinking wine to be able to afford this really cool computer with Retina display.
 
Diablo III

Wait, wait a second here. If you remember back to the WWDC 2012 images of Diablo III running on the Retina MacBook Pro: was it actually running at 1440x900 with UI elements rendered at 2880x1800, or was it running at native Retina resolution?

"Phil Schiller, Apple’s senior vice president of Worldwide Product Marketing, said “You’re going to see a gaming experience with this resolution unlike any you have ever seen before.” This is obviously thanks to the Retina MacBook Pro’s whopping 2,880 x 1,800 pixels that are packed into a 15.4″ screen."
 
Hope they will allow us to use that resolution natively… For some things, it could be very useful, such as Photoshop: you would have so much more space for palettes! And it would not slow down your computer since there is no scaling required.

The screen is always at 2880 x 1800.

The real estate available in the case of Photoshop does not depend on Apple;
it will be for Adobe to decide. If they choose to design a palette at 100 px x 500 px, for instance, there is nothing Apple can do.
In terms of UI, Apple probably only has control (and therefore you have control) over font size (making it smaller or bigger). Every other graphical element of the Photoshop UI is beyond Apple's control.
 
So basically, the current OS is like the opposite of the upscaled DVD players when Blu-ray came out. It has a ton of pixels, but isn't showing the content.

Why is this so hard to understand? If you take the default settings for the retina display and you look at a photo or a movie and view it at 100% (in most viewers there is a setting for displaying it in original size), each pixel of this content will be mapped to a single physical pixel of the retina display.

This content will be shown at the same size and same detail no matter what setting you choose for the scaling.
The only thing that is scaled here is the UI, such as the menu bars, the buttons, etc. The content is displayed as clear and crisp as it would be if you hacked the OS (as described in this article) to also render the UI at the original size.

The problem might just be that some older apps do not support the Retina display because they don't take advantage of the new API.
I assume that the OS will scale such older applications as a whole (content AND GUI). So the benchmark for judging the display quality is the apps that already support the new API (e.g. iPhoto, Aperture, iMovie, etc.). But eventually most other apps will add this support.

But perhaps someone with a new rMBP can tell me what the content (just the displayed photo) in an older Photoshop looks like with the default display settings (Setting: Best for Retina display). Is this content scaled as well?
 
Is the screen still as glossy as the old MBP?

How is the Retina display compared to the old glossy high-res MBP display? I could not get used to the glossy screen (old MBP) because of e.g. the reflecting light and so on...

Is it still as bad? I am used to the antiglare screen on the 15" MBP.

Thanks

/ Kasper
 
The screen is always at 2880 x 1800.

The real estate available in the case of Photoshop does not depend on Apple;
it will be for Adobe to decide. If they choose to design a palette at 100 px x 500 px, for instance, there is nothing Apple can do.
In terms of UI, Apple probably only has control (and therefore you have control) over font size (making it smaller or bigger). Every other graphical element of the Photoshop UI is beyond Apple's control.

More confusion.

The most important thing to remember is that with the Retina display, "pixels" (the dots on the screen) are not the same as "points" (the units that the programmer uses to design the size and position of things). A normal 15" MBP has 1440 x 900 pixels displaying 1440 x 900 points. The Retina MBP has 2880 x 1800 pixels displaying 1440 x 900 points.

All the commands that the programmer uses for drawing things use "points", not "pixels". If a programmer said "put this button 50 away from the bottom of the window, make it 20 high and 100 wide", the numbers mean points, not pixels. So the button will be in exactly the same position with exactly the same size on both Macs. The one on the Retina MBP will just look sharper. Apple user interface elements, all text, all graphics that were higher quality than the screen (like multi-megapixel photos), and vector graphics will be in the exact same position as before, but at higher quality, if the programmer does nothing.

Now Adobe _could_ detect that their program is running on a Retina display, and in that case show smaller UI elements. They _could_ say "we couldn't make this button only 10 units high because nobody could read it, but on a Retina display we _can_ make it that small because it is still readable". And that is of course outside Apple's control. The button will appear with the number of points that Adobe wants it to have. Text will appear in the point size that Adobe wants it to have. Apple has no control over that. What Apple will do is draw the text at the point size Adobe wants, but using twice as many vertical and horizontal pixels to make it look better.
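The points-vs-pixels distinction above can be modeled with a tiny sketch. This is a hypothetical illustration, not Apple's API: `Rect` and `to_device_pixels` are made-up names, and the only assumption is a uniform backing scale factor (1 on a regular MBP, 2 on a Retina MBP).

```python
# Layout is done in points; the backing scale decides how many device
# pixels each point covers. The same point geometry renders at both scales.

from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float  # all fields are in points

def to_device_pixels(rect, scale):
    """Map a rect in points to device pixels for a given backing scale."""
    return Rect(rect.x * scale, rect.y * scale, rect.w * scale, rect.h * scale)

button = Rect(x=50, y=50, w=100, h=20)    # identical layout on both Macs
print(to_device_pixels(button, scale=1))  # regular MBP: 100x20 device pixels
print(to_device_pixels(button, scale=2))  # Retina MBP: 200x40 device pixels
```

The button occupies the same 100x20-point region either way; the Retina screen simply rasterizes it with four times as many pixels.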


Why is this so hard to understand? If you take the default settings for the retina display and you look at a photo or a movie and view it at 100% (in most viewers there is a setting for displaying it in original size), each pixel of this content will be mapped to a single physical pixel of the retina display.

That is absolutely confused.

A photo doesn't have an "original size". It has a number of pixels, and can then be printed or displayed at any size you want. Your photo viewer may have a setting that allows you to show the original pixels of the photo. In that case, if the photo is, let's say, 4000 x 3000 pixels, the photo viewer will use a command that tells the Mac to draw the photo at a size of 4000 x 3000 units.

The units are _points_, not pixels. The photo will be of course massively big so you can only see a tiny part, but you will see exactly the same part on a MBP and a Retina MBP. Since the viewer used a command that will draw one photo pixel = one unit, one photo pixel will be one point = one pixel on the MBP, but it will be one point = 2 x 2 pixel on the Retina MBP.

(You may of course tell your photo editing software that you want to print the photo at 7 x 5 inch, and then the photo editing software may remember this and display the photo at a size of 7 x 5 inch. Again, the photo will have the same size on both Macs, but the Retina MBP will have many more pixels and therefore display the photo at higher quality).
 
Last edited:
So if you can't set it to 2880x1800 directly without the hack, then the specs are false.
Lawyers have sued over much, much less (e.g. the recent fine from the Aussie government).

So, where's the free hack I can get to make the iPhone '4G' work with Australian/European 4G?

Anyway, the resolution of the rMBP is 2880x1800. Even in default mode, any compatible application is displayed at the full resolution, and can take advantage of that to offer smaller text (to fit more on) or show more of your photo at 1:1 pixels.

They haven't restricted the resolution - they've changed the logical pixels-per-inch parameter that (well behaved) applications use to relate pixel sizes to physical size and scale bitmap assets, made the icons bigger so that they're still readable, and added a pixel-doubling fallback mode for non-well-behaved applications that assume a fixed PPI.
 
Last edited:
Okay, people. Here's how it works.

This is an icon:

google_chrome_1.png


An icon such as this is a bitmap, or a picture made up of a bunch of pixels. Yeah, I know. This is condescending, but bear with me here. In this case, our icon is 128x128, or 16,384 individual pixels. We'll say this is the default size of an icon on a 1440x900 desktop.

Now on a Mac, the vast majority of your UI elements are set to a certain size that never changes. Your stoplight icons on the top left of the screen might be 32x32, your little folder icons along the left of Finder 16x16. And so on. If you drop down a couple of steps in resolution, say from 1440x900 to 1280x800, these UI elements will suddenly appear bigger. Why? Because the individual pixels aren't as tightly packed. They're being spread out across what's effectively a larger space.

So on a Retina display MBP, our 128x128 bitmap icon will appear to be a quarter of the size. It's being displayed on a much more densely packed array of pixels. Now, Apple thinks the general desktop size of 1440x900 is the best fit for a 15-inch screen, so what do they do? They look at the resolution of the elements on a desktop that size and quadruple it, so they're the same size on a much higher-res screen.

google_chrome_2.png

This icon is 256x256, or 65,536 individual pixels: quadruple the pixel count. Because this fancy new Retina screen Apple just released has four times the pixels of the previous lower-res 15" MBP, this larger icon will appear to be exactly the same size there as our icon up above.

So the screen has the same effective size as the 1440x900 screen due to the larger UI elements, but is much higher res in actuality.

Geddit?
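The icon-scaling argument above can be checked with a quick sketch. The `apparent_size_mm` helper and the PPI values are assumptions for illustration (~110 PPI for a 1440x900 15" panel, ~220 PPI for the Retina panel); the point is only the ratio.

```python
# A bitmap drawn 1:1 shrinks as pixel density rises, so a 2x asset
# is needed to keep the on-screen size constant.

MM_PER_INCH = 25.4

def apparent_size_mm(pixels, ppi):
    """Edge length in mm of a square bitmap drawn 1:1 at a given density."""
    return pixels / ppi * MM_PER_INCH

# A 128x128 icon drawn pixel-for-pixel:
print(apparent_size_mm(128, 110))  # ~29.6 mm on the regular 15" panel
print(apparent_size_mm(128, 220))  # ~14.8 mm on the Retina panel: half size
print(apparent_size_mm(256, 220))  # ~29.6 mm: the 2x asset restores the size
```

Half the edge length means a quarter of the area, which is exactly the "quarter of the size" effect described for the un-updated icon.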
 
The units are _points_, not pixels. The photo will be of course massively big so you can only see a tiny part, but you will see exactly the same part on a MBP and a Retina MBP. Since the viewer used a command that will draw one photo pixel = one unit, one photo pixel will be one point = one pixel on the MBP, but it will be one point = 2 x 2 pixel on the Retina MBP.

Can someone please confirm/refute this? It would seem crazy for PS/Aperture/etc. to display 2x2-pixel "points" for a photo (unless you're upscaling a tiny image to make it fill more of the screen, but that's not what you do with "massively big" images).
 
Can someone please confirm/refute this? It would seem crazy for PS/Aperture/etc. to display 2x2-pixel "points" for a photo (unless you're upscaling a tiny image to make it fill more of the screen, but that's not what you do with "massively big" images).

If you display an image at that scale, you do it because you _want_ to see the individual pixels of the image. And the whole point of a Retina display is that the pixels on the screen are so tiny that you _can't_ see them. So if you tell your software "I want to see the individual pixels of my photo" then displaying one photo pixel as one pixel (0.5 x 0.5 points) on a Retina display completely defeats the purpose.

So it only seems crazy to you because you didn't think it through.
 
Can someone please confirm/refute this? It would seem crazy for PS/Aperture/etc. to display 2x2-pixel "points" for a photo (unless you're upscaling a tiny image to make it fill more of the screen, but that's not what you do with "massively big" images).

It's as he says it is:

https://developer.apple.com/library...ple_ref/doc/uid/TP30001066-CH202-CJBBAEEC

Because different devices have different underlying imaging capabilities, the locations and sizes of graphics must be defined in a device-independent manner. For example, a screen display device might be capable of displaying no more than 96 pixels per inch, while a printer might be capable of displaying 300 pixels per inch. If you define the coordinate system at the device level (in this example, either 96 pixels or 300 pixels), objects drawn in that space cannot be reproduced on other devices without visible distortion. They will appear too large or too small.

Quartz accomplishes device independence with a separate coordinate system—user space—mapping it to the coordinate system of the output device—device space—using the current transformation matrix, or CTM. A matrix is a mathematical construct used to efficiently describe a set of related equations. The current transformation matrix is a particular type of matrix called an affine transform, which maps points from one coordinate space to another by applying translation, rotation, and scaling operations (calculations that move, rotate, and resize a coordinate system).
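The CTM idea in that Quartz excerpt can be sketched minimally in Python. This models only the scaling part of the affine transform (no translation or rotation), and `make_scale_ctm` is a made-up name for illustration.

```python
# User space (points) -> device space (pixels) via a scale-only transform.
# A full CTM is a 2D affine matrix; here we keep just the diagonal terms.

def make_scale_ctm(sx, sy):
    """Return a function mapping a user-space point to device space."""
    def apply(x, y):
        return (x * sx, y * sy)
    return apply

ctm_regular = make_scale_ctm(1, 1)  # 1 point -> 1 pixel
ctm_retina = make_scale_ctm(2, 2)   # 1 point -> a 2x2 block of pixels

print(ctm_regular(720, 450))  # (720, 450)
print(ctm_retina(720, 450))   # (1440, 900)
```

The same user-space drawing commands land on different device pixels depending on the CTM, which is how Quartz keeps applications device-independent.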
 
The screen is always at 2880 x 1800.

The real estate available in the case of Photoshop does not depend on Apple;
it will be for Adobe to decide. If they choose to design a palette at 100 px x 500 px, for instance, there is nothing Apple can do.
In terms of UI, Apple probably only has control (and therefore you have control) over font size (making it smaller or bigger). Every other graphical element of the Photoshop UI is beyond Apple's control.

Yes, but the article is talking about the ability to achieve a "traditional" 2880 x 1800 resolution, without scaling graphics to increase size and detail. This allows applications which have NOT been updated for Retina graphics to make use of every single pixel on the screen, albeit at the cost of UI elements being tiny.

This is not a case of it being Adobe's or Apple's decision, it's just the way things change with resolution. If Photoshop does NOT run in "HiDPI" or "Retina" mode, it is by default being pixel-doubled. The hack in this article is talking about how you can disable pixel doubling, basically, and gain a huge amount of space at the cost of a tiny UI.

The problem is that you have to choose between tiny UI elements (hard to see and click, but more space) and normal UI elements (easy to click and see, and more detail, but same amount of space as non-retina screens).
 
That is absolutely confused.

A photo doesn't have an "original size". It has a number of pixels, and then can be printed or displayed at any size you want. Your photo viewer may have a setting that allows you to show the original pixels of the photo. In that case, if the photo is lets say 4000 x 3000 pixels, the photo viewer will use a command that tells the Mac to draw the photo at a size of 4000 x 3000 units.

Well that might even be more confusing.
By "original size" of course I meant the setting that renders one pixel of the photo (as captured by one pixel of your camera's sensor) to one physical pixel on your display. "Original size" might be confusing, but I'm not the one who gave it this name; it's just there, even in Apple Preview. Just have a look yourself.

The units are _points_, not pixels. The photo will be of course massively big so you can only see a tiny part, but you will see exactly the same part on a MBP and a Retina MBP.

I have never heard of this interpretation. The unit points originated from letter printing and is still used for describing font sizes. It is an absolute length unit: one point corresponds to 352.778 μm.
As programmers we talk about pixels of a 2D bitmap. And if we are processing volume data (e.g. of an MRI scan) we talk about voxels. And a photo consists of pixels as much as a movie does.
There might be some intermediate representations involved in the process of scaling and mapping those virtual pixels to physical pixels, but that doesn't prohibit using the term pixels in the context of photos.

Since the viewer used a command that will draw one photo pixel = one unit, one photo pixel will be one point = one pixel on the MBP, but it will be one point = 2 x 2 pixel on the Retina MBP.

If this was the case, and one point (to use your terminology) was drawn onto 2x2 pixels on the Retina MBP, there wouldn't be any improvement of quality.
This method is only applied to not yet supported applications (as for the previous versions of Chrome), in order to display old bad-res icons at a more suitable size.
My point was: regardless of which display scaling you select in the display settings, the important content (photos and videos) make full use of the retina display's physical capabilities, provided they have an adequate resolution. So there is no reason for worrying that Apple would prevent you from using the native resolution - you are always using it. All you do with this "hack", is fiddling with the rendering size of UI elements.
 
I have never heard of this interpretation. The unit points originated from letter printing and is still used for describing font sizes. It is an absolute length unit: one point corresponds to 352.778 μm.
As programmers we talk about pixels of a 2D bitmap.

As OS X/iOS programmers, we actually use CGPoint, which is a struct made up of two CGFloats representing X and Y values. This is a device-independent mechanism that is later mapped to the device-dependent coordinate system by applying the drawing context's transformation matrix for that device.

I.e., the Retina display has a different CTM than a normal display; you as a programmer don't have to know about this. Your software will work with the same 1440x900 coordinates that it does on a non-Retina display.

Read the documentation about Quartz I posted earlier, all will be made clear. Gnasher has it right.

If this was the case, and one point (to use your terminology) was drawn onto 2x2 pixels on the Retina MBP, there wouldn't be any improvement of quality.

That's false, since most of your drawing is not pixel-based under Quartz or the higher-level frameworks built on top of it. You issue drawing commands and the framework does the required scaling. The framework knows about HiDPI mode and the scaling factor of the device (you can too; in fact, with the iOS SDK it's easy to test: use the [UIScreen mainScreen] class method to obtain a handle to the device-dependent screen object, test for the selector [mainScreen scale], and if it works, check its value. If it is 2, you're running on a Retina display).
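The scale-factor check described above can be sketched as a hedged Python analogue (the real code queries [UIScreen mainScreen] in Objective-C; `MainScreen` and `backing_scale` here are hypothetical stand-ins).

```python
# Analogue of "test for the selector, and if it works, check its value":
# screens that don't report a scale are treated as regular (1.0) displays.

class MainScreen:
    """Stand-in for a screen object; Retina screens report scale == 2."""
    def __init__(self, scale=None):
        if scale is not None:
            self.scale = scale

def backing_scale(screen):
    # Fall back to 1.0 when the screen object exposes no scale attribute,
    # mirroring the selector test on older SDKs.
    return getattr(screen, "scale", 1.0)

print(backing_scale(MainScreen()))         # 1.0: no scale reported
print(backing_scale(MainScreen(scale=2)))  # 2: a Retina-class display
```

The framework multiplies point-based drawing commands by this factor before rasterizing, which is why well-behaved code never needs to touch pixels directly.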

So while you may be a programmer (and a good one at that), you've obviously not worked with OS X or iOS, or not in depth enough with their frameworks to work on a pixel level, or anything beyond Interface Builder.
 