I think an April release would be better than waiting until June, but I hope they're not going to make everything into an AirBook. An AirBook doesn't have the functionality of a Pro!

If they include Thunderbolt, it won't make any difference. You'd be able to add any 'Pro' features you want: DVD, Blu-ray, 27" monitor, hard disks, raid, etc.
 
I wonder if Apple makes these little "mistakes" on purpose. Do they want us to discover it? Or is it a genuine mistake to already include the high resolution images?

They must know by now that some people on this planet will try and unveil every secret they are trying to keep... :rolleyes:

I’d say it’s “accidentally on purpose” :)

I think they don’t make it conspicuous, but don’t go through extra effort to hide things. So it winds up being found, gets all sort of press/hype/discussion, and Apple doesn’t look like they’re promoting vapor-ware.
 
Nice.

I can't wait to see Retina displays on Apple computers. :)

(there's a reason I didn't say "Macs"... what will they be renamed as?)

How about iMAC, MACbook, MACmini, MAC pro.

What are you talking about?
 
Just looking forward to a larger-than-30" Cinema Display with a matte screen. Oh boy, I love those displays.
 
Yeah, you're getting it:

When an optimized app (one that includes "@2x" images) runs on a retina display, everything looks very sharp.

If an unoptimized app runs on a retina display OR an optimized app runs on a non-retina display it is not so sharp. BUT... it looks no worse than before there were retina displays.

And actually, even an unoptimized app often gets benefits from running on a retina display. That's because stuff that the OS draws, like text, will take advantage of the display.

You're right to be concerned about the quality of unoptimized apps running on a retina display: unoptimized images have to be upscaled to take up the same amount of physical space on the screen. In general, upscaling can make an image look worse. But Apple thought of that. By making retina displays exactly double the resolution in each direction, the upscaling needed is exactly 2 x 2. In that special case upscaling looks very good, because each pixel in the image maps exactly to 2x2 pixels on the screen.

But I think the question is: if you open an 800x600 picture in Preview on a retina display, would OS X try to make it take up the "same space" as on a not-so-retina display, or would it show it pixel-by-pixel (at which point it would look smaller)?
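The 2x2 upscaling described above can be sketched in a few lines. This is a toy nearest-neighbour doubling to show why an integer 2x factor stays clean -- each source pixel becomes an exact 2x2 block, with no fractional resampling -- not Apple's actual scaler:

```python
# Toy sketch of 2x "pixel doubling": each source pixel maps to an
# exact 2x2 block of physical pixels, so no blurry fractional
# resampling is ever needed.

def upscale_2x(image):
    """image is a list of rows; each row is a list of pixel values."""
    out = []
    for row in image:
        doubled_row = []
        for px in row:
            doubled_row.extend([px, px])   # duplicate horizontally
        out.append(doubled_row)
        out.append(list(doubled_row))      # duplicate vertically
    return out

source = [[1, 2],
          [3, 4]]
print(upscale_2x(source))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```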
 
How about iMAC, MACbook, MACmini, MAC pro.

What are you talking about?

That poster is probably alluding to the idea that Apple might brand future computers as something other than "Macs". This idea got a lot of play yesterday when it became clear that Apple has dropped the Mac branding from OS X.

I guess the marketing idea behind it would be: this computer is so much different and better that you can't even compare it to what came before. Like all marketing, that's hyperbole, but I guess you could at least make that argument for computers with retina displays. Throw in multi-touch screens (and the OS updates to support them really well -- this was always the fatal flaw with Windows tablets, IMHO) and I'll buy it.
 
Great

It doesn't make any difference if you can't keep the app open to use it. Since loading it, all it does is crash constantly.
 
But I think the question is: if you open an 800x600 picture in Preview on a retina display, would OS X try to make it take up the "same space" as on a not-so-retina display, or would it show it pixel-by-pixel (at which point it would look smaller)?

No, I believe it would take up the same space.

(Of course Preview lets you scale the image view up and down, so you could set it to display at "half size" and then the image would be small and sharp. But when you tell it to display at normal size, the 800 x 600 picture will be displayed over 1600 x 1200 display pixels.)

One way to think of it is this: If you have a 2880 x 1800 retina display you can think of it as a 1440 x 900 display where each pixel is made up of 2 x 2 "sub-pixels". Apps are all still working in terms of 1440 x 900, but when drawing the OS (and optimized apps) can use the "sub-pixels" to make things appear sharper.
 
This is enough proof for me. Retina macs for 2012.

The 2x modification - maybe.

The larger icons have been in the OS for years. You need these larger icons when you turn on magnification in your dock.
 
Only thing missing than to my happiness is a version of iBooks on my Mac (yes, I think there is a use for at least some books on the computer).

Hi
You can read from iBooks on your MBP! download Adobe Digital Editions and install. Go to iTunes, highlight the book that you want to read, click on it and click 'open in finder'. When the small box opens, highlight it again, and click on 'open with 'Adobe Digital Editions'. Hey presto, you can read your book,[/QUOTE]

Thank you!
 
No, I believe it would take up the same space.

(Of course Preview lets you scale the image view up and down, so you could set it to display at "half size" and then the image would be small and sharp. But when you tell it to display at normal size, the 800 x 600 picture will be displayed over 1600 x 1200 display pixels.)

One way to think of it is this: If you have a 2880 x 1800 retina display you can think of it as a 1440 x 900 display where each pixel is made up of 2 x 2 "sub-pixels". Apps are all still working in terms of 1440 x 900, but when drawing the OS (and optimized apps) can use the "sub-pixels" to make things appear sharper.

Yep, this seems to be a nice explanation…

I was wondering that maybe Preview (and OS X apps in general, like Safari and other 3rd-party apps) could just honour DPI data in images, much like it does already with ColorSync profiles…

So if you opened an image in Preview originally created with (or processed in) Photoshop and set to 72 DPI, it would be rendered pixel-doubled, pretty much like unoptimized apps are rendered on Retina-display devices, whereas an image set to 144 DPI would, accordingly, be rendered at 1:1 scale. Is that what you're getting at?

Also, it would be very logical if, for plain-vanilla photos imported directly from cameras, smartphones or memory cards, the OS decided on how to render them based on their EXIF data, specifically the camera sensor resolution (I'm guessing that for most modern dedicated point-and-shoot cameras and smartphone cameras the high-DPI mode would be the default anyway, but that could be a useful fallback for older photos).
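The DPI-honouring policy speculated about above boils down to simple arithmetic. Here's a hypothetical sketch of such a rule -- the function, the constant, and the policy itself are all made up for illustration, not anything OS X is known to do:

```python
# Hypothetical DPI-based render policy: choose how many physical
# pixels to spend per image pixel from the image's embedded DPI
# metadata, roughly the way ColorSync picks a transform from an
# embedded colour profile.

BASE_DPI = 72  # the classic 1x assumption on the Mac

def render_scale(image_dpi, display_scale=2):
    """Physical pixels per image pixel (never less than 1)."""
    return max(1, round(display_scale * BASE_DPI / image_dpi))

print(render_scale(72))    # 2 -> pixel-doubled, like an unoptimized app
print(render_scale(144))   # 1 -> shown 1:1 on the retina panel
```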
 
Yep, this seems to be a nice explanation…

I was wondering that maybe Preview (and OS X apps in general, like Safari and other 3rd-party apps) could just honour DPI data in images, much like it does already with ColorSync profiles…

So if you opened an image in Preview originally created with (or processed in) Photoshop and set to 72 DPI, it would be rendered pixel-doubled, pretty much like unoptimized apps are rendered on Retina-display devices, whereas an image set to 144 DPI would, accordingly, be rendered at 1:1 scale. Is that what you're getting at?

Also, it would be very logical if, for plain-vanilla photos imported directly from cameras, smartphones or memory cards, the OS decided on how to render them based on their EXIF data, specifically the camera sensor resolution (I'm guessing that for most modern dedicated point-and-shoot cameras and smartphone cameras the high-DPI mode would be the default anyway, but that could be a useful fallback for older photos).

But then the I in DPI would have to mean two things that until now have been distinct: dots per inch on paper versus dots per inch on screen. Right?
 
Mountain Lion must not have many new features if Apple has to resort to cheap tricks like this to entice people to upgrade. :rolleyes:
 
I do understand. I guess I was thinking more about how incredibly small some pictures that now seem fine would look. Your suspicion does tell me how difficult a time you had understanding the difference between pixel density and screen resolution. Some offense.

They will do the same thing apps do on the iPhone 4/4S: they will be pixel-doubled. They won't look smaller, they will just look pixelated.
 
I think Apple won't release Retina display Macbooks until they can produce them for the same prices as current Macbooks...

Apple's pricing strategy has changed over the years. Of course, you can build an expensive MacBook Pro with a 512GB SSD, but I think that if there is a Retina MacBook in the near future, the ultra-high-resolution display won't be just an option. So it shouldn't be expensive.

But I'm very glad that Apple cares about screen resolution, because it seems that other companies don't...
 
My goodness, the screens on these new MBPs are going to blow my mind. Goodbye, jagged graphics.

However, I'm concerned about the GPUs that Apple uses -- whether they would even be able to show a cursor on a high-res screen! ;)
 
My question is: with retina displays on computers, is there any reason to go to a higher definition than that? Isn't it called "retina" because the eye can't perceive the pixels? Will there be higher res than that? If so, what is the benefit of a ridiculously huge number of pixels if we can't see them?
 
What's the likelihood of this happening to the 27" iMac line? Is that even possible? I was under the impression that 4k displays still cost thousands to produce... but damn, a double resolution 27" would be godlike...
 
If they include Thunderbolt, it won't make any difference. You'd be able to add any 'Pro' features you want: DVD, Blu-ray, 27" monitor, hard disks, raid, etc.

It would make a very significant difference. Currently, to use every feature I have on the go, I have to carry exactly two items: the computer, and the power adapter. If they were to make the huge mistake that has been suggested, I'd have to add at least one peripheral for each feature I want to bring back up to acceptable levels, plus cords.

No, I believe it would take up the same space.

(Of course Preview lets you scale the image view up and down, so you could set it to display at "half size" and then the image would be small and sharp. But when you tell it to display at normal size, the 800 x 600 picture will be displayed over 1600 x 1200 display pixels.)

One way to think of it is this: If you have a 2880 x 1800 retina display you can think of it as a 1440 x 900 display where each pixel is made up of 2 x 2 "sub-pixels". Apps are all still working in terms of 1440 x 900, but when drawing the OS (and optimized apps) can use the "sub-pixels" to make things appear sharper.

If I were to open, for instance, a 1080p video on a 1440x900 HiDPI screen, would I be able to view it at full resolution? Would I have to tell the player to display it at half size to fit it on the screen?
 
There's some well thought-out speculation on here, but I'm still not sure how a retina display would cope with images, such as jpg photos.

The consensus seems to be that if I had a small photo, say 320x200, that currently filled up a quarter of my screen, the pixels would automatically be 'doubled' so the image would still fill up a quarter of the screen and effectively would look exactly the same, as the OS layer displaying the photo would still address the screen in a kind of 2x2-pixels-equals-1-pixel 'non-retina' mode. That I can go along with.

However, what then happens if my photo is larger than the 'non-retina' size of the screen? If I reduce it to fit on screen, it's still going to be displayed in the same 2x2 mode, where it would now be better to show it in retina mode.

I realise that imaging software could be modified to show smaller images using 2x2 pixels and larger images in retina mode. However, this is the sort of thing that should really be handled at OS level, but it seems too 'messy' a solution for that.

In other words, when images are displayed at 100%, do we really think the OS will choose whether to use retina resolution or not depending on the size of the image?
 
If I were to open, for instance, a 1080p video on a 1440x900 HiDPI screen, would I be able to view it at full resolution? Would I have to tell the player to display it at half size to fit it on the screen?
Yes, to the first question. (Well, it's possible an app that isn't HiDPI-aware might inadvertently prevent the OS from taking advantage of the retina display -- e.g., by using non-HiDPI offscreen buffers or doing its own pixel-level rendering. My guess is that apps using QuickTime for playback with Apple-supplied codecs will definitely take advantage of retina displays automatically.)

In regard to your second question: first, I think most players will fit the video to the screen when first opening the movie. From there you can set it the way you want: e.g., "actual size", "full screen", etc.

In an app that is not retina-display aware, menu items might be named confusingly. So when a "half size" option is used on a retina display, the video would actually display at full resolution (exactly one physical screen pixel for each source pixel), while the "100%" option would actually display each source pixel over 2x2 physical screen pixels. For retina-aware apps, I'd expect them to avoid confusing labels like that, though I'm not sure exactly how. Apple will hopefully set a good standard for that in their QuickTime player.
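The label confusion described above is just a factor-of-two mismatch, which a couple of lines make concrete. This is a sketch of the arithmetic, not of any real player's behaviour:

```python
# In a non-retina-aware player, zoom labels are in logical pixels, so
# on a 2x display the real source-pixel-to-screen-pixel ratio is the
# labelled zoom times the display scale.

def screen_pixels_per_source_pixel(zoom_label, display_scale=2):
    """zoom_label: 1.0 for "100%", 0.5 for "half size", etc."""
    return zoom_label * display_scale

# "Half size" on a retina display is actually a 1:1 pixel mapping:
print(screen_pixels_per_source_pixel(0.5))  # 1.0
# "100%" spends a 2x2 block of screen pixels per source pixel:
print(screen_pixels_per_source_pixel(1.0))  # 2.0
```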


There's some well thought-out speculation on here, but I'm still not sure how a retina display would cope with images, such as jpg photos.

The consensus seems to be that if I had a small photo, say 320x200, that currently filled up a quarter of my screen, the pixels would automatically be 'doubled' so the image would still fill up a quarter of the screen and effectively would look exactly the same, as the OS layer displaying the photo would still address the screen in a kind of 2x2-pixels-equals-1-pixel 'non-retina' mode. That I can go along with.

However, what then happens if my photo is larger than the 'non-retina' size of the screen? If I reduce it to fit on screen, it's still going to be displayed in the same 2x2 mode, where it would now be better to show it in retina mode.

I realise that imaging software could be modified to show smaller images using 2x2 pixels and larger images in retina mode. However, this is the sort of thing that should really be handled at OS level, but it seems too 'messy' a solution for that.

In other words, when images are displayed at 100%, do we really think the OS will choose whether to use retina resolution or not depending on the size of the image?

I think it's going to be simpler than that. Generally, apps aren't going to handle images at a high level any differently than they do now. It's just that when an image is actually rendered, it will use all available physical pixels to do so.

I think when an app runs on a 2880 x 1800 retina display, for example it will still think of it as a 1440 x 900 display and behave accordingly. Only low-level OS rendering routines will actually work with the physical "sub-pixels".

So for example, suppose you open a 4000 x 3000 photo on two different macs:
(a) has a 1440 x 900 13" display
(b) has a 2880 x 1800 13" display.
The app thinks of both displays as 1440 x 900.
For both it calls an OS API to create a window of size, say, 1400 x 1050 to display the image. But on the retina display the OS knows to actually create a window covering 2800 x 2100 physical pixels.
The app will then call an OS API to draw the 4000 x 3000 pixel image into the window. On the non-retina display the OS will scale the image down to 1400 x 1050 pixels. But on the retina display the OS will scale the image to the 2800 x 2100 physical pixels. Since all those pixels take up the same amount of space on the 13" retina display as just 1400 x 1050 pixels take on the non-retina 13" screen, the image looks a lot sharper on the retina display. Meanwhile, the app doesn't really need to do anything special to take advantage of the retina display. The OS is handling it automatically at the lower level.

Of course, as you mention at the start of your post, this is speculation. But it makes sense, because it's the best way to ensure as many apps as possible take advantage of retina displays with the least amount of effort. Most apps will mainly need to add high-DPI artwork for their own icons and other built-in bitmap graphics.

edit: sorry, went on really long. I think this is the kind of thing that may be confusing to talk about, but will instantly make sense once you start using it.
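The window-and-drawing flow walked through above fits in a short sketch. Everything here -- class names, the "API", the scale handling -- is invented for illustration; it just mirrors the speculation that the app works in logical coordinates while the OS silently allocates a bigger backing store:

```python
# Sketch of the speculated flow: the app asks for a window in logical
# coordinates, and on a retina display the OS backs it with twice as
# many physical pixels in each direction. The app never sees them.

class Display:
    def __init__(self, scale):
        self.scale = scale  # 1 = conventional, 2 = retina

class Window:
    def __init__(self, logical_w, logical_h, display):
        self.logical = (logical_w, logical_h)
        # The OS allocates the backing store at full physical size.
        self.physical = (logical_w * display.scale,
                         logical_h * display.scale)

    def draw_image(self, img_w, img_h):
        # The image is scaled to the *physical* pixel count, so the
        # retina window gets 4x as many pixels of detail for free.
        return self.physical

# The app asks for the same 1400 x 1050 window on both machines
# and draws the same 4000 x 3000 photo into it:
normal = Window(1400, 1050, Display(scale=1))
retina = Window(1400, 1050, Display(scale=2))
print(normal.draw_image(4000, 3000))  # (1400, 1050)
print(retina.draw_image(4000, 3000))  # (2800, 2100)
```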
 