Well, the human eye has a big depth of field compared to cameras on certain settings, so literally "focusing" on just one tree is usually impossible. Now, if you're talking about discerning details, I'm not sure whether or not the human eye is good at it. I've only had 20/20 vision with glasses for small portions of my life.

Actually, it's what the eye is best at. There's a small region in the very center of your eye, a super-dense collection of cones called the fovea, that's made for picking out even the smallest differentiation in shapes and colors within a centralized point. Even without perfect 20/20 vision, your fovea is still doing its job. You just have to be closer up to something to...for lack of a better word...detail scan it.

It's what makes us capable of reading. It's what made it possible for our great ape ancestors to pick out individual branches to swing on with their long monkey arms. It's all about picking out small details in your environment.
 
Oh, I thought we were comparing to other types of lenses (organic or mechanical). What I was saying about the focus is that you cannot focus on just one tree out of a lot. Your eyes have a big depth of field which will put all of the trees into focus. A Nikon D80, for example, could have a very low F-stop and focus on just one item out of 5 that are barely spaced apart.

The fact that we have 2 eyes probably does make us good at directing our attention at one item since we can aim both at it. So I guess our "attention" is detailed. Trying to look at what is in front of you as one big image is difficult. This is all from experience, by the way, not from actual research.
 
Of course, very few people will ever need to manually set the resolution, and those few people will only need to do it a couple of times. But that one time when you need to change it and you can't will be an extreme annoyance.

If enough people want to do it (that is, scale everything to a new virtual display resolution) there's the incentive for a third-party app… or you could do a bit of research to find out how Apple's new settings actually translate to that model of thinking.

Before the Retina display can we agree that things generally looked pretty crappy if you tried to change the resolution? We were visiting relatives one time, and I couldn't help but notice that they had their iMac set to a non-native resolution. I noticed straight away, because to my eye it looked terrible. But they were obviously okay with it, and had been using it like that for some time. It bothered me because I knew the display could look so much better! Maybe I have a little bit of Steve Jobs inside me. I think this illustrates precisely the kind of scenario that would have caused Steve to lose sleep at night—the idea that his beautiful user-interface with all those painstakingly handcrafted pixels might be unwittingly corrupted by the novice-user. In a way, it's a wonder OS X allowed you to switch to a non-native resolution at all, given how awful it always looked, and given Steve's desire to control every nuance of the finished product!

I haven't tried the new Retina display with one of the alternate settings, but I imagine it looks a lot better. Also, the preferences panel now more accurately reflects the fact that you're scaling graphics, not changing the overall resolution of the display. So all in all, I'd say it's a big improvement.
 
Oh, I thought we were comparing to other types of lenses (organic or mechanical). What I was saying about the focus is that you cannot focus on just one tree out of a lot. Your eyes have a big depth of field which will put all of the trees into focus.

That's just it, you really don't, at least not at a super detailed resolution. You think they do because you're looking at your computer screen, seeing all the stuff around it, and thinking "well, I can see the screen and everything around it. We have a wide depth of field". But...not true. Want proof? Take the sentence below and focus on the word "seashells". Try to read the words to the left and right of it without moving your eyes.

....................................She sells lots of seashells by the sea shore..................................

You can see every word in the sentence when focusing on seashells. Might even be able to read the words at the beginning and end. But it won't be clear. Nope. Your focus is dead on in the center of your vision.

Looking at a forest, you wouldn't have any other choice but to look at a single tree. You just don't notice it because your eyes are moving around constantly, and your brain does a goodly bit of interpolating behind the scenes. Your field of view becomes more and more generalized the farther out from the fovea it goes.

A camera lens is more generic. It picks up everything in about equal focus unless you're playing around with the aperture.
 

By "focus" I was talking about which objects you see as sharp as opposed to blurred. I should have been more specific. The trees that are close together will all be sharp (the lens focusing on them).

But you are right about what we can focus on with our minds, and the sentence you put demonstrates it. For a brain with so many cores, we are pretty bad at focusing on more than one thing at once.
 
And in this instance, your mind and your eye are about the same thing (real zen like). You give something a casual glance, and you're only gonna get the basic shape because you're not giving it much attention. Do a thousand yard stare at it for a minute or so, study it in detail, and you'll pick up every dip, doodle, and dot on the thing.

So our eyes are better at far more than picking up just basic shapes, which proves the original guy wrong. The eye sees the individual trees in a forest, regardless of whether you're aware of it or not.
 
Not my opinion. Our eyes are not made to distinguish one tree from a forest of trees. They're made to recognize shapes and forms, not one element within the whole picture. That's why the whole idea behind retina, based on our eyes resolving one pixel, is hogwash. Yeah, I agree 1920x1200 would be fine, but the retina display native is 2880x1800, a resolution that is almost impossible to use on a 15" screen comfortably.

Retina and individual pixels have nothing to do with running a scaling factor of 1:1 on a 2880x1800 display. It's about screen real-estate at that point. Comfortable or not is your opinion. Don't spread it as fact. Speak only for yourself.

----------

My eyes without glasses will make everything look like impressionist art unless I am looking at something very close. I'm pretty sure everyone in my extended family has glasses.

Wear your glasses. I have perfectly good vision once I put on my glasses and contacts. Are you saying companies should be doing product design around the fact that some people are too "proud" to wear corrective lenses? :rolleyes:
 
Ah, okay. There are certainly some valid reasons you might want to output to a different resolution, particularly when output is destined for a different kind of display. Perhaps it would be better though for these scenarios to be handled separately from the settings which define the physical size of elements. It has me thinking… I have my MBP hooked up to a 30 inch display. I wonder how the Retina MBP would handle that. I'm assuming it would handle it with all the grace and refinement I expect from Apple, recognising that the 30 inch display has half the resolution, and I would still expect UI elements to look the same physical size.

Here's how it works: If you take the Retina MBP with the "best for retina" settings, it will see its own display as a "1440 x 900 retina" display, and the 30 inch display as a "2560 x 1600 non-retina" display. So there are 1440 x 900 retina points, made of 2x2 pixels, and 2560 x 1600 non-retina points, made of 1x1 pixels. The size of user interface elements will be the same number of points on both screens. If you look at the exact number of points and the screen size in inches, you'll see the points on the retina display are maybe ten percent smaller than on the 30", but the same as on a 27" 2560x1600 display.
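To sanity-check that last claim, here's a quick back-of-the-envelope sketch (my own arithmetic, not anything Apple publishes; the 15.4", 30" and 27" diagonals are the assumed panel sizes):

```swift
import Foundation

// Physical density of UI points: diagonal in points divided by diagonal in inches.
func pointsPerInch(_ w: Double, _ h: Double, diagonal: Double) -> Double {
    return (w * w + h * h).squareRoot() / diagonal
}

let retina15  = pointsPerInch(1440, 900,  diagonal: 15.4) // ~110 points per inch
let cinema30  = pointsPerInch(2560, 1600, diagonal: 30.0) // ~101 points per inch
let display27 = pointsPerInch(2560, 1600, diagonal: 27.0) // ~112 points per inch

// More points per inch means physically smaller points.
print(String(format: "30\" points are %.0f%% bigger than retina points", (retina15 / cinema30 - 1) * 100))   // ~10%
print(String(format: "27\" points differ from retina points by %.1f%%", (retina15 / display27 - 1) * 100))  // ~ -1.4%, basically the same
```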


Where in Windows does it let you set your display to 2880x1800?

That's most likely automatic (I haven't tried it). MacOS X knows "this display has 2880x1800 pixels, but you don't want to use it as a 2880x1800 point display, but a 1440x900 retina display, so we don't let the user choose 2880x1800 pixels and destroy their eyes". Windows doesn't know that; it sees 2880x1800 pixels, so it lets you set it to 2880x1800.

The same would most likely happen if you could buy an external display at that resolution and attach it to any Macintosh.
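And for what it's worth, this is roughly how an app sees the points/pixels split; a minimal sketch using public AppKit calls (the API is real, but I haven't run this against a 2880x1800 external panel, so the comments about what it prints are my assumption):

```swift
import AppKit

for screen in NSScreen.screens {
    let points = screen.frame.size                               // size in points
    let pixels = screen.convertRectToBacking(screen.frame).size  // size in device pixels
    // Expected on the Retina MBP panel in "best for retina" mode:
    // 1440x900 points, 2880x1800 pixels, scale 2.0.
    // Expected on a 30" Cinema Display: 2560x1600 both ways, scale 1.0.
    print("\(Int(points.width))x\(Int(points.height)) pt -> " +
          "\(Int(pixels.width))x\(Int(pixels.height)) px, " +
          "scale \(screen.backingScaleFactor)")
}
```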
 
Quite the opposite. The most detailed portion of our vision is a rather small space at the center of our field of vision, roughly the size of the moon in the sky. Our eyes are incredibly good at separating and focusing on small details within a large amount of visual noise.
Take a look at a simple object, a square, or a circle, or a triangle. You immediately see the shape, not the pieces that make up that shape. Ever seen those construction lights that tell you to go left or right? Well, you immediately see arrows even though the individual lights are huge and very visible.

Retina and individual pixels have nothing to do with running a scaling factor of 1:1 on a 2880x1800 display. It's about screen real-estate at that point. Comfortable or not is your opinion. Don't spread it as fact. Speak only for yourself.

----------



Wear your glasses. I have perfectly good vision once I put on my glasses and contacts. Are you saying companies should be doing product design around the fact that some people are too "proud" to wear corrective lenses? :rolleyes:
I haven't spread anything as fact that was only opinion. The way our eyes work isn't mysterious. It's optimized to see shapes. It's good at distinguishing one object from another. But it's horrible at pinpointing and distinguishing uniform pieces that make up a shape. That is a fact. 2880x1800 native being uncomfortable to view is just my opinion, but I thought it was understood as my opinion. The clue was the word uncomfortable, which is very subjective. Do we need to put imo in front of every sentence now?
 
Here's how it works: If you take the Retina MBP with the "best for retina" settings, it will see its own display as a "1440 x 900 retina" display, and the 30 inch display as a "2560 x 1600 non-retina" display. So there are 1440 x 900 retina points, made of 2x2 pixels, and 2560 x 1600 non-retina points, made of 1x1 pixels. The size of user interface elements will be the same number of points on both screens. If you look at the exact number of points and the screen size in inches, you'll see the points on the retina display are maybe ten percent smaller than on the 30", but the same as on a 27" 2560x1600 display.

Thanks for that—makes sense. Now given the potential confusion between 1440 x 900 points and 1440 x 900 pixels, I'm all the more convinced Apple did the right thing by avoiding these numbers in the user preferences. That's something developers should be able to get their heads around, but it would only cause many users unnecessary confusion.

That's most likely automatic (I haven't tried it). MacOS X knows "this display has 2880x1800 pixels, but you don't want to use it as a 2880x1800 point display, but a 1440x900 retina display, so we don't let the user choose 2880x1800 pixels and destroy their eyes".

You see how easy it is to say 'pixels' when you mean 'points'! ;)
 
I haven't spread anything as fact that was only opinion. The way our eyes work isn't mysterious. It's optimized to see shapes. It's good at distinguishing one object from another. But it's horrible at pinpointing and distinguishing uniform pieces that make up a shape. That is a fact. 2880x1800 native being uncomfortable to view is just my opinion, but I thought it was understood as my opinion. The clue was the word uncomfortable, which is very subjective. Do we need to put imo in front of every sentence now?

Well, yes, kinda, when you start off a paragraph with the following:

Not my opinion.

And then proceed to list out what you think...
 
Those users who want even more screen real estate by tapping into the full 2880x1800 resolution mode of the display can also do so, but the option involves a workaround that is not authorized by Apple.

Haha, wow. :rolleyes:

So who owns your computer? You or Apple?

- "not authorized by Apple"?

If I buy the hardware I will do with it as I please. :mad:
I hope you do.
 
Haha, wow. :rolleyes:

So who owns your computer? You or Apple?

You're allowed to do it, but Apple is not going to care if you come crying to them about your OS getting corrupted from a hack you did. Of course, this is probably a safe hack, but there are dangerous ones.
 
It's good at distinguishing one object from another. But it's horrible at pinpointing and distinguishing uniform pieces that make up a shape. That is a fact.

Can you go into more detail about this? The way you're making it sound, the human eye picks up an arrow shape, but has trouble seeing the > and the - it's comprised of.

...which isn't true. The closest to that would be that people tend to notice the general before the specific.
 
You, just don't expect Apple to help you with it once you've messed around with the OS in unsupported ways.

You're allowed to do it, but Apple is not going to care if you come crying to them about your OS getting corrupted from a hack you did. Of course, this is probably a safe hack, but there are dangerous ones.

Changing the resolution to something different from what they put in System Preferences is NOT a hack. It is using the machine as intended, and even using APIs Apple provides to do the changing... just the same as Diablo 3 and other games can run at 2880x1800.
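For the record, here's roughly what those public calls look like; a minimal sketch of Quartz Display Services (untested on a rMBP, and whether the full 2880x1800 mode shows up in the list this way is my assumption):

```swift
import CoreGraphics

let display = CGMainDisplayID()
// Ask for the full mode list, including the low-resolution duplicates
// that HiDPI panels normally hide.
let options = [kCGDisplayShowDuplicateLowResolutionModes: true] as CFDictionary
if let modes = CGDisplayCopyAllDisplayModes(display, options) as? [CGDisplayMode] {
    for mode in modes where mode.pixelWidth == 2880 && mode.pixelHeight == 1800 {
        // Same documented call a full-screen game uses to switch modes.
        let result = CGDisplaySetDisplayMode(display, mode, nil)
        print("2880x1800: \(result == .success ? "switched" : "failed")")
    }
}
```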
 
Changing the resolution to something different from what they put in System Preferences is NOT a hack. It is using the machine as intended, and even using APIs Apple provides to do the changing... just the same as Diablo 3 and other games can run at 2880x1800.

If you're using a 3rd party application to change the resolution in a way that is not supported by Apple, don't expect support if issues arise (graphic artifacts, glitches with applications).

That's all. You can try to argue all you want with Apple's support (be it by phone or at the Genius bar), it won't get you anywhere.

Just accept that twiddling with unsupported features comes with consequences. Know what you're doing. No one is going to stop you. No one has to help you though.
 
While I don't understand all the technical talk here, I recently got one as a gift and I really feel there is quite a difference. I had been considering purchasing the 13" MBP that was just released because the price was within my means, so I completely understand the argument that it's too expensive to pay $2100. Price aside, the new computer has an amazing screen, and I feel the difference between retina and non-retina screens is apparent when using it. Although, I still think that the 2010-2011 MBP that my girlfriend uses has an amazing screen, especially when I compare it to the HPs, Dells, etc. Definitely a step in the right direction with USB3 and the HDMI too :D
 
What is the image size of the desktop screen shot (Shift + Command + 3) in default mode? I am assuming it will be 2880x1800 pixels even though the screen resolution is 1440x900. Can anyone post a picture?
 
Saw a MacBook Retina today at BestBuy. Not that impressed. Not for $2200 at least.
 
laggy pos i heard

Well... this comment made me stop by the Apple Store yesterday. They had the 2.3 i7 rMBP on display, and I was just blown away by how awesome the screen really looks in person. What I miss most is the MBP branding at the bottom of the screen; with it missing, well, it seems less of a MBP to me.

My personal experience with the rMBP for the 1hr that I played with it was this: Safari, Pages, Keynote, iPhoto and iMovie were all snappy and fast to launch, but the app I use most often, iMovie, was beyond laggy.

... Now I've tried iMovie on the iMac and it's snappy as hell on that beast (2011). I'm not sure what is at play here; maybe the app is designed to run in retina mode only, but as the keynote presentation stated, I wanted to get the most real estate while editing in iMovie (not Final Cut Pro). After launching the app I went into the settings and changed the display so it gave me the most content on the screen (no screenshot, sorry). I then proceeded to open up the San Francisco project and also the various sample events, and began grabbing random clips and injecting them into the San Francisco project. The idea was to grab pieces that were not analyzed (there were a few) and see how it did. While in non-retina mode (1920x1200), I noticed that if you go to highlight a clip in your event and attempt to drag it into an existing project, the system lags considerably; it feels for a moment that the mouse is unresponsive. The worst part is that this is a repeatable problem: it doesn't matter if you do it once, or if you do it a number of times. There is lag picking up video content from the events to your iMovie project.

I went back to System Preferences, and changed the display back to retina. The resolution was back to 1440x900 and there was NO LAG!

So I now believe that whoever blurted out that the system was laggy, maybe just maybe they were doing what I did. Now I didn't attempt to run any of the resolution size-up apps on the in-store model; after all, I had been playing with it for 1hr, and I can honestly say that it's a beautiful laptop. But I was sorely disappointed in the performance of the video. What's more, if you play back the video in the non-retina resolution (1920x1200) there are dropped frames and stutters; go back to retina, no issues.

I know the system is still driving all the same pixels, one just looks sharper than the other, so what is the deal here?

In other apps the lag was not apparent: iPhoto, Pages, Keynote, Safari. So is it iMovie? (It can't be, at least I don't think it could be; as I stated, it runs snappy as hell on a 2011 iMac.)
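A plausible answer to "what is the deal here" (my guess at the mechanism, not something Apple spells out in the UI): the scaled 1920x1200 setting appears to render everything to a double-size 3840x2400 backing store and then filter it down to the panel's 2880x1800 pixels, while the "best for retina" setting renders 2880x1800 directly with no resampling pass. The pixel counts alone would explain some of the lag:

```swift
// Back-of-the-envelope frame cost, assuming scaled modes render at 2x the
// chosen point size and then downsample to the native panel.
let nativePanel = 2880 * 1800               // 5,184,000 px, "best for retina", no resample
let scaled1920  = (1920 * 2) * (1200 * 2)   // 9,216,000 px rendered at 3840x2400
print(Double(scaled1920) / Double(nativePanel)) // ~1.78x the pixels, plus a downscale pass
```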
 
Do you actually believe that nonsense from iFixIt? Contrary to what iFixIt says, the battery is easily replaceable (by going to the Apple Store and handing over $199, compared to, say, a Dell, where you order a battery for $150 and take out your trusted old screwdriver). Contrary to what iFixIt says, you can replace the LCD screen without breaking it (just don't try to remove the glass cover which isn't there with your big fat fingers). Contrary to what iFixIt says, everything else can be repaired (what do you think UK consumer laws would say if a £1799 computer _cannot be repaired_ after two years?)

Not user fixable, and when AppleCare runs out the costs will be prohibitive to repair any of the components (screen, drive, or even RAM failure), effectively killing the secondary market.
 