How close to Retina Display is the current 17" MacBook Pro's display?

Discussion in 'MacBook Pro' started by Smiller4128, Jun 5, 2012.

  1. Smiller4128 macrumors member

    Joined:
    Feb 1, 2010
    #1
    Hi there! I bought a 17" MacBook Pro a little over 40 days ago and have been extremely happy with it ever since! There's nothing this awesome machine can't do for me! That said, there are of course rumors that the new MacBook Pros coming out will have a Retina Display. To be honest, that's the only rumored feature I really care about, since going from the iPad 2 to the iPad 3 with Retina Display was quite a switch. However, I have heard that the 17" MacBook Pro's screen is already high definition and close to being a Retina Display on its own. So my question is: is this true? Does the current model's 17" screen already resemble something close to a "Retina Display"? Comparing my iPad 3 and my MacBook Pro I'd say no, but it could just be that things look different on a computer as opposed to an iPad. If it's not, then I'm still within Best Buy's return period for silver members and would be very tempted to return it and wait for the release of the Retina Display MacBook Pros.
     
  2. simsaladimbamba

    Joined:
    Nov 28, 2010
    Location:
    located
    #2
    "Retina resolution" is "defined" as something above 260 dots per inch (DPI), the 17" MBP has around 130 DPI, thus only 2840 x 2400 pixel would get the 17" MBP to "Retina resolution".
     
  3. w00t951 macrumors 68000

    w00t951

    Joined:
    Jan 6, 2009
    Location:
    Pittsburgh, PA
    #3
    Apple's site has a formula, I think. It's either that or they provided one during the Apple keynote introducing the iPhone 4.
     
  4. neilpryde23 macrumors regular

    Joined:
    Nov 28, 2011
    #4
  5. Arelunde macrumors 6502a

    Arelunde

    Joined:
    Jul 6, 2011
    Location:
    CA Central Coast
    #5
    Dummy question: Can an existing screen be upgraded to Retina via software?

    Just wondering...
     
  6. ivanpk macrumors regular

    Joined:
    Jul 8, 2011
    #6
    No, since you have to physically increase the number of pixels.
     
  7. Arelunde macrumors 6502a

    Arelunde

    Joined:
    Jul 6, 2011
    Location:
    CA Central Coast
    #7
    Darn!

    Would like to say other things, but that will have to do... :(
     
  8. Adamantoise macrumors 6502a

    Joined:
    Aug 1, 2011
    #8
    Nah won't happen.

    Retina as defined by Apple is a resolution fine enough that the human eye cannot distinguish between adjacent pixels.

    Ultimately, this depends on one's viewing distance.

    As for what resolution they pick? That depends on what the HD4000 can drive without hindering performance really. I doubt there are several panels to choose from on this.
     
  9. Stetrain macrumors 68040

    Joined:
    Feb 6, 2009
    #9
    "Retina" isn't a fixed DPI. It's a function of both DPI and viewing distance. That's why a 1080p TV looks pretty good from across the room on your couch but when you walk to within a foot or two of the TV you can see the grid of pixels.
     
  10. Randomoneh, Jun 6, 2012
    Last edited: Jun 6, 2012

    Randomoneh macrumors regular

    Randomoneh

    Joined:
    Nov 28, 2011
    #10
    He's joking.
    *****

    Definitive answer: it depends on your minimum separable visual acuity, display contrast (if you have shi*ty contrast, high pixel density is a waste since you can't perceive it) and viewing distance.

    We'll make a guess about all of these.

    Minimum separable acuity (lower is better) at normal contrast levels, for general population is anywhere from 30 to 60 arcseconds (0.5 to 1 arcminute). For young, healthy eyes it is closer to 30 arcseconds and for older eyes it is closer to 60 arcseconds.

    Personally, my minimum separable acuity at my display is 28.8 arcseconds (0.48 arcminutes).

    If you want to take a test on your display, tell me and we'll calculate your minimum visual acuity.

    But for now, let's make a guess and say your eyes are not that healthy and you're not that young and let's say your visual acuity is 36 arcseconds (0.6 arcminutes).

    That means, for every degree of your field of view, you need 100 pixels (36 arcseconds is 0.01 degrees).

    For your display, we'll assume display is capable of reaching normal contrast values.

    As for viewing distance, let's guess it's 26''. At that distance, your 17'' MacBook occupies 31 degrees of your horizontal field of view. We said you need 100 pixels per degree, so you'd need 100 pixels × 31 degrees = 3100 pixels horizontally and 1938 pixels vertically (3100/1.6).

    That is, at a guessed minimum separable acuity of 0.6 arcminutes and a viewing distance of 26'', you'd need the same display with a resolution of 3100x1938 pixels.

    If you want more precise calculation, we need to hear from you.
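    For anyone who wants to plug in their own numbers, the estimate above boils down to a few lines (the 16:10 aspect ratio is the 17" MBP panel's; the acuity and distance values are the guesses from the post, not measurements):

    ```python
    import math

    def required_resolution(diagonal_in, aspect, acuity_arcsec, distance_in):
        """Pixels needed so adjacent pixels fall below the acuity threshold.

        aspect = width/height (1.6 for the 16:10 MacBook Pro panel).
        """
        # Physical screen width from diagonal and aspect ratio:
        width = diagonal_in * aspect / math.hypot(aspect, 1.0)
        # Horizontal field of view the screen occupies, in degrees:
        hfov = 2 * math.degrees(math.atan(width / 2 / distance_in))
        # e.g. 36 arcsec acuity -> 3600/36 = 100 pixels per degree:
        px_per_deg = 3600.0 / acuity_arcsec
        h = px_per_deg * hfov
        return round(h), round(h / aspect)

    # The post's guesses: 36 arcsec acuity, 26" viewing distance.
    # Comes out at roughly 3100 x 1938, matching the estimate above.
    print(required_resolution(17.0, 1.6, 36.0, 26.0))
    ```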
     
  11. sammich, Jun 6, 2012
    Last edited: Jun 12, 2012

    sammich macrumors 601

    sammich

    Joined:
    Sep 26, 2006
    Location:
    Sarcasmville.
    #11
    I don't even think your liberal usage of quotation marks allows you to make wrong statements like that.

    I updated a post from way back with a new image to get the message across, so I'll post it here for the extra kudos :)

    Note: this chart is a simplification, but it gets the message across: 'retina' isn't an arbitrary fixed PPI regardless of typical usage distance. Also, see the post above this one.

    There's a whole can of worms when it comes to capturing content that is 'retina'. If we assume one arcminute is the limit of visual acuity and take the Nyquist theorem into account, then at a basic level we might be able to see differences down to half an arcminute. On that theory, we would see benefits of higher resolution all the way up to 600+ DPI for people with perfect vision. Read here.
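    As a rough sanity check on that 600+ DPI figure (the 12" viewing distance is my assumption for a phone held close; the rest follows from the angular math):

    ```python
    import math

    def dpi_limit(acuity_arcmin: float, distance_in: float) -> float:
        """DPI at which one pixel subtends the given angle at the given distance."""
        # Physical pixel pitch (inches) that matches the angular threshold:
        pitch = distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
        return 1.0 / pitch

    # 1 arcminute acuity at a 12" phone-style viewing distance:
    print(round(dpi_limit(1.0, 12.0)))   # ~286 DPI
    # Halve the angle per the Nyquist argument and you land near 600 DPI:
    print(round(dpi_limit(0.5, 12.0)))   # ~573 DPI
    ```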

    [Chart: PPI vs. typical viewing distance, with the 'retina' threshold line]

    http://en.wikipedia.org/wiki/Optimum_HDTV_viewing_distance#THX_ranges
     
  12. Randomoneh, Jun 6, 2012
    Last edited: Jun 6, 2012

    Randomoneh macrumors regular

    Randomoneh

    Joined:
    Nov 28, 2011
    #12
    Hey sammich, very, very nice graph, you get a +1 for that :)
    There are a few things wrong with it but if you correct them, it will be one hell of a graph regarding "Retina displays".

    1.) You're using the lower limit of human vision (minimum separable visual acuity), which is 60 arcseconds or 1 arcminute. Younger people tend to have much better acuity than that. Mine is 28 arcseconds, and I'm sure yours is better than 60 arcseconds (1 arcminute) too.

    2.) You haven't taken into consideration that humans have that minimum separable acuity at normal contrast values. If you had a display capable of reaching super-high brightness, your measured acuity on such a display would be noticeably better (see: the relation between minimum perceptible acuity and contrast).

    3.) About HDTV viewing distance: it is expressed in the degrees of the viewer's horizontal field of view that the display occupies. The lower limit is 20° (10.3 ft for a 50'' 16:9 display), but the upper limit is currently limited only by resolution! That's why one of the improvements that comes with 8K TVs is viewing angles greater than 40°; the quoted figure is 100° for future ultra-HDTVs. For a 50'' 16:9 display, a 100° viewing angle would mean sitting just 1.524 ft away. Of course, no one would sit that close, so the 100° figure is meant for larger displays. You see where I'm going with this?

    4.) "Above the line - you can't see individual pixels". Can you? NHK study suggests participants were able to see difference up to angular resolution of 310 pixels per degree (155 cycles/degree) and more, even though average minimum separable visual acuity of participants was [angular resolution] of 120 pixels per degree (0.5 arcminutes or 30 arcseconds). Participants linked / associated higher angular resolutions with greater sense of realness of an image.

    More about everything at this paper [.pdf], among others. Pages 64 and 65.
     
  13. sammich, Jun 6, 2012
    Last edited: Jun 6, 2012

    sammich macrumors 601

    sammich

    Joined:
    Sep 26, 2006
    Location:
    Sarcasmville.
    #13
    Thanks randomoneh.

    1) I had to choose a baseline for the chart, and to keep things simple and general, the commonly accepted value of 1 arcminute of acuity is good enough for its purposes.

    I haven't searched for data on how visual acuity 'degrades' with age but one could assume that for the range of all smartphone users, young and old, it would be an acceptable starting point.

    2) See above.

    3) Yes, I came across those points in the linked Wikipedia article on optimum viewing distances for HDTVs (typically 1080p). The range is 20º-40º of the HFoV, and I wouldn't expect anyone to venture much beyond that for any display. For large-FoV display placement, the article also mentions motion sickness in viewers. DPI recommendations should be based on acceptable, real-world scenarios. I mean, the technology is there - why not just tile iPad 3 panels into a 50" array and call it a day? Because it'd be far beyond the needs of the people buying it.

    If I've missed your point, then you might have to come around for a second pass.

    You raise a lot of facts that I'll have to read into later, but it's important to note that in real-world usage scenarios these numbers don't mean much once you approach a certain point. Take the Samsung GS3: it uses the 'awful' PenTile subpixel arrangement in a 4.8" screen at 306 DPI. Even with fewer subpixels, most reviewers have praised the screen, saying one can barely see the pixels. And on most devices, anti-aliasing can artificially smooth text beyond the physical capabilities of the screen.

    4) That line is, in hindsight, a bit of a throwaway. It was the original wording I used because it suited my original audience, and because it more or less suits the needs of the discussion. It's not absolutely true, as you point out, but it's close enough.
     
