
mshepherd

macrumors regular
Original poster
Feb 29, 2004
152
9
I was doing some thinking about retina displays, and I see some issues for graphics people who work with raster artwork. Let's say you are working with a 1024x768 image: if you view it on a retina display (assuming it is double the resolution of a normal display) at 1:1, with no scaling distortion, it will be half the size it would be on a normal display. Also, web people will need to build all of their sites with this in mind.
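
A quick sketch of that size difference (the ~110 and ~220 ppi figures are illustrative assumptions, not Apple specs):

```typescript
// Physical size of a 1024x768 image shown at 1:1 (one image pixel per device
// pixel) on a ~110 ppi panel versus a doubled ~220 ppi panel.
function physicalSizeInches(widthPx: number, heightPx: number, ppi: number): [number, number] {
  return [widthPx / ppi, heightPx / ppi];
}

console.log(physicalSizeInches(1024, 768, 110)); // ~[9.3, 7.0] in on a normal-density panel
console.log(physicalSizeInches(1024, 768, 220)); // ~[4.7, 3.5] in, half the size on the doubled panel
```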
 

Arkious

macrumors 6502a
Mar 14, 2011
583
0
Newcastle, UK
I think they will look great, but to be honest, when I sit in front of my 27" iMac I fall in love with its screen every time. At the same time, though, the big screen is its biggest downfall. Because the MacBook Pro, MacBook Air and, to some extent, the iMac have relatively older/less capable graphics cards, putting a retina display on them would need 4 times more power (don't know how accurate that is, but 4x the pixels is 4x the power in my eyes). Take the new iPad for example... look at what they needed to do just to get the performance of the iPad 2. Take battery life into consideration for the MacBook Air. I just don't think it's going to happen.

Now don't get me wrong, I would love to see what they would look like with a retina display, and yeah, I would almost definitely buy one. I would definitely buy a 27" Retina Display if that were to become available, but then I would have to splash out on the power of a Mac Pro to be able to play a game on it, for example. But although retina will look flawless for images, wouldn't it be too much of a long shot at the moment?
 

jablko

macrumors member
Nov 12, 2007
73
0
Lincoln, Nebraska
Current graphics technology can more than adequately deal with 4x the pixels we're currently using in most use scenarios. To display 1920x1080 with 32-bit color, a graphics card only needs 8 meg of video RAM, and QFHD (four times that resolution) only requires 32 meg for 2D applications. Any Mac from the last 10 years can probably handle that (though its DVI connector would limit the resolution).
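
A quick back-of-the-envelope check of those numbers; this is just width x height x 4 bytes, nothing more:

```typescript
// Raw framebuffer size for 32-bit color: width * height * 4 bytes.
function framebufferMB(width: number, height: number): number {
  return (width * height * 4) / (1024 * 1024);
}

console.log(framebufferMB(1920, 1080).toFixed(1)); // "7.9"  -> roughly the 8 meg figure above
console.log(framebufferMB(3840, 2160).toFixed(1)); // "31.6" -> roughly the 32 meg figure for QFHD
```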

However, there are plenty of applications now that use the GPU for more than just pushing 2D pixels to the screen, and they are the reason we measure VRAM in gigs rather than megs these days. For example, CS6 recommends 1 gig of VRAM. Bridge uses that memory to store thumbnails, Photoshop uses that memory to apply certain transformations and filters, etc. But here's the thing: the media files you're working on aren't going to be any larger just because you're using a HI-DPI display. The only thing that changes is the zoom factor. The amount of VRAM CS6 needs will be barely affected.

The exception is gaming. If you want to play games at the screen's native resolution, you will need to hold a lot more rendered pixels in RAM. But honestly, gamers are used to playing at a non-native resolution to get faster performance anyway, so in many cases, this will be a matter of magnification with current games, not more actual pixels.

I think for most designers, photographers, videographers, etc., higher resolution displays will only be a good thing.
 

Arkious

macrumors 6502a
Mar 14, 2011
583
0
Newcastle, UK
Current graphics technology can more than adequately deal with 4x the pixels we're currently using in most use scenarios. To display 1920x1080 with 32-bit color, a graphics card only needs 8 meg of video RAM... I think for most designers, photographers, videographers, etc., higher resolution displays will only be a good thing.

Yeah, I suppose you're right, I've not really thought about it like that, but gaming would deffo take its toll.
 

jablko

macrumors member
Nov 12, 2007
73
0
Lincoln, Nebraska
On the subject of how it will affect Web design, I'm very curious about that as well. I've heard there are a couple of competing proposals the W3C is considering to change the HTML specification to accommodate adaptive images.

My understanding is one proposal would make the img tag work like the video tag in HTML5, which allows you to specify multiple sources and lets the browser choose which one it supports. For images, that would mean you would specify one file for a mobile browser, another for a normal display and a third for a HI-DPI display, all based on detecting the display resolution (which is already reported anyway). You can achieve this through some creative JavaScript now, but it's a bit of a hassle.
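
A minimal sketch of that JavaScript workaround, written in TypeScript here; the element id, file names and the 1.5 cutoff are placeholders rather than anything from a spec:

```typescript
// Swap in a higher-resolution source when the browser reports a high device
// pixel ratio. The id, file names and 1.5 cutoff are hypothetical.
function pickImageSource(baseSrc: string, hiDpiSrc: string): string {
  const ratio = window.devicePixelRatio || 1;
  return ratio > 1.5 ? hiDpiSrc : baseSrc;
}

const hero = document.querySelector<HTMLImageElement>("#hero");
if (hero) {
  hero.src = pickImageSource("hero.jpg", "hero@2x.jpg");
}
```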

Another proposal would be to have images that only load a certain percentage of their pixels depending on the reported screen size. So, you would create an image with a 4k or QFHD display in mind, but on a 1080p display, only a quarter of the pixels would load, and on a 720p screen, only 1/8th of the pixels would load (or something like that). The advantage of this method over the other is it would require less server space since you'll only be keeping one file instead of a separate file for each size, while still using less bandwidth on mobile devices. JPEG2000 files already support this, but unfortunately, aren't well supported by browsers.

This won't help many situations, but I'd also love to see browsers support SVG scaling, so that design elements can be vector graphics and scale the same way text does.

However W3C chooses to handle adaptable scaling, it will take time for the specification to be agreed on, the browsers to support it, and webmasters to implement it. A new spec certainly won't be ready for Apple's early adoption of HI-DPI displays. I will be very interested to see how they deal with scaling in the short term. My guess is that images on the Web will be scaled up to match the new resolution (as they are on the iPhone), rather than appearing in postage-stamp sizes. They should look just as sharp as they do on the current iMac monitors and at the same size, just not as sharp compared to the text around them.
 

7enderbender

macrumors 6502a
May 11, 2012
513
12
North East US
I don't care about the underlying technology but I'd like to have something that offers more real estate than most current Apple models. Retina, or so it seems, won't do that since they are obviously looking to double the resolution and then make everything twice as big. No gain for people who are looking to buy a Mac for photo editing and/or audio recording. The screens and the limited number of interface connections are the biggest obstacle for me right now. I'm really curious what will come out next week, if the 15" MBP with hi res matte will still be available and what an upgraded Mac Mini might entail.
Call me crazy, but if I go Mac I'm leaning towards a Mini or a refurbished Mac Pro, because the screens are the pitfall compared to what e.g. NEC has to offer. But even that only makes sense if there is still a Mac laptop available that at least matches the resolution of my six-year-old ThinkPad, around 125 ppi at 14".
 

d0vr

macrumors 6502a
Feb 24, 2011
603
1
(don't know how accurate that is, but 4x the pixels is 4x the power in my eyes)

I would be surprised if it needed more than twice the power. Most of the power requirement comes from the backlight, which is already there being used. After that it comes down to processing the graphics themselves (the chip), which will use less and less power over time.
 

TyroneShoes2

macrumors regular
Aug 17, 2011
133
3
I guess to answer that question we have to have an agreement on what "retina" is. I think what it means is having resolution that is at or beyond what the human eye can perceive at a particular distance from a particular screen, and in the case of a laptop or desktop, that would be a typical viewing distance. IOW, resolved highly enough that you can't see pixels or jaggies, which technically means it reaches just beyond the human vision system's ceiling of 1/60th of a degree of arc.
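
As a rough sketch of what that 1/60th-of-a-degree ceiling implies, here is the pixel density it demands at a few assumed viewing distances (the distances are assumptions, not measurements):

```typescript
// Pixel density at which one pixel subtends 1/60 of a degree (1 arc minute)
// at a given viewing distance.
function retinaPpi(viewingDistanceInches: number): number {
  const oneArcMinute = (1 / 60) * (Math.PI / 180); // radians
  return 1 / (viewingDistanceInches * Math.tan(oneArcMinute));
}

console.log(retinaPpi(12).toFixed(0)); // ~286 ppi for a phone held at 12 in
console.log(retinaPpi(20).toFixed(0)); // ~172 ppi for a laptop at 20 in
console.log(retinaPpi(28).toFixed(0)); // ~123 ppi for a desktop at 28 in
```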

I think that if this whole retina thing starts a pixel race, we might find ourselves suffering from the fallout. By that I mean that more pixels are ONLY better if you actually need more pixels to display an image without apparent pixelation or apparent jaggies. Anything beyond that is really wasted, and the CPU/GPU power, the battery power, and any other resources that might be diverted towards giving me more pixels than I can use? Well, that sounds more like a problem than a feature or a solution.

I have an MBA 13, and I can't see the need for a higher rez than this at the distances I view it from. I also have an iPad 1 and an iPad 3. While the iPad 3 is gorgeous, I left it at work and had to read USA Today on the old iPad 1 one day last week. Sure, you can see the difference when you are looking for it, but I really did not feel like I was missing anything when reading on the iPad 1, even after months of being conditioned to the iPad 3.

Off-angle viewing? That's a completely different story; I constantly view the iPad at severe angles to line up shots in the terrific Virtual Pool HD simulation game, and the iPad 3 only dims a little bit; the iPad 1 washes out significantly. So it depends, I guess, on how you use it. A Ferrari won't get me to work any faster on the 101 at rush hour.

You can still make the argument that the iPad 3 is better, but it takes 3 times as long to charge, runs hot, and the battery does not last as long, so all of that compromise is basically in the service of an improvement in readability that is minor and somewhat questionable.

Bottom line, take "retina" up to the level where it really can't yield any more advantage, but be sure to stop there. If it becomes a numbers game in a vendor war, we all will lose.

...I'd like to have something that offers more real estate than most current Apple models. Retina, or so it seems, won't do that since they are obviously looking to double the resolution and then make everything twice as big. No gain for people who are looking to buy a Mac for photo editing and/or audio recording. The screens and the limited number of interface connections are the biggest obstacle for me right now...
Honestly, I think everyone has buried the lede; the real revolutionary thing that will benefit us is not "retina" displays, it is resolution independence, which will allow high-rez elements (icons, fonts) to be scaled to different sizes without being strapped and handcuffed to the monitor's resolution setting as they are today. And that will come along with "retina" (although not because of it), but all anyone will talk about will be "retina", even though that is really not what will be important about the resolution improvements rumored to be unveiled next week. It seems like 7enderbender has presented us with the very problem that this will solve.
 

dalcorn1

macrumors regular
Jul 10, 2008
171
23
On the subject of how it will affect Web design, I'm very curious about that as well. I've heard there are a couple of competing proposals the W3C is considering to change the HTML specification to accommodate adaptive images...

Top post. I came on looking for some more info on how HI-DPI images might be supported.

Interesting to see that Apple are keen to keep bandwidth down, with limitations such as no autoplay on videos in mobile Safari, yet are more than happy to load 20 hi-res photos into the browser.

It'll be interesting to see how non-cable ISPs cope with the server strain when this all kicks into gear.
 

TyroneShoes2

macrumors regular
Aug 17, 2011
133
3
Double that and you're close.
Rather than throw out vague numbers, let's clear this up.

According to http://en.wikipedia.org/wiki/Visual_acuity:

"20/20 is the visual acuity needed to discriminate two points separated by 1 arc minute—about 1/16 of an inch at 20 feet."

20/20 vision refers to the acuity needed to resolve detail that subtends 1 arc minute at a distance of 20 feet, and 1 arc minute is exactly the same as 1/60th of a degree of arc.

1/60th of a degree of arc may not be EXACTLY the limit of the resolution of the fovea of EVERY human eye, but it is what the vision system is capable of in the great majority of cases; 90% of the world population does not have visual acuity as good as (or better than) 20/20, and a very large percentage of them do not even approach 20/20 acuity, even at their life's peak of health.

Plus, we do not see with 20/20 vision all of the time. Only a few percent of the visual field, the foveal vision, has much acuity at all. 97% or so of what we see in any one moment is seen with very little acuity and is actually very blurry.

We compensate continually for this with rapid, small eye movements (saccades), meaning that the eye jumps in small increments very quickly so that we can "take a lot of quick snapshots" in serial order, seeing tiny parts of the field of vision, the part we are looking directly at, in focus, one after the other. The brain then stitches these snapshots together as part of perception to simulate that we are seeing more of the visual field in clear focus than we actually are. You are doing exactly that as you read this line of text; only every other word or so is ever in focus at any particular time.

20/16 or better probably refers to about the 98th percentile, and 20/8 is probably the absolute limit (for the one person in 20 billion people ever to walk the planet that could see better than every other single person that ever lived). If you wish to refer to what the limit is for 99% of the world's population, it is then approximately 20/10 or below, so in that way your statement might actually have a grain of truth. But 20/10 is the extremely rare exception, not the norm.

And 20/20 is the accepted standard that optometry targets to correct for, because they assume that this is the level of visual acuity that the bulk of the healthy population has. It is probably at a bare minimum representative of the 90th percentile or better, and improving vision beyond 20/20 hits a wall of diminishing returns very quickly, as far as any particular benefit of higher visual acuity is concerned, and that is why it is the accepted standard.

And this is also exactly the same standard that was used to develop the HDTV resolutions adopted by the Grand Alliance, which implies that they researched this pretty thoroughly at the time. Most people sit further away than the optimum distance to resolve 20/20 vision, and so ironically do not fully receive the benefits of resolutions even this high; you have to sit 7.8 feet away or closer to a 60" screen to fully resolve HD, while most people have a 52-55" screen and sit 12-15 feet away from it.
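
That 7.8-foot figure checks out with simple geometry; a sketch, assuming a 16:9 screen and square pixels:

```typescript
// Farthest distance at which a pixel on a 16:9 screen still subtends a full
// arc minute (1/60 of a degree), i.e. beyond this distance a 20/20 eye can
// no longer resolve individual 1080p pixels.
function maxViewingDistanceFeet(diagonalInches: number, horizontalPixels: number): number {
  const widthInches = (diagonalInches * 16) / Math.hypot(16, 9);
  const pixelPitchInches = widthInches / horizontalPixels;
  const oneArcMinute = (1 / 60) * (Math.PI / 180); // radians
  return pixelPitchInches / Math.tan(oneArcMinute) / 12;
}

console.log(maxViewingDistanceFeet(60, 1920).toFixed(1)); // "7.8" ft, matching the figure above
```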

The point is that once you produce a display that exceeds the visual acuity of 90% of the population (where the other 10% may really only see marginally better, and even if they did, it would not buy them much), the tradeoffs ramp up quickly and it becomes an unsound endeavor. You have to draw that line somewhere, and the practical place to draw it is to produce "retina" displays that resolve fully for folks with 20/20 vision, and not higher than that. And that means the reasonable target probably should be exactly as I stated: 1/60th of a degree of arc.
 

Randomoneh

macrumors regular
Nov 28, 2011
142
0
All right, it's obvious you have read something, but you haven't read enough, so get ready for surprises. Shall we start? Ah...

1/60th of a degree of arc may not be EXACTLY the limit of the resolution of the fovea of EVERY human eye, but it is what the vision system is capable of in the great majority of cases; 90% of the world population does not have visual acuity as good as (or better than) 20/20, and a very large percentage of them do not even approach 20/20 acuity, even at their life's peak of health.
20/16 or better probably refers to about the 98th percentile, and 20/8 is probably the absolute limit (for the one person in 20 billion people ever to walk the planet that could see better than every other single person that ever lived). If you wish to refer to what the limit is for 99% of the world's population, it is then approximately 20/10 or below, so in that way your statement might actually have a grain of truth. But 20/10 is the extremely rare exception, not the norm.
And 20/20 is the accepted standard that optometry targets to correct for, because they assume that this is the level of visual acuity that the bulk of the healthy population has. It is probably at a bare minimum representative of the 90th percentile or better, and improving vision beyond 20/20 hits a wall of diminishing returns very quickly, as far as any particular benefit of higher visual acuity is concerned, and that is why it is the accepted standard.

20/20 translates to 60 pixels per degree (30 cycles per degree). Now here's a quote from "The Historical Evolution of Visual Acuity Measurement", page 4:
The horizontal lines represent one-line increments on a standard chart. The dark band represents STANDARD VISION (20/20, 1.0). The data also show that “normal” vision was and is substantially better than “standard” vision. Normal vision does not drop to the standard level [20/20, 1 arcminute] until 60 or 70 years of age. Snellen was well aware of this and described the “20/20” level not as threshold or perfect vision, but as a level that is “easily recognized” by normal eyes.

Another quote, from the paper "Resolution Acuity is Better than Vernier Acuity", page 525:
[image of the quoted passage from the paper]


That sets the visual acuity of a healthy person somewhere between 0.3 and 1 arc minutes. Translated to angular resolution, that is from 60 (at 1 arc minute) to 200 (at 0.3 arc minutes) pixels per degree, which is 30 to 100 cycles per degree.

My minimum separable acuity is ~120 pixels per degree (60 cycles per degree).

And one more quote, just so that we can forget about the "20/20 is 90% of the population" talk:

August Colenbrander, M.D. (Smith-Kettlewell Eye Research Institute and California Pacific Medical Center) also emphasizes that, contrary to popular belief, 20/20 is not actually normal or average, let alone perfect, acuity. Snellen, he says, established it as a reference standard. Normal acuity in healthy adults is one or two lines better. Average acuity in a population sample does not drop to the 20/20 level until age 60 or 70. This explains the existence of the two lines smaller than 20/20: 20/15 and 20/10.

I read that Snellen chose that standard [20/20] so that all healthy young adults would exceed it.

And that is confirmed by this chart:
[chart: visual acuity changes with age]


The point is that once you produce a display that exceeds the visual acuity of 90% of the population. [After that] the tradeoffs ramp up quickly and it becomes an unsound endeavor. You have to draw that line somewhere. And that means the reasonable target probably should be exactly as I stated: 1/60th of a degree of arc.

Are you familiar with the new UHDTV standard? Research and standardization were done by Japan's NHK. This is important. They did a study (page 4, short description) in which they presented test subjects with images of different quality in terms of angular resolution. Remember, Apple's "Retina display" means an angular resolution of 60 pixels per degree (1/60 of a degree per pixel) at normal viewing distances. In their research, they concluded the following:
The higher the angular resolution, the greater the sense of realness, and the sense greatly saturates above about 60 cpd [120 ppd]; above 155 cpd [310 ppd] - images are indistinguishable from the real object.

Here's an image representing results of the study. 1 cpd (cycle per degree) = 2 ppd (pixels per degree).
[chart from the NHK study: sense of realness vs. angular resolution in cycles per degree]
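
For a sense of scale against those 120 ppd and 310 ppd thresholds, here is the pixels-per-degree figure for a 264 ppi iPad-class panel at an assumed 15-inch viewing distance (the distance is an assumption for illustration):

```typescript
// Pixels per degree of visual angle for a panel of a given density viewed
// from a given distance. The 15-inch distance is an assumption.
function pixelsPerDegree(ppi: number, viewingDistanceInches: number): number {
  const halfDegree = (0.5 * Math.PI) / 180; // radians
  return ppi * 2 * viewingDistanceInches * Math.tan(halfDegree);
}

console.log(pixelsPerDegree(264, 15).toFixed(0)); // ~69 ppd: around Apple's ~60 ppd line, well below 120 ppd
```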


Plus, we do not see with 20/20 vision all of the time. Only a few percent of the visual field, the foveal vision, has much acuity at all. 97% or so of what we see in any one moment is seen with very little acuity and is actually very blurry.

Come on, it's like reading "Human Vision for Dummies". No need to teach the basics. The acuity of the densest point (the foveola) determines our acuity.

You have to draw that line somewhere. And that means the reasonable target probably should be exactly as I stated: 1/60th of a degree of arc.
No, for now it should be (just as noted in the NHK paper) 1/120th of a degree of arc, and in the future, when CPUs and GPUs catch up, 1/310th of a degree. ;)
 

dissdnt

macrumors 65816
Aug 3, 2007
1,489
5
I was wondering that too with web design, how this is all going to work. It seems like a never-ending series of refreshing pixel-based graphics. There should be some standardized vector-based rendering for HTML that is lightning fast and adaptive, so once it's done, it's done!
 

53kyle

macrumors 65816
Mar 27, 2012
1,282
111
Sebastopol, CA
I was doing some thinking about retina displays, and I see some issues for graphics people who work with raster artwork. Let's say you are working with a 1024x768 image: if you view it on a retina display (assuming it is double the resolution of a normal display) at 1:1, with no scaling distortion, it will be half the size it would be on a normal display. Also, web people will need to build all of their sites with this in mind.

Look at the iPod touch, iPhone, and the new iPad. They have retina displays, and pictures aren't smaller, just sharper. Also, if you look at this article https://www.macrumors.com/2012/06/12/a-closer-look-at-the-new-macbook-pros-retina-display/, Apple scales things to the right size.
 

G5isAlive

Contributor
Aug 28, 2003
2,598
4,487
You can still make the argument that the iPad 3 is better, but it takes 3 times as long to charge, runs hot, and the battery does not last as long, so all of that compromise is basically in the service of an improvement in readability that is minor and somewhat questionable.

You had me until you had to go the "iPad 3, who cares" route...

I take it you don't own one. Maybe you have a 2. I don't know. I owned both the 2 and now the 3 and ...

I charge overnight. I don't notice any difference in charging time because I am sleeping. (3x? Doubt it, but whatever.)

Runs hot? No. Maybe it runs warmer than my 2, but it never runs hot to the point that I notice it at all.

Shorter battery life? I have not noticed that either. Not saying you can't find some statistic that says it's 10% shorter, just saying that in actual use it doesn't strike me as shorter, and definitely not too short.

Minor and questionable improvement in readability? Totally wrong. The screen is a big deal. I love how much more crisp it is. HUGE difference.

We can debate retina displays forever, the way the virtues of floppy disks versus CDs were debated, and then CDs versus DVDs, and even black and white versus color, but we only debate by ignoring the clear future. This debate will seem quaint in 5 years, when every display is a retina display. That's the way of the future. Retina displays are easier to read. I love the one in my phone and the one in my iPad, and yes, I looked at the RMBP yesterday and it was great.

The question of buying now or not isn't which is better, or what we need; it's: are you a leader or a follower? How much do you want to grow with the new technology versus learn with everyone else once it's mainstream?

Because in 5 years even the diehard holdouts will have retina displays.
 

mamaiku

macrumors newbie
Jan 12, 2011
5
0
We are already changing/feeding graphics based on resolution. It's called responsive design, and it is necessary for one site to display well on different devices (laptop, iPad, phone). People saw this coming a long time ago, so web design has been moving down this path for many, many years. There are several different options available to designers/developers today to handle this issue. Bumping up for retina will be in full swing once the screens become a large enough share of the market to justify developing for them.

My concern is the retina machine's ability to push that resolution and still have enough juice left over for Photoshop. I also worry about designing for the web on these, since your monitor will be so vastly different from all normal users' - a very small screen with a HUGE resolution. 20 pixels on most machines is a good size - 20 pixels on the retina... will that be legible?? I guess we will know which designers are using the retina; their designs will be gigantic.

For print, photography and video, it's a no-brainer. But for web... don't know. Scared to pull the trigger...
 