I have no idea what movies you've watched, but you can see a huge amount of difference between a BD and a DVD. And by huge, I mean...like...a whole bunch.

Agreed 100%. I cannot stand watching DVDs on my big screen. Perhaps he should spend some time calibrating his TV.
 
You can see a huge difference between a Blu-ray and downloaded/streamed content... especially in action scenes, where a stream will typically go blocky (DVD and Blu-ray will not).

One thing to note is that in order to see the difference you need good playback equipment too. If you're watching films on your computer screen, you probably won't see any difference. Ditto on an older LCD TV.

I remember watching "300" in my friend's basement - high-end BD player, $10k projector - WOW!! Totally blew me away. It's way better than what most movie theaters do - not because they couldn't, but because they're tight, they don't care, and they have old equipment (some even show 3D movies on non-3D setups, so everything is only 1/2 as bright as it should be).

Anyway if that MBP is slim, has an SSD, an i7 _and_ a retina display... it's almost too much. I'll be first in line if it's either slim or has a retina...

----------

Not only overkill, it will be detrimental to at least two other very important things, battery life and graphics performance.

Huh? If the battery can fit in a 10" iPad, imagine how much battery you can fit in a 15" MacBook Pro?!

They'll dump the CD drive, make all the other components smaller, use Flash memory instead of a HD form factor SSD, and suddenly you have acres of space. And that space will be filled with battery.

Battery as far as retina vs. non-retina is concerned is a complete non-issue. Battery life will be limited by how slim they'll want to make the thing. And by the Core i7, especially if it's a quad core.
 
I wouldn't say it's a complete non-issue.
Although I agree Apple will most likely ditch the optical drive and make room for a bigger battery.

I'm hopeful for a standard SSD...but that may be too many leaps for 1 generation. They know they can milk sales by spreading it out.

Still, fingers crossed.
 
Yes, Retina is a marketing term, but why should you set the definition to an arbitrary 300 dpi? If I hold my phone 1 inch from my face I can see pixels. So that means it has to be higher than 300 dpi. Why? Because our perception of detail is based on distance.

So Retina is the output of an equation that looks like: Retina = PPI/Distance from face? The reason it's BS is because designers can't know exactly how far you're holding the object from your eyes.
 
Note that if Apple did put "Retina Displays" in their Macs, then the CPU (Intel Ivy Bridge) would power the display, and the GPU would power those intensive games (a new feature in Ivy Bridge).

Wow, that is completely wrong. A CPU does not power a display; that is the function of the GPU. All Intel is claiming is that the integrated GPUs of the Ivy Bridge platform are capable of addressing the resolutions of the proposed retina standards, since video cards are designed for set ranges of resolution, refresh frequency, and color bit depth.

If you are gaming with an AMD Radeon or NVIDIA GeForce discrete GPU, Intel's integrated GPU is disabled so that the much more powerful discrete graphics run everything. That is on laptops which support switching; on desktops the discrete video card is always enabled and the integrated GPU is always disabled, as you don't have the power concerns. Although with desktops you do have the option of hooking displays up to both the integrated and discrete video ports, it isn't something one would do for games.

When gaming, GPU requirements increase as resolution increases, since more detail of the game's environment must be rendered on screen at a time.
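
To put rough numbers on that point, here's a quick pixel-count comparison in Python (the resolutions are ones discussed in this thread; raw pixel count is only a first-order proxy for rendering cost):

    # Per-frame pixel counts at a few of the resolutions discussed here;
    # more pixels per frame means more work for the GPU when gaming.
    resolutions = {
        '1440x900 (current 15" MBP)': (1440, 900),
        '2880x1800 (pixel-doubled)': (2880, 1800),
        '3840x2160 (4K)': (3840, 2160),
    }

    base = 1440 * 900

    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {pixels:,} pixels per frame ({pixels / base:.1f}x)")

Pixel doubling a 1440x900 panel means four times the pixels per frame, and 4K is more than six times - which is where the GPU concerns in this thread come from.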
 
Huh? If the battery can fit in a 10" iPad, imagine how much battery you can fit in a 15" MacBook Pro?!

They'll dump the CD drive, make all the other components smaller, use Flash memory instead of a HD form factor SSD, and suddenly you have acres of space. And that space will be filled with battery.

Battery as far as retina vs. non-retina is concerned is a complete non-issue. Battery life will be limited by how slim they'll want to make the thing. And by the Core i7, especially if it's a quad core.

You want to waste all that space for a battery so it has the same battery life? You really don't think they could do something else? Or make the macbook lighter? You really think it's worth it for a screen you're not going to see much difference on (unless maybe you tend to hover really close to your screen which is bad for your eyes)?

Not to mention, once again, graphics performance as well (especially on the 13", which already doesn't have room for a separate graphics card... room it could have if you weren't filling the space with more battery to power that "retina" display).
 
The point of resolution independence is that when a programmer writes code to draw a line x number of inches long it ends up being that long on a traditional display or a HiDPI display. The same goes for text. That is resolution independence as what you see is not tied to the pixel density of the screen. The operating system in effect uses as many pixels as needed to get the right dimensions on screen.

Resolution independence isn't about your tweaky need for more real estate. Rather, the goal is to get more of that WYSIWYG effect that Apple has been known for.



That is the whole point of resolution independence, the on screen UI features get drawn the same size no matter what your screen size. I can't understand why this is so damn difficult for people to grasp, yet we have already many posts in this thread indicating that people don't get it.
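
As a rough sketch of that idea (not anything Apple actually ships - the PPI figures below are made-up examples), the OS would convert physical sizes into whatever pixel counts the attached display needs:

    def inches_to_pixels(length_inches, ppi):
        # Convert a physical length to a pixel count for a given display density.
        return round(length_inches * ppi)

    for ppi in (110, 220):
        px = inches_to_pixels(2.0, ppi)
        print(f'A 2" line needs {px} px on a {ppi} PPI display')

    # The denser display uses more pixels, but the line comes out the same
    # physical length on both - that is the WYSIWYG goal described above.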

Ok, so if that is the case, explain how things will be the same size on a "retina" MBA screen that is 11" at 2732x1536 versus a "retina" 15" MBP at 2880x1800? They both have different DPI, but the elements are just pixel doubled.

HiDPI mode is just pixel doubling, just like on iPad and iPhone. The difference is that iPad and iPhone only have two display sizes. Macs have six display configurations among just the laptops (11", 13" Air (1440), 13" Pro (1280), 15" (1440), 15" (1680), 17" (1920)). The DPI on all of those is different, which means pixel doubling will make one element larger on one screen than it is on another, just like we have right now with non-HiDPI.

My point is that on a 1440x900 screen at 15", interface elements are HUGE. Why should that become the "standard"? On iOS, having standard element sizes is great, because they are based on touch size, and most people have similarly sized fingertips. On a Mac you use a mouse, so preferred screen resolution differs, as a mouse is always a 1px target.

True "resolution independence" would mean that the user can scale interface elements to whatever size they want. So if they have bad eyesight, they can make toolbars 1" tall. If they don't, then they don't have to. HiDPI does not give us this, it just gives us crisp text and interface elements that are the same size as they are now, still dependent on DPI.
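
For what it's worth, that size mismatch is easy to quantify. A small Python sketch, using the nominal 11" and 15" diagonals from the post above (so the densities are only approximate):

    import math

    def ppi(width_px, height_px, diagonal_inches):
        # Pixel density from resolution and (nominal) diagonal size.
        return math.hypot(width_px, height_px) / diagonal_inches

    panels = {
        '11" MBA, 2732x1536': (2732, 1536, 11),
        '15" MBP, 2880x1800': (2880, 1800, 15),
    }

    element_px = 2 * 22  # a 22-pixel control, pixel-doubled

    for name, (w, h, diag) in panels.items():
        density = ppi(w, h, diag)
        print(f'{name}: ~{density:.0f} PPI, control ~{element_px / density:.2f}" tall')

The same pixel-doubled control comes out roughly a quarter larger on the 15" panel than on the 11" one, which is exactly the complaint being made here.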
 
Great news

I've been waiting years for this. As all of this eventually goes mainstream, content will go 4K, and people will finally see movies in their true resolution. (The people who still say they can't tell the difference between HD and VHS tapes will complain of course.)
 
Apple leads and the rest follows


Yep, the 2011 13" MacBook Pro is a class-leading 1280x800, whereas competitors have had >1600x1200 displays in a 13" form factor for years.

Just sayin'...

FWIW I'm happy with my 1440x900 MBA, and don't really see the need for a higher res on a 13" form factor.
 
These resolutions are higher than necessary given that people don't view computer screens as closely as they view iPhone screens...

Yeah, anyone else notice that for the larger screens, Intel has recommended too high of a DPI?

I mean, if 300 dpi is good for viewing from 12-16", shouldn't 150 dpi be sufficient for large displays viewed from 24-30", instead of the recommended 220 dpi Intel lists?

I suspect they are confusing things by combining retina capability with the desire for "more elements on a screen". Either way, people have a minimum desired element size which is based primarily (?) on a couple factors - viewing distance and their own eyesight, and to a lesser extent on dpi and contrast. (Assuming "decent" resolution, not old style VGA.)

When the physical size of the screen is introduced as a variable, people can fit more stuff on a larger screen, but if the elements get too small or too large, they're not happy.

There should be a notion of a minimum font size or minimum element size for applications and OS elements, and items get larger from there. The complication is that some other items should scale, but perhaps not all.
 
Yeah, anyone else notice that for the larger screens, Intel has recommended too high of a DPI?
If you want to use pixel doubling, that's to an extent necessary - as interface elements otherwise can end up too large, significantly reducing productivity.
 
As much as I would love for these to be in the 2012 lineup, I don't know how Retina displays will work out for my design process.

I'm kind of hesitant to replace my 2010 MacBook at 72dpi simply because I may not be able to replicate the same viewing experience as the typical user who still has a 72dpi screen.

What happens if I design my graphics or website on the new retina display, and everything looks great? How will I know how it looks on regular screens?

Another concern is, GPUs don't seem to be technologically advanced enough to drive these resolutions. How does Apple intend on maintaining the current graphics performance with these new displays?

Even the latest and greatest from ATI/nVidia won't be able to handle a 30" Retina display.

It'll be interesting to see how these all get implemented. I hope GPUs won't become the next bottleneck and further set back the enthusiast/gaming community on the Mac platform.
 
As much as I would love for these to be in the 2012 lineup, I don't know how Retina displays will work out for my design process.

I'm kind of hesitant to replace my 2010 MacBook at 72dpi simply because I may not be able to replicate the same viewing experience as the typical user who still has a 72dpi screen.

What happens if I design my graphics or website on the new retina display, and everything looks great? How will I know how it looks on regular screens?

Another concern is, GPUs don't seem to be technologically advanced enough to drive these resolutions. How does Apple intend on maintaining the current graphics performance with these new displays?

Even the latest and greatest from ATI/nVidia won't be able to handle a 30" Retina display.

It'll be interesting to see how these all get implemented. I hope GPUs won't become the next bottleneck and further set back the enthusiast/gaming community on the Mac platform.

Wrong, at least for the AMD 7000 series, which supports 4K displays.
 
Excuse my wording, but I intended to say that they won't handle it in terms of reasonable graphics performance.

I understand that the 7000 series and the like have support for it. But how will they perform? That is a different story altogether.
 
Resolution independence doesn't work, that's all. x2 is the only way to get higher res displays - clearly more flexible on a laptop screen as the base size is flexible, e.g. could be 1680x1050 (x2) or 1440x900 (x2) but the x2 factor is the only way to make this work.

x2 is certainly not the only way, just the easy way - the backwards-compatible way.
It is plain stupid to let the resolution dictate the size and wreak havoc on screens with different PPIs.
 
Could you explain how you got to the 12 Gbps figure, please?

I think he's saying that 3840 by 2160 pixels at 24 bits per pixel at 60 frames per second equates to 11,943,936,000 bits per second. That's 11.12 gigabits per second of data transfer (using base-2 nomenclature - obviously it'd be 11.94 Gbps in base-10).
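
For anyone who wants to check the arithmetic, here it is spelled out in Python:

    # Uncompressed bandwidth for 3840x2160 at 24 bits per pixel, 60 frames/sec.
    width, height = 3840, 2160
    bits_per_pixel = 24
    frames_per_second = 60

    bits_per_second = width * height * bits_per_pixel * frames_per_second
    print(f"{bits_per_second:,} bits per second")       # 11,943,936,000
    print(f"{bits_per_second / 1e9:.2f} Gbps (base-10)")    # 11.94
    print(f"{bits_per_second / 2**30:.2f} Gbps (base-2)")   # 11.12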
 
So Retina is the output of an equation that looks like: Retina = PPI/Distance from face? The reason it's BS is because designers can't know exactly how far you're holding the object from your eyes.

No, but they can determine the typical range(s) at which the display is viewed, and calculate the required PPI based on that and typical visual acuity. Strangely enough, that's exactly how 'retina display' has been defined since the very first moment it was described by Steve Jobs on stage with the announcement of the iPhone 4.
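
That calculation is easy to reproduce. A rough Python sketch, assuming the commonly cited figure of about one arcminute of visual acuity (the viewing distances are just illustrative):

    import math

    def retina_ppi(viewing_distance_inches, arcminutes=1.0):
        # PPI at which one pixel subtends ~1 arcminute at the given distance.
        pixel_pitch = viewing_distance_inches * math.tan(math.radians(arcminutes / 60))
        return 1.0 / pixel_pitch

    for distance in (10, 12, 18, 24, 36):
        print(f'{distance}" away: ~{retina_ppi(distance):.0f} PPI')
    # Roughly 344, 286, 191, 143 and 95 PPI - the closer you sit,
    # the denser the screen has to be to look "retina".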

----------

Ok, so if that is the case, explain how things will be the same size on a "retina" MBA screen that is 11" at 2732x1536 versus a "retina" 15" MBP at 2880x1800? They both have different DPI, but the elements are just pixel doubled.

HiDPI mode is just pixel doubling, just like on iPad and iPhone. The difference is that iPad and iPhone only have two display sizes. Macs have six display configurations among just the laptops (11", 13" Air (1440), 13" Pro (1280), 15" (1440), 15" (1680), 17" (1920)). The DPI on all of those is different, which means pixel doubling will make one element larger on one screen than it is on another, just like we have right now with non-HiDPI.

My point is that on a 1440x900 screen at 15", interface elements are HUGE. Why should that become the "standard"? On iOS, having standard element sizes is great, because they are based on touch size, and most people have similarly sized fingertips. On a Mac you use a mouse, so preferred screen resolution differs, as a mouse is always a 1px target.

True "resolution independence" would mean that the user can scale interface elements to whatever size they want. So if they have bad eyesight, they can make toolbars 1" tall. If they don't, then they don't have to. HiDPI does not give us this, it just gives us crisp text and interface elements that are the same size as they are now, still dependent on DPI.

HiDPI is a mid-point compromise on the road to resolution independence.

Once you hit the point where a user can't distinguish individual pixels, it becomes easier to manage resolution independence, because you can have settings which control how big a 'display unit' is (in pixels) and have software specify sizes in 'display units' (rather than pixels). Unfortunately, software has historically been written with widget-sizes specified in pixels, so there will be a moderately painful period which will be eased by the HiDPI stage.

Now that we're seeing pixels too small to visibly distinguish, we have more size options for any given widget where it remains readable or otherwise visually clear. This means that you can configure a system such that your 'display unit' is 1, 2, 3 or more pixels in size, and the system can do the necessary math to convert your buttons/text/etc to pixel dimensions.

I'm guessing 2015-2018 is when we'll actually hit the resolution independence stage, largely due to the inertia of legacy software not being updated to take advantage of it.
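
A minimal sketch of that 'display unit' idea in Python (the widget size and scale factors are made up for illustration, not any real API):

    def units_to_pixels(units, pixels_per_unit):
        # Software asks for sizes in display units; the system scales to pixels.
        return round(units * pixels_per_unit)

    toolbar_height_units = 22  # hypothetical widget size, in display units

    for scale in (1.0, 1.5, 2.0, 3.0):
        px = units_to_pixels(toolbar_height_units, scale)
        print(f"scale {scale}: toolbar drawn {px} px tall")

    # Once individual pixels are too small to see, non-integer scales like
    # 1.5 stop looking fuzzy - which is what makes true resolution
    # independence viable, rather than just 2x doubling.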

----------

Yeah, anyone else notice that for the larger screens, Intel has recommended too high of a DPI?

I mean, if 300 dpi is good for viewing from 12-16", shouldn't 150 dpi be sufficient for large displays viewed from 24-30", instead of the recommended 220 dpi Intel lists?

I suspect they are confusing things by combining retina capability with the desire for "more elements on a screen". Either way, people have a minimum desired element size which is based primarily (?) on a couple factors - viewing distance and their own eyesight, and to a lesser extent on dpi and contrast. (Assuming "decent" resolution, not old style VGA.)

When the physical size of the screen is introduced as a variable, people can fit more stuff on a larger screen, but if the elements get too small or too large, they're not happy.

There should be a notion of a minimum font size or minimum element size for applications and OS elements, and items get larger from there. The complication is that some other items should scale, but perhaps not all.

I don't think they're over-specifying the resolution. I think they're simply acknowledging that while most people 'typically' sit within that range of their desktop displays, they also often lean in more closely. For example, at work, I generally find myself at one of 3 ranges (roughly 16", 24" or 36") depending on exactly what I'm doing (examining something, typing normally, reading/planning code). Specifying a higher-than-absolutely-necessary resolution for a desktop means that people who lean in will still get the benefit of the 'retina' resolution.

----------

As much as I would love for these to be in the 2012 lineup, I don't know how Retina displays will work out for my design process.

...

What happens if I design my graphics or website on the new retina display, and everything looks great? How will I know how it looks on regular screens?

That's easy. Just change your resolution from 'WxH HiDPI' to 'WxH' and check out your design. (Or keep a regular, non-HiDPI display around for testing until it becomes unnecessary.)


Another concern is, GPUs don't seem to be technologically advanced enough to drive these resolutions. How does Apple intend on maintaining the current graphics performance with these new displays?

Even the latest and greatest from ATI/nVidia won't be able to handle a 30" Retina display.

It'll be interesting to see how these all get implemented. I hope GPUs won't become the next bottleneck and further set back the enthusiast/gaming community on the Mac platform.

It's really only a strain for a modern GPU to push these kinds of pixels around when it also has to do 3D rendering. GPUs have been capable of pushing these sorts of resolutions for 'normal' desktop applications for about 15 years at this point. At one point, I had an 8MB Matrox video card which was capable of these resolutions, if I'd had a display that could do it back then. My first computer had a 14" monitor (12.7" visible) that could push 1280x1024 at 50Hz, and with later systems the monitors became both physically larger and capable of higher resolutions. It wasn't until the HDTV craze really hit that 1920x1080 was considered 'high resolution' for a desktop system.
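
A rough illustration of why plain 2D desktop output was never the hard part - the framebuffer is tiny by video-memory standards (Python, assuming 32 bits per pixel):

    def framebuffer_mib(width, height, bits_per_pixel=32):
        # Memory needed just to hold one full desktop frame.
        return width * height * bits_per_pixel / 8 / 2**20

    print(f"1280x1024: {framebuffer_mib(1280, 1024):.1f} MiB")  # ~5.0 - fits an old 8 MB card
    print(f"3840x2160: {framebuffer_mib(3840, 2160):.1f} MiB")  # ~31.6 - trivial for a modern GPU
    # Rendering a 3D game at those resolutions is a very different workload.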
 
Like I mentioned before, reasonable graphics performance, not just to drive and power the displays.

Imagine running the latest games at those resolutions smoothly, just wow.
 
Apple leads and the rest follows

Right... is that why my 15" MBP has a resolution of 1440x900, while a Dell XPS 15 has a resolution of 1920x1080?

Give it a break people...while I certainly appreciate the display on my iPhone 4, I'm not stupid enough to think Apple was the first to come out with products that had high resolutions / dense pixels.

And the fact that a Dell has better resolution than my MBP annoys the hell out of me. You fanboys need to give this "Apple is always the leader / best" mentality a rest.
 
I have no idea what movies you've watched, but you can see a huge amount of difference between a BD and a DVD. And by huge, I mean...like...a whole bunch.

And anyway, why would they use a lossy compression algorithm that produces picture quality worse than a DVD when they have at least 6x the amount of space to work with?

My DVD player has a 1080p upscaler that is surprisingly good. It ended up being a higher quality 1080p than Blu-ray's 1080p. It was The Dark Knight on Blu-ray.

----------

Uh? An upscaled DVD, compressed using MPEG2, which is an older and worse codec than H.264, looks better than a Blu-ray playing at native resolution?

How is that even remotely possible unless the Blu-ray came from a worse source than the DVD? The Blu-ray both has more detail (upscaling cannot add details that aren't there to begin with) and a much more efficient compression scheme...

Are you just repeating Steve's spiel? :rolleyes: Should I simply dismiss you entirely and ignore your posts, or was this just a one-time event?

Dude, nobody ever gets what I'm saying. The DVD player is a fine Feroudja with a 1080p upscaler. Sure, it's not as good as a lossless 1080p video, but it's better than Blu-ray. Blu-ray uses very lossy compression even though it is a better algorithm.

I actually compared the two on a large screen. The Blu-ray looked horrible compared to the upscaled DVD, believe it or not. And it was a new movie, The Dark Knight.
 
My DVD player has a 1080p upscaler that is surprisingly good. It ended up being a higher quality 1080p than Blu-ray's 1080p. It was The Dark Knight on Blu-ray.

Yes, for sure.


Dude, nobody ever gets what I'm saying.

So, time to give up and finish your high school homework.


The DVD player is a fine Feroudja with a 1080p upscaler. Sure, it's not as good as a lossless 1080p video, but it's better than Blu-ray.

AFAIK, there is no "lossless" 1080p available to consumers - BD is the gold standard.

(And it's "Faroudja", since your spell checker seems to be defective.)


Blu-ray uses very lossy compression even though it is a better algorithm.

Right - you want us to believe that H.264 or VC-1 at 40 Mbps is inferior to MPEG-2 at 7 Mbps?


I actually compared the two on a large screen. The Blu-ray looked horrible compared to the upscaled DVD, believe it or not. And it was a new movie, The Dark Knight.

We don't believe it. Period.

Check your settings on all components - perhaps your BD player is downsampling to 480i on its output.

And you do realize that "The Dark Knight" is a 4-year-old movie, not a "new movie"?
 
Right - you want us to believe that H.264 or VC-1 at 40 Mbps is inferior to MPEG-2 at 7 Mbps?




We don't believe it. Period.

Check your settings on all components - perhaps your BD player is downsampling to 480i on its output.

And you do realize that "The Dark Knight" is a 4-year-old movie, not a "new movie"?

At the time, the movie was new, and my cousin (a Blu-ray fan) actually suggested that we try it to see how great it looks. We checked the settings, and it was indeed outputting the full resolution of 1080p.

I will admit that the Blu-ray movie had better color than the DVD (due to the better compression algorithm), but the overall quality ended up being worse, with its very scratchy edges. This was on a very large screen, by the way.
 
iMac - the first 4K screen you'll ever own.



It would probably be cheaper for Apple to give the 27" iMac an 8K resolution (7680×4320 - 4x the resolution of the 21") and skip anything in-between.
 