Just think of what would happen if this technology came to TVs. Ultra Blu-ray?

Why bother with Blu-ray for 4K-plus resolutions? Digital files/memory sticks/SD cards would be much more convenient and would mean smaller packaging and easier storage. Blu-ray was always going to be the shortest-lived physical media 'standard'; why get bogged down by mechanically limited devices when progress is now so fast? Apple did the right thing by not putting Blu-ray drives/burners into Macs and not spending money on licensing fees that would have increased unit costs.

----------

Intel developed Thunderbolt

Intel and Apple developed Thunderbolt, with Apple footing most of the bill. It's the reason for the 12-month Mac exclusivity.
 
THIS is why I sold my 30" Apple Cinema Display. In a few years, I can get a very nice one (or a cheap, good-by-today's-standards one).

----------

Why bother with Blu-ray for 4K-plus resolutions? Digital files/memory sticks/SD cards would be much more convenient and would mean smaller packaging and easier storage. Blu-ray was always going to be the shortest-lived physical media 'standard'; why get bogged down by mechanically limited devices when progress is now so fast? Apple did the right thing by not putting Blu-ray drives/burners into Macs and not spending money on licensing fees that would have increased unit costs.

----------



Intel and Apple developed Thunderbolt, with Apple footing most of the bill. It's the reason for the 12-month Mac exclusivity.

Yeah, Blu-ray was a bad idea (and Steve was correct about it). Plus, the quality wasn't all that good because of the very lossy H.264 compression it used. It ended up looking worse on my big screen than a DVD in my good 1080p upscaling DVD player. Of course, my dad got a Blu-ray player against my advice and was soon sorry that he did. Oh, and Disney and Sony tried to cram Blu-ray down everyone's throats during the 2008-2009 recession, probably the worst time to introduce a more expensive way to watch movies. Many people don't even care about the quality.

Also, didn't Intel make Light Peak by themselves, and then Apple turned it into Thunderbolt?
 
Yes, this is true; in fact, for anything other than a desktop monitor, this may indeed be overkill.

My work PC has a 22-inch monitor set at 1920x1200, and it may be my 40-year-old eyes, but at a viewing distance of around 30 inches I can't see any pixels.

Not only is it overkill, it will be detrimental to at least two other very important things: battery life and graphics performance.

Personally, my laptop's screen is nice enough. It's not worth getting a "retina" display (honestly, mine probably already qualifies from as far away as I tend to view it; remember, the definition is just not being able to see individual pixels from a certain distance) if it means losing battery life (or getting a heavier laptop to accommodate a bigger battery) and taking up a lot of graphics performance.

Keep in mind the iPad 3 needs the bigger battery mostly for the display (you don't gain much by going Wi-Fi only, so it's not the 4G that's killing the battery ;) ) and for the much faster graphics chip (which still isn't much of a jump over the iPad 2, because most of that extra power is used up driving the Retina display).

Now put that in an even bigger screen and it will probably be an even harder hit on the battery and the graphics card.

Honestly, it's uncomfortable to lean in close enough to see the pixels on my laptop's screen when it's on a table anyway, so why do I need a "retina" display? Even if I did tend to look at it that closely, it still wouldn't be worth the trade-off, IMHO.
 
AMD and Nvidia are the two GPU vendors vying for Apple's affections for hi-res displays, not Intel.

Only AMD's 7000 series supports hi-res (4K) displays (max resolution: 4096x2160 per display), and not even Nvidia's 680 series supports above 2560x1600 on a single display.

Intel is nowhere near the ballpark of these leading vendors, and never will be.
 
Why bother with Blu-ray for 4K-plus resolutions? Digital files/memory sticks/SD cards would be much more convenient and would mean smaller packaging and easier storage. Blu-ray was always going to be the shortest-lived physical media 'standard'; why get bogged down by mechanically limited devices when progress is now so fast? Apple did the right thing by not putting Blu-ray drives/burners into Macs and not spending money on licensing fees that would have increased unit costs.

If you can think of a better way to distribute 4K-plus-resolution movies to the public for cheap, I'm all ears. Streaming a movie that large won't be an option for 99% of North America. Your only other choice would be mass-produced thumbdrives. But hell, all the people who consider Blu-ray a dead format are also proclaiming the same thing about USB. Not only that, but it'd be considerably more expensive than a single Blu-ray disc. A good 32GB thumbdrive costs around $30 these days. Guess your only choice would be a 40+ Thunderbolt drive (which is, apparently, the future for all types of everything ever), but that would be A. overkill, and B. cost way, way, wayyyyy too much to justify the expenditure for a single movie.

Guess our only choice would be Blu-ray. It can hold tons of information, is easy to mass-produce, and it's cheap.
 
The whole 'Retina' term is marketing bulljive. Sit far enough away from *any* screen and PRESTO! IT'S A RETINA DISPLAY! :rolleyes:

'Retina' should mean 300dpi: literally that many dots (or pixels) per inch at ANY distance, thereby truly being 'retina'.

It's basically false advertising.

----------



When you give your product the tag line 'RESOLUTIONARY'...

Yes.


Snobby much?

Btw, if Samsung deserves so much credit for "retina display," then explain why it was brought to market on an Apple product, and not a Galaxy phone?

There's a difference between designing, engineering, and implementing a feature vs. simply manufacturing it. Sorry, no matter how much you seem to dislike giving Apple credit, it was Apple that designed it and brought it to market.
 
Yeah, Blu-ray was a bad idea (and Steve was correct about it). Plus, the quality wasn't all that good because of the very lossy H.264 compression it used. It ended up looking worse on my big screen than a DVD in my good 1080p upscaling DVD player.

I have no idea what movies you've watched, but you can see a huge amount of difference between a BD and a DVD. And by huge, I mean...like...a whole bunch.

And anyway, why would they use a lossy compression algorithm that produces picture quality worse than a DVD when they have at least 6x the amount of space to work with?

Millah said:
Btw, if Samsung deserves so much credit for "retina display," then explain why it was brought to market on an Apple product, and not a Galaxy phone?

There's a difference between designing, engineering, and implementing a feature vs. simply manufacturing it. Sorry, no matter how much you seem to dislike giving Apple credit, it was Apple that designed it and brought it to market.

Apple does deserve credit for kicking off the high-DPI revolution. The only problem is that when Apple does something first, all these people come out of the woodwork, screaming "copycat" and abusing the word "innovation" like it's going out of style every time some other company follows suit. It gets so annoying, so very quickly.
 
We won't see that. Ultra Blu-ray is a bag of hurt. :rolleyes:

Okay... so when 4K video becomes more common, enlighten us: how do we distribute the video cheaply and in a mass-produced way? I certainly wouldn't want to download 20GB of compressed 4K video off the iTunes Store. The majority of the universe doesn't have 20Mbps to eat huge downloads in minutes. For us, it means days and days of downloading. You happy with that?
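To put rough numbers on that (the file size and connection speeds below are just illustrative assumptions, not anything Apple or the studios have announced), a quick back-of-the-envelope Swift snippet:

// Back-of-the-envelope download times for a 20GB movie file.
// File size and connection speeds are illustrative assumptions only.
func hoursToDownload(gigabytes: Double, megabitsPerSecond: Double) -> Double {
    // 1 GB ~= 8,000 megabits (decimal units); 3,600 seconds per hour
    (gigabytes * 8_000.0) / megabitsPerSecond / 3_600.0
}

print(hoursToDownload(gigabytes: 20, megabitsPerSecond: 20))  // ~2.2 hours on a fast 20Mbps line
print(hoursToDownload(gigabytes: 20, megabitsPerSecond: 3))   // ~15 hours on a typical DSL line
print(hoursToDownload(gigabytes: 20, megabitsPerSecond: 1))   // ~44 hours, i.e. days

So anything below a few Mbps turns a single 4K movie into an overnight, or multi-day, download.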
 
PLEASE lay off my retina. Seriously...
My MBP can already fry up eggs and bacon - now the new ones will be thinner and even more high-powered?
I realize that my MBP will eventually die. Until then, it will ALWAYS have more power than I already need.
What sucks is that, some day in the future, I'll need to buy a new machine. And it's going to seriously burn me.
I own an ORIGINAL iPad. For the life of me, I can't see any "pixelation". What the hell are they going on about?
"Well hey, the new MBP can boot up in one nanosecond, run 2,000 programs simultaneously on it's 5TB of dedicated internal superawesomesawesomenessitronics..."
It takes me about 15 seconds to walk to the fridge and pop open a bottle of beer. Another 10 seconds to enjoy the first gulp.
Time, I've got. Treatment for 3rd degree burns from accidentally touching my new MBP with my wrists - no thanks.
And even if they do figure out how to cool it all down - WHO CARES?

As for Blu-ray, that was actually cool - if you question it, try watching Planet Earth on your upscaling DVD player and wait for a scene with a flock of hundreds of birds. Bringing the world into my living room at that level of detail, clarity, and framerate (which, as someone else pointed out, is "retina") is AWESOME!

Anyway, all of this technological advancement to save us a few nanoseconds or pixelate our retinas... Sorry, guess I'm just not nerdy enough to give a damn.
 
I have no idea what movies you've watched, but you can see a huge amount of difference between a BD and a DVD. And by huge, I mean...like...a whole bunch.
You can see a huge difference between a Blu-ray and downloaded/streamed content... especially in action scenes, where a stream will typically go blocky (DVD and Blu-ray will not).
 
Snobby much?

Btw, if Samsung deserves so much credit for "retina display," then explain why it was brought to market on an Apple product, and not a Galaxy phone?

There's a difference between designing, engineering, and implementing a feature vs. simply manufacturing it. Sorry, no matter how much you seem to dislike giving Apple credit, it was Apple that designed it and brought it to market.

No, it's called being realistic.

The guys that actually MAKE the display don't get any credit. I would bet my left testicle that a big majority of the people who WAITED IN LINE to get the first iPads didn't and still don't know the display is made by Samsung. Yes, I'm willing to bet that even the die-hard overnight campers have no clue about the iPad's specs. But that doesn't matter according to people like you. It doesn't matter if any or all of the components inside Mac products carry a label from a different company, as long as Apple gets 100% of the credit, if only for the sake of worshipping Apple. You don't actually believe that everything packaged inside an Apple box is completely at the hands of Apple, do you? Intel, Nvidia, AMD, Samsung, Toshiba, Hitachi, LG, etc... all those guys are responsible for supplying parts to Apple. Foxconn puts it all together. You think just because Apple slaps a sticker on their product they did 'all' the work?

It's phenomenal how obsessed people have become with Apple... blindly obsessed.

Like I said, give credit where credit is due.
 
Would an all-in-one with a Retina-like display still have low power consumption? That's one of the main reasons I bought my iMac: its low power usage. Power is not cheap these days, and these displays do draw more power.

But apart from that, I'd love a Retina iMac. That'd be so, so cool. But if it comes out in 2013 or 2016 I'm fine with that, as I do have a mid-2011 iMac, and I'm not upgrading this iMac (i.e. buying a new one) till 2015 at the earliest.
 
You think just because Apple slaps a sticker on their product they did 'all' the work?

Samsung's components are great and they deserve credit for that, but Apple is not simply slapping stickers on things. Their product can exist thanks to these components, and these components exist because Apple helped create a huge market that requires them and will require even better components in the future.

Samsung deserves credit for the components, but Apple definitely deserves credit for consistently pushing devices that define new standards.
 
The point of resolution independence is that when a programmer writes code to draw a line x inches long, it ends up that long on a traditional display or a HiDPI display. The same goes for text. That is resolution independence: what you see is not tied to the pixel density of the screen. The operating system, in effect, uses as many pixels as needed to get the right dimensions on screen.

Resolution independence isn't about your tweaky need for more real estate. Rather, the goal is to get more of that WYSIWYG effect that Apple has been known for.

That is the whole point of resolution independence: on-screen UI elements get drawn at the same size no matter the pixel density of your screen. I can't understand why this is so damn difficult for people to grasp, yet we already have many posts in this thread indicating that people don't get it.
Aaah, that's embarrassing. It's wrong on so many levels! A friend of mine has a MacBook Pro and I have the exact same MacBook Pro, but the hi-res version. Everything on my screen is smaller compared to his.

That's why we need a "HiDPI" mode: higher resolutions, but not smaller icons; the same size, just rendered with more pixels. If I recall correctly, the iPhone 4 reported the iPhone 3GS's resolution when web browsing so as not to suddenly get full web pages instead of mobile pages.
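A minimal sketch of the point/pixel split behind that kind of HiDPI mode (the type and function names here are illustrative, not Apple's actual API):

// UI geometry is written in points; the system multiplies by the
// display's scale factor to decide how many backing pixels to use.
struct PointSize {
    let width: Double
    let height: Double
}

func backingPixels(for size: PointSize, scale: Int) -> (width: Int, height: Int) {
    (Int(size.width) * scale, Int(size.height) * scale)
}

// A 100x20-point button stays the same apparent size on both displays;
// on the 2x "Retina" panel it is simply drawn with four times the pixels.
let button = PointSize(width: 100, height: 20)
print(backingPixels(for: button, scale: 1))  // (width: 100, height: 20) on a standard panel
print(backingPixels(for: button, scale: 2))  // (width: 200, height: 40) on a HiDPI panel

That's the "bigger resolution but not smaller icons" idea in a nutshell: only the backing pixel count changes, not the on-screen size.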
 
I'm really looking forward to Retina Macs so that I can get the sharp text now seen on iOS.

And as an iOS developer I will be able to display a new iPad screen in portrait at full resolution. That will make work easier.
 
Yeah - keep the elements the same size, just make the image crisper, Apple!

Both are needed. We need elements to be of a readable size (possibly adjustable to suit different tastes), but high DPI would be great for images.

I have a 15" MBP with the hi-res screen, but prior to this I had a ThinkPad with 1920 x 1200 in 15.6" screen. While I miss the resolution of the ThinkPad sometimes (spreadsheets, image editing, multiple windows, etc) the elements on a 1680 x 1050 display are much better suited to my eyesight. So the elements of the current hi-res MPB display work fine, but for photography work (or games when I get a chance to play them) I would prefer a higher DPI.

One thing we all need to be aware of, however, is that if the next thing is a 'DPI war' for screens, some manufacturers will increase DPI at the expense of quality - i.e. colour matching will be way off, as will viewing angles, probably. You will see very high-resolution but very poor colour-quality photographs.
 
Given that some of us want elements to remain the same size as they are at 1440x900 on a 15" panel, and some want the likes of 1920x1200 on a 15" panel for more desktop space, I do hope Apple caters to both.

They did originally with the 17" MacBook Pro, providing a standard resolution and a high resolution option. They still do with the 15" MacBook Pro.

It will be interesting to see if 2880x1800 on the 15" is the resolution Apple keeps through the next revision in 2013 (looking at Intel's roadmap). I'd imagine they would - they've never been ones for offering ultra-high resolutions on their laptops (the 13" Pro, 1920x1200 not being available on the 15", etc.).
 
Not quite...

Given all the hype for 2012 Apple products, the rumors of a big MacBook Pro refresh and Retina Display graphics in OS X Lion, I'd be surprised if all new MacBooks don't have Retina Displays.

Can you imagine a 27" Retina Thunderbolt Display? If they multiplied the current resolution by 4 it'd be 10240 x 5760 pixels! That's 58,982,400 pixels!

. . . Okay, maybe that won't get upgraded any time soon.

As impressive as that sounds, unfortunately that's not right. Quadrupling the resolution quadruples the pixel count (the area), not each dimension, so the actual res would be 5120 x 2880 pixels; multiplying the width and height by 4 each would be 16x the pixels. Would be great though, although thinking that way, you'd be looking at a 54" screen on your desk!!!
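For anyone who wants to check the arithmetic, here's the same calculation as a throwaway Swift snippet (starting from the current 27" display's 2560x1440, the figure quoted above):

// Quadrupling the pixel count doubles each dimension;
// quadrupling each dimension gives 16x the pixels.
let base = (width: 2560, height: 1440)   // 3,686,400 pixels today

let doubled = (width: base.width * 2, height: base.height * 2)
print(doubled.width, "x", doubled.height, "=", doubled.width * doubled.height)
// prints 5120 x 2880 = 14745600, i.e. 4x the current pixel count

let quadrupled = (width: base.width * 4, height: base.height * 4)
print(quadrupled.width, "x", quadrupled.height, "=", quadrupled.width * quadrupled.height)
// prints 10240 x 5760 = 58982400, i.e. 16x the current pixel count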
 
PLEASE lay off my retina. Seriously...
My MBP can already fry up eggs and bacon - now the new ones will be thinner and even more high-powered?
I realize that my MBP will eventually die. Until then, it will ALWAYS have more power than I already need.
What sucks is that, some day in the future, I'll need to buy a new machine. And it's going to seriously burn me.
I own an ORIGINAL iPad. For the life of me, I can't see any "pixelation". What the hell are they going on about?
"Well hey, the new MBP can boot up in one nanosecond, run 2,000 programs simultaneously on it's 5TB of dedicated internal superawesomesawesomenessitronics..."
It takes me about 15 seconds to walk to the fridge and pop open a bottle of beer. Another 10 seconds to enjoy the first gulp.
Time, I've got. Treatment for 3rd degree burns from accidentally touching my new MBP with my wrists - no thanks.
And even if they do figure out how to cool it all down - WHO CARES?

As for Blu-ray, that was actually cool - if you question it, try watching Planet Earth on your upscaling DVD player and wait for a scene with a flock of hundreds of birds. Bringing the world into my living room at that level of detail, clarity, and framerate (which, as someone else pointed out, is "retina") is AWESOME!

Anyway, all of this technological advancement to save us a few nanoseconds or pixelate our retinas... Sorry, guess I'm just not nerdy enough to give a damn.

You are so lucky to be completely satisfied by yesterday's technology. When your grandkids swoop in with their hover boards and automatic shoelaces, you can tell them stories about walking to school (uphill--both ways--in the snow).
 
I don't want 1920x1200 on a 15" screen; elements are small enough at 1680x1050!

Maybe it is just me, but I like things being the size they are at 1440x900 on a 15" screen. I was happy that 2880x1800 was the rumoured resolution for the 15"!

I guess you were downvoted by younger people... The one thing I regret about my 15" MBP is choosing the hi-res 1680x1050. Things are small for my not-quite-40-yet eyes! Luckily, I'm plugged into an external monitor most of the time.

If they keep two options, like they do now, it would be best for everyone!
I would be glad to "upgrade" to 2880x1800 on my next one. :D
 
Yeah, Blu-ray was a bad idea (and Steve was correct about it). Plus, the quality wasn't all that good because of the very lossy H.264 compression it used. It ended up looking worse on my big screen than a DVD in my good 1080p upscaling DVD player.

Uh? A Blu-ray playing at native resolution looks worse than an upscaled DVD compressed using MPEG-2, an older and worse codec than H.264?

How is that even remotely possible, unless the Blu-ray came from a worse source than the DVD? The Blu-ray both has more detail (upscaling cannot add details that aren't there to begin with) and a much more efficient compression scheme...

Are you just repeating Steve's spiel? :rolleyes: Should I simply dismiss you entirely and ignore your posts, or was this just a one-time event?
 
I'm still sad they dumped resolution independence and went down the HiDPI route instead. One size does not fit all...

Resolution independence doesn't work, that's all. 2x scaling is the only way to get higher-res displays - it's more flexible on a laptop screen because the base resolution is flexible, e.g. it could be 1680x1050 (x2) or 1440x900 (x2), but the 2x factor is the only thing that makes this work.
 