Is this seriously a concern...?

It's already arguable whether 4K resolution is even necessary on full-size 50+ inch television sets, given the difference in clarity at normal viewing distances and, hell, even the ability to tell them apart from 1080p panels. I have no doubt 4K looks better, but from a normal 9-foot viewing distance in an average family room, with typical TV sizes, people can barely tell the difference between 720p and 1080p panels, let alone 4K.
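As a back-of-envelope sketch of that claim: a 20/20 eye resolves roughly 1 arcminute, so you can compute the angular size of a single pixel at the 9-foot distance mentioned above (the 50" size and the 1-arcminute acuity figure are illustrative assumptions, not from the post):

```python
import math

def pixel_arcminutes(diag_in, res_w, res_h, distance_in):
    """Angular size of one pixel in arcminutes for a 16:9 screen."""
    # Screen width from the diagonal of a 16:9 panel.
    width_in = diag_in * 16 / math.sqrt(16**2 + 9**2)
    pixel_in = width_in / res_w  # physical pixel pitch
    # Small-angle approximation: angle (rad) ~= size / distance
    return (pixel_in / distance_in) * (180 / math.pi) * 60

dist = 9 * 12  # 9 feet in inches
for name, w, h in [("720p", 1280, 720), ("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    print(f"{name}: {pixel_arcminutes(50, w, h, dist):.2f} arcmin per pixel")
```

On these assumptions a 50" 1080p panel at 9 feet already sits near 0.7 arcminutes per pixel, under the ~1 arcminute acuity limit, which is consistent with the point that 4K adds little at that distance.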

The funnier part still is that we can't even fully utilize our 1080p sets, considering most, if not all, ISPs and cable providers end up streaming much lower-quality signals to your TV, while most, if not all, set-top boxes can't even push what would be needed to fully utilize the 1080p TVs we have now. The only way you're going to get that clear a signal at home is with physical Blu-ray media, as no streaming or on-demand video service can provide that level of quality, even as we enter 2013... and we're already thinking of 4K panels for TVs? Call me what you will, but that's putting the cart before the horse if I've ever heard of it.

And 4K on a laptop, besides being imperceptible at a normal viewing distance compared to the current Retina resolution, would have the same issues. Any media you'd want to view couldn't be streamed at a quality that takes advantage of such a resolution. Plus, digital media isn't yet distributed, and probably never will be judging from how 1080p content has been handled, at any quality that would make the difference relevant.

TL;DR:

The current panel of the rMBP should actually be the least of your concerns for 4K resolutions. If you want to see the real problem, focus on the ISPs, who currently offer no way to get content at that resolution efficiently (they can't even get full-quality 1080p streamed to TVs in 2013; what makes you think they'll be able to push 4x the data in ten years' time?), and the media providers, who offer no good way to get media at that resolution other than physically. Add the sheer fact that 4K vs. the rMBP's current Retina resolution of 2880x1800 could most likely never be differentiated at a 15" form factor, and the current computer's monitor resolution should be the least of your worries.

Plus, by the time any of this 4K stuff becomes reality, the current rMBP's will be way past their usable lifetimes. I'm estimating mainstream consumer adoption of the tech in 5 to 8 years time... at a minimum...
 
Though, at what kind of frame rate?

Presumably, 24 fps or whatever the standard frame rate is for 4k TV.

A 386 could decode multiple 4k streams if given infinite time to run at whatever crappy frame rate.

You realise that using Quick Sync, the Sandy Bridge GPU can decode high-def video using about 1-2% CPU, yes? 4K resolution is only a factor of 4x that, so we're talking somewhere between 5% and 10% CPU utilisation per stream.
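The scaling arithmetic behind that estimate can be sketched directly (the linear-scaling assumption is mine; real decode cost depends on codec and hardware support):

```python
# Back-of-envelope: 4K (3840x2160) has exactly 4x the pixels of 1080p,
# so if hardware-assisted decode of 1080p costs ~1-2% CPU, a linear
# scaling estimate puts one 4K stream in the same ballpark as above.
px_1080p = 1920 * 1080
px_4k = 3840 * 2160
ratio = px_4k / px_1080p
low, high = 1 * ratio, 2 * ratio
print(f"pixel ratio: {ratio}")                       # 4.0
print(f"estimated CPU per 4K stream: {low:.0f}%-{high:.0f}%")
```

Strictly linear scaling of 1-2% gives 4-8%, in line with the rough 5-10% figure quoted.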
 
The pre-retina 15" MBPs have never been able to display 1080p content at full resolution.

Why is this situation any different?

I'll give you a hint: It's not.

P.S. Bring back the 17". A 17" rMBP would play 4K video fine.
 
With 4K TVs and displays coming that can show content at a higher resolution than the rMBP, the Retina screen is going to be outdated soon. The resolution of the rMBP is too low for 4K content.

Perhaps in a future rMBP, the resolution / PPI could be bumped to 3840 x 2160 so that 4K content can be enjoyed to the fullest.

4K only makes a difference on 17" screens and above. There is little to no benefit to 4K on a laptop other than better support for upsampling without artifacts.
 
no.

since all your TVs and monitors are outdated, you can give them to me.

oh, give me your car as well, it's not a 2013 model, so what would you want to use it for now?

House too. Don't see any voice-controlled kitchen or a spa with a touch screen? Your house is useless!

What is the best way to get rid of my rMBP? I can't sell it now... nobody wants it.
 
With 4K TVs and displays coming that can show content at a higher resolution than the rMBP, the Retina screen is going to be outdated soon. The resolution of the rMBP is too low for 4K content.

Perhaps in a future rMBP, the resolution / PPI could be bumped to 3840 x 2160 so that 4K content can be enjoyed to the fullest.

Soon already? Wow, pretty specific time frame you have there. All tech is outdated before you can even buy it; good luck if that's news to you or bothers you.
 
Presumably, 24 fps or whatever the standard frame rate is for 4k TV.

A 386 could decode multiple 4k streams if given infinite time to run at whatever crappy frame rate.

You realise that using Quick Sync, the Sandy Bridge GPU can decode high-def video using about 1-2% CPU, yes? 4K resolution is only a factor of 4x that, so we're talking somewhere between 5% and 10% CPU utilisation per stream.
I see. Though I don't think 24 fps is up to snuff for a computer display.

No, I don't know anything about Quick Sync. It's something I'd have to look into. And I don't think the math or the scaling of the algorithms would work out that way.
 
4K is going to be incredibly cost prohibitive for the next few years. I'm not expecting to see any sort of mass adoption of it until at least 2015 or 2016.
 
4K is going to be incredibly cost prohibitive for the next few years. I'm not expecting to see any sort of mass adoption of it until at least 2015 or 2016.

Agreed. The limiting factors are going to be screens and network bandwidth, not processing... video processing is one of those (relatively) easily solved "embarrassingly parallel" problems in terms of compute power required.
 
With 4K TVs and displays coming that can show content at a higher resolution than the rMBP, the Retina screen is going to be outdated soon. The resolution of the rMBP is too low for 4K content.

Perhaps in a future rMBP, the resolution / PPI could be bumped to 3840 x 2160 so that 4K content can be enjoyed to the fullest.

The Retina moniker Apple uses means you cannot distinguish individual pixels at the intended viewing distance. Therefore any screen that qualifies as Retina is already the highest resolution you will ever need. Forever.
(Unless someone changes the viewing distance used to calculate Retina.)

The reason for 4K is that large TVs and projectors are still not Retina. In fact, the end goal for very large TVs and projectors is 8K.
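That "cannot distinguish pixels" threshold can be sketched numerically: assuming roughly 1 arcminute of visual acuity (an approximation of 20/20 vision, my assumption rather than Apple's published formula), the required PPI falls quickly as viewing distance grows:

```python
import math

def retina_ppi(distance_in):
    """PPI at which one pixel subtends 1 arcminute at the given distance."""
    one_arcmin_rad = (1 / 60) * math.pi / 180
    pixel_in = distance_in * one_arcmin_rad  # small-angle approximation
    return 1 / pixel_in

print(f'laptop at 18 in: {retina_ppi(18):.0f} PPI needed')
print(f'TV at 9 ft:      {retina_ppi(108):.0f} PPI needed')
```

At a typical 18" laptop distance this gives roughly 190 PPI (the 15" rMBP's ~220 PPI clears it), while at 9 feet a TV needs only ~32 PPI, which is why large screens are the ones that still benefit from more pixels.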
 
No way! How big were those 4K TVs? 70 inches? Having this insane resolution on a laptop is still great. 4K laptops will not be around for a WHILE. And I mean, a WHILE.
 
Managed to find a 2560x1440 trailer last night and played it on my rMBP. It looks absolutely stunning and is certainly not what I'd call outdated.
 
The first 4k movie I've seen was 160GB.
So unless they shrink these videos tremendously, acquiring them will be a chore.

Optical Disc? No sir, not in 2013.

Downloading? What's your ISP's monthly data cap? Mine is 500GB, which is 3 movies, and that's on 100 Mb/s service; downloading over normal 6-15 Mbps DSL would take quite a while.
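The "quite a while" can be made concrete. A rough transfer-time sketch for the 160 GB figure above, ignoring protocol overhead (the link speeds are the ones mentioned in the thread):

```python
# time = size_in_bits / link_rate, using decimal GB (1 GB = 8e9 bits)
size_gb = 160
size_bits = size_gb * 8e9

for name, mbps in [("100 Mb/s cable", 100), ("15 Mb/s DSL", 15), ("6 Mb/s DSL", 6)]:
    hours = size_bits / (mbps * 1e6) / 3600
    print(f"{name}: {hours:.1f} hours")
```

Even the 100 Mb/s connection needs about 3.6 hours per movie at best; the slower DSL tiers land in the one-to-two-day range.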

200GB flash drives for the movie? Even most 128GB flash drives I've seen were right at a hundred bucks, on top of the movie cost.

Then again, I'm sure someone who actually owns a true 4K projector at the moment couldn't care less about a $150 movie. I'm just saying that someone cares way, way, way more about pixels than I will for quite some time.

Monthly cap? Do you have that on land-based broadband (i.e. non-mobile)? In Sweden we have no such thing as a cap; you can download/upload as much as you want.
 
Monthly cap? Do you have that on land-based broadband (i.e. non-mobile)? In Sweden we have no such thing as a cap; you can download/upload as much as you want.

It depends on the company. We have no monthly caps on Verizon FiOS here in New York.
 
4k 27" monitors will come eventually, but not for a while. You would need a pretty impressive CPU and video card to run that kind of setup at reasonable framerates, and I don't think the tech is there yet. Maybe in 2-3 years.
 
Monthly cap? Do you have that on land-based broadband (i.e. non-mobile)? In Sweden we have no such thing as a cap; you can download/upload as much as you want.

The caps (at least the ones I've seen in TX, NC, and SC, where I've lived) aren't advertised but are usually listed/hidden somewhere on the ISP's website.

Also the caps are usually higher than 95% of the people will ever reach.

This is the info Charter has listed on their website which is kind of hidden within their support section.
Thresholds are commonly used in the industry, and Charter has established in the Company's Acceptable Use Policy "AUP" residential service thresholds at 100 gigabytes (GB) of bandwidth per month for customers subscribing to Lite and Express services, 250 GB of bandwidth per month for customers subscribing to Plus and Max (Grandfathered) service and 500 GB of bandwidth per month for Ultra100 service.

Charter for cable internet(25-100Mbps) and ATT u-Verse for DSL(3-24Mbps) are basically the only providers in my area for internet. There are a few spots that can receive FiOS, which IMO is ridiculous but that's just me.
I've had the Charter 100 Mbps service since it became available in my area almost two years ago and have never hit the 500GB cap, or at least I've never received an email or been contacted by Charter about it.
 