
PatriotInvasion

macrumors 68000
Original poster
Jul 18, 2010
1,643
1,048
Boston, MA
So I have a 13" rMBP that works great, but am seeing a ton of threads on here about the Retina MBP's struggling to handle so many pixels, etc.

My question is why is this not the same issue with a retina iPad? I mean, the iPad has a pixel doubled 2048x1536 display and does not have an Ivy Bridge chip clocked at 2.5GHz or higher. Nor does it have 8GB of RAM.

Can someone explain why the iPad has no lag issues and the seemingly much more powerful rMBP's may struggle?:confused:
 

Maczor

macrumors regular
Oct 23, 2012
148
0
LU, Switzerland
Because they run different OSes and are designed to handle quite different tasks. Applications on the iPad are also optimized to run on limited resources. It really doesn't make much sense to compare these two devices... they serve quite different purposes / are used differently.
 

dmccloud

macrumors 68030
Sep 7, 2009
2,970
1,696
Anchorage, AK
Not only do the two devices run completely different OSes, but the number of pixels pushed by each machine is vastly different:

rMBP (15"): 2880 x 1800 resolution (5184000 pixels)
rMBP (13"): 2560 x 1600 resolution (4096000 pixels)
iPad (3rd/4th Generation): 2048 x 1536 resolution (3145728 pixels)
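
For scale, here's a quick sketch of that arithmetic and how each panel compares to the retina iPad (resolutions as listed above):

```swift
import Foundation  // for String(format:)

// Pixel counts from the published panel resolutions, and each panel's
// size relative to the retina iPad.
let panels = [("rMBP 15\"", 2880, 1800),
              ("rMBP 13\"", 2560, 1600),
              ("iPad 3/4",  2048, 1536)]
let iPadPixels = 2048 * 1536

for (name, w, h) in panels {
    let pixels = w * h
    let ratio = Double(pixels) / Double(iPadPixels)
    print("\(name): \(pixels) pixels (\(String(format: "%.2f", ratio))x the iPad)")
}
// rMBP 15": 5184000 pixels (1.65x the iPad)
// rMBP 13": 4096000 pixels (1.30x the iPad)
```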
 

nontroppo

macrumors 6502
Mar 11, 2009
430
22
Because Apple doesn't spend much time or many resources on its graphics drivers for OS X.
 

evanavevan

macrumors member
Jun 24, 2010
86
1
From what I understand, it's because there's a lot more scaling (which is CPU/GPU intensive) involved in OS X at Retina than in iOS, which just renders at the display's native resolution.
 

bizack

macrumors 6502a
Apr 21, 2009
611
399
As someone pointed out, it's because the rMBP has to scale and then rescale the entire screen on each (rendering) pass. I'm sure there are some neat tricks they're doing to optimize this (texture caching, I'm guessing... probably why there's more shared VRAM in the rMBP 13").
 

PatriotInvasion

macrumors 68000
Original poster
Jul 18, 2010
1,643
1,048
Boston, MA
Because they run different OSes and are designed to handle quite different tasks. Applications on the iPad are also optimized to run on limited resources. It really doesn't make much sense to compare these two devices... they serve quite different purposes / are used differently.

Isn't iOS built on OS X at its core? Steve Jobs said at the iPhone unveiling that "iPhone runs OS X."

It was my understanding that iOS is just a touch-friendly front end with the core being OS X.
 

Maczor

macrumors regular
Oct 23, 2012
148
0
LU, Switzerland
Isn't iOS built on OS X at its core? Steve Jobs said at the iPhone unveiling that "iPhone runs OS X."

It was my understanding that iOS is just a touch-friendly front end with the core being OS X.

iOS was "derived" from OSX, which in reality means: they used some core functions / libraries from OSX so that they don't reinvent the wheel, but iOS is not OSX. They are similar, but in the end, implemented quite differently.

iOS doesn't work the same way OSX does ( if it did, we could run at least some OSX applications directly on iPhone and iPad without having to develop iOS versions specifically for the mobile devices ). As others have pointed out, OSX constantly scales the content before rendering the final screen ( no matter which resolution you have set... even "best for retina" ), which of course impacts overall performance.
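
To put numbers on that scaling step: in OS X's scaled "looks like" modes, the desktop is rendered to a backing buffer at twice the logical resolution and then downsampled to the native panel. A rough sketch for the 15" rMBP in "looks like 1680x1050" mode (the 2x-render-then-downsample pipeline is the generally understood behavior; the exact internals are Apple's):

```swift
// Scaled HiDPI mode on the 15" rMBP: render at 2x the logical resolution,
// then downsample the result to the native 2880x1800 panel.
let logical = (w: 1680, h: 1050)                     // "looks like" resolution
let backing = (w: logical.w * 2, h: logical.h * 2)   // 3360x2100 render target
let panel   = (w: 2880, h: 1800)                     // native panel

let rendered  = backing.w * backing.h                // 7,056,000 pixels drawn
let displayed = panel.w * panel.h                    // 5,184,000 pixels shown
print("Overdraw: \(Double(rendered) / Double(displayed))x")  // ~1.36x per frame
```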

The scaling mechanisms / algorithms will most likely improve over time, but I doubt anyone can say exactly how much they'll improve or when.
 

leman

macrumors Core
Oct 14, 2008
19,184
19,037
As someone pointed out, it's because the rMBP has to scale and then rescale the entire screen on each (rendering) pass. I'm sure there's some neat tricks they're doing to optimize this (texture caching I'm guessing... probably why there's more shared VRAM in the rMBP 13").

First of all, it doesn't have to do that. It's enough to just redraw/rescale the dirty rects (the parts of the screen that changed). The only time you need to do this for the whole screen every frame is when you have fast full-screen animation, as with Mission Control or full-screen movie watching. And the HD 4000 has plenty of bandwidth to do this way over 100 times a second (something you will never need). Note that there is an initial lag when initialising Mission Control, because the graphics content of all applications must be collected first, but the animation itself is smooth afterwards. This already shows that full-screen scaling is not a performance problem at all. The only scenario where I see it becoming a problem is gaming.
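
A minimal sketch of that dirty-rect idea (hypothetical types; a real compositor tracks damage per window and per frame, but the principle is the same):

```swift
import Foundation  // CGRect

// Hypothetical compositor step: instead of rescaling the full 2880x1800
// frame, only the regions invalidated since the last frame are redrawn.
struct FrameDamage {
    private(set) var dirtyRects: [CGRect] = []

    mutating func invalidate(_ rect: CGRect) {
        dirtyRects.append(rect)
    }

    func pixelsToRecomposite() -> Int {
        // Simplified: assumes the rects don't overlap.
        dirtyRects.reduce(0) { $0 + Int($1.width * $1.height) }
    }
}

var damage = FrameDamage()
damage.invalidate(CGRect(x: 100, y: 100, width: 400, height: 300))  // e.g. one updated view
print(damage.pixelsToRecomposite())  // 120,000 pixels, not 5,184,000
```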


Also, please refer to my post about this very topic here:

Drivers are actually the least significant factor here. Driver performance matters in 3D applications, where internal driver inefficiencies in implementing the complex API can severely reduce performance.

Desktop compositing uses only a small subset of the GPU's features and does not involve thousands of API calls per second the way typical 3D applications do. And Apple has its own optimisations to ensure that the desktop compositor has very fast access to video RAM (I do hope these are not broken).

The culprit is most likely a combination of inefficiencies in Apple's own HiDPI implementation + bad application code. Take the App Store as the prime example of a really sluggish app on the retina MBP. The problem there is that it handles resize actions in a very inefficient way (I have no idea what it does, but it seems to recalculate/redraw the whole view multiple times when resized). It is actually already sluggish in non-HiDPI mode - but it only becomes apparent with HiDPI, where it has to render 4x the pixels.
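
Purely illustrative of that point (the post is speculating about the App Store's internals, and so is this sketch): the cost difference between redrawing the full view on every resize event and coalescing redraws to the display refresh:

```swift
// Hypothetical numbers: a fast window drag can fire more resize events
// per second than the display can show frames.
let resizeEventsPerSecond = 120
let refreshRate = 60
let pixelsPerFullRedraw = 2880 * 1800            // full-view redraw at retina

let naiveCost = resizeEventsPerSecond * pixelsPerFullRedraw  // redraw per event
let coalescedCost = refreshRate * pixelsPerFullRedraw        // redraw per frame
print("naive: \(naiveCost) px/s vs coalesced: \(coalescedCost) px/s")
// At retina, every wasted redraw also touches 4x the pixels it would
// in non-HiDPI mode, which is why the sluggishness becomes apparent there.
```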

Basically, the hardware has undergone some incredible advancements in the last few years, so many programmers became lazy. It's insane how many resources some applications need to perform really mundane tasks.

P.S. And yes, the HD 4000 has enough horsepower/bandwidth/fillrate for retina, as I have pointed out in multiple threads already. Its benchmarked pixel fillrate is way above 1 Gpixel/sec, while even the 15" retina has 'only' 5 megapixels. This basically means the card is easily capable of more than 150 full-screen retina updates per second. In practice, only the modified display regions ever get updated, so you usually need only a fraction of that power. The only time it gets 'narrow' is when you have constant large-scale updates, as in movies (less of a problem, as we usually need less than 30 fps there) or games.
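
Spelled out, that fillrate arithmetic (taking the ~1 Gpixel/s benchmark figure from the post at face value):

```swift
// Ballpark headroom: benchmarked fillrate vs. the 15" retina panel.
let fillrate = 1_000_000_000.0        // ~1 Gpixel/s, per the post
let panelPixels = 2880.0 * 1800.0     // 5,184,000 pixels

print(fillrate / panelPixels)         // ~193 full-screen updates per second
// Dirty-rect updates need only a fraction of this; sustained full-screen
// work (movies, games) is where the headroom actually gets consumed.
```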
 

LostSoul80

macrumors 68020
Jan 25, 2009
2,136
7
A desktop OS is designed to allow multitasking. It may contain an entire world of processes aimed at using as much horsepower as possible.
iOS contains a small but critical part of OS X. That doesn't necessarily mean they share deeper roots.
 

bizack

macrumors 6502a
Apr 21, 2009
611
399
First of all, it doesn't have to do that. It's enough to just redraw/rescale the dirty rects (the parts of the screen that changed). The only time you need to do this for the whole screen every frame is when you have fast full-screen animation, as with Mission Control or full-screen movie watching. And the HD 4000 has plenty of bandwidth to do this way over 100 times a second (something you will never need). Note that there is an initial lag when initialising Mission Control, because the graphics content of all applications must be collected first, but the animation itself is smooth afterwards. This already shows that full-screen scaling is not a performance problem at all. The only scenario where I see it becoming a problem is gaming.


Also, please refer to my post about this very topic here:

Sure, but let's assume 25% of the screen is dirty (to be conservative). That's still a sizable number of scales/rescales at a rather high pixel density. I agree with you, but I also think it is _a_ factor leading to a (slight) performance decrease. Also, given that each window in OS X is rendered as a texture for smooth animations, that's a rather large texture per window that needs to be cached in memory.
 

leman

macrumors Core
Oct 14, 2008
19,184
19,037
Sure, but let's assume 25% of the screen is dirty (to be conservative). That's still a sizable number of scales/rescales at a rather high pixel density. I agree with you, but I also think it is _a_ factor leading to a (slight) performance decrease. Also, given that each window in OS X is rendered as a texture for smooth animations, that's a rather large texture per window that needs to be cached in memory.

Oh, don't get me wrong, I would never argue that HiDPI rendering is less demanding. It's pretty clear that there is lots of additional workload involved. I'm just trying to make the point that the HD 4000 is by no means too slow to deal with that workload. BTW, 25% even of the worst-case scenario (the highest-res mode of the 15" retina) fits within a 2048x2048 texture. Modern games apply thousands of 512x512 textures per second; modern GPUs like the HD 4000 eat stuff like that for breakfast.

And as to the video memory requirement, it's also really clear that HiDPI mode will require significantly more RAM for textures. However, it's not 'that much'. A fullscreen 2880x1800 texture is about 20MB. If we assume the active cache holds 20x that much data, we still need 'only' around 400MB. My WindowServer currently occupies around 800MB (15" retina, HiDPI 1680x1050 mode). And for the HD 4000, it does not matter whether a texture resides in VRAM or system RAM; the two are the same thing anyway.
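
A quick check of that memory arithmetic (assuming 4 bytes per pixel, i.e. 32-bit BGRA, the usual framebuffer format):

```swift
// Video memory for a full-screen retina texture at 4 bytes/pixel (BGRA).
let bytesPerPixel = 4
let fullScreenBytes = 2880 * 1800 * bytesPerPixel
print(Double(fullScreenBytes) / (1024.0 * 1024.0))   // ~19.8 MB

// 20x that as an active window-texture cache, per the post's assumption:
let cacheBytes = 20 * fullScreenBytes
print(Double(cacheBytes) / (1024.0 * 1024.0))        // ~395 MB
```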
 

bizack

macrumors 6502a
Apr 21, 2009
611
399
Oh, don't get me wrong, I would never argue that HiDPI rendering is less demanding. It's pretty clear that there is lots of additional workload involved. I'm just trying to make the point that the HD 4000 is by no means too slow to deal with that workload. BTW, 25% even of the worst-case scenario (the highest-res mode of the 15" retina) fits within a 2048x2048 texture. Modern games apply thousands of 512x512 textures per second; modern GPUs like the HD 4000 eat stuff like that for breakfast.

And as to the video memory requirement, it's also really clear that HiDPI mode will require significantly more RAM for textures. However, it's not 'that much'. A fullscreen 2880x1800 texture is about 20MB. If we assume the active cache holds 20x that much data, we still need 'only' around 400MB. My WindowServer currently occupies around 800MB (15" retina, HiDPI 1680x1050 mode). And for the HD 4000, it does not matter whether a texture resides in VRAM or system RAM; the two are the same thing anyway.

If this were Hacker News I'd upvote you. Best explanation so far.
 