So I have a 13" rMBP that works great, but am seeing a ton of threads on here about the Retina MBP's struggling to handle so many pixels, etc. My question is why is this not the same issue with a retina iPad? I mean, the iPad has a pixel doubled 2048x1536 display and does not have an Ivy Bridge chip clocked at 2.5GHz or higher. Nor does it have 8GB of RAM. Can someone explain why the iPad has no lag issues and the seemingly much more powerful rMBP's may struggle?
Because they run different OSes and are designed to handle quite different tasks. Applications on the iPad are also optimized to run on limited resources. It really doesn't make much sense to compare these two devices; they serve quite different purposes and are used differently.
Not only do the two devices run completely different OSes, but the number of pixels pushed by each machine is vastly different:

rMBP (15"): 2880 x 1800 resolution (5,184,000 pixels)
rMBP (13"): 2560 x 1600 resolution (4,096,000 pixels)
iPad (3rd/4th generation): 2048 x 1536 resolution (3,145,728 pixels)
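To put those numbers side by side, here's a quick back-of-the-envelope comparison (just a sketch in Swift; the resolutions are the only figures taken from above, the ratios are computed):

    // Rough pixel-count comparison (resolutions from the post above)
    let displays = [
        ("rMBP 15\"", 2880, 1800),
        ("rMBP 13\"", 2560, 1600),
        ("iPad 3/4",  2048, 1536),
    ]
    let iPadPixels = Double(2048 * 1536)
    for (name, w, h) in displays {
        let pixels = w * h
        // Print the raw count and how many iPad screens' worth of pixels it is
        print("\(name): \(pixels) pixels, \(Double(pixels) / iPadPixels)x the iPad")
    }

The 15" pushes roughly 1.6x the iPad's pixels and the 13" roughly 1.3x, before any scaling overhead is even considered.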
From what I understand, it's because OS X at Retina does a lot more scaling (which is CPU/GPU intensive), whereas iOS at Retina simply renders at the display's native resolution.
As someone pointed out, it's because the rMBP has to scale and then rescale the entire screen on each rendering pass. I'm sure there are some neat tricks they're using to optimize this (texture caching, I'm guessing... probably why there's more shared VRAM in the rMBP 13").
Isn't iOS built on OS X at its core? Steve Jobs said at the iPhone unveiling that "iPhone runs OS X." It was my understanding that iOS is just a touch-friendly front end with OS X as the core.
iOS was "derived" from OSX, which in reality means: they used some core functions / libraries from OSX so that they don't reinvent the wheel, but iOS is not OSX. They are similar, but in the end, implemented quite differently. iOS doesn't work the same way OSX does ( if it were, we could use at least some OSX applications directly on iPhone and iPad without having to develop an iOS version specifically for the mobile devices ). As others have pointed out, OSX constantly scales the content before rendering the final screen ( no matter which resolution you have set... be it even "best for retina" ), which of course impacts overall performance. The scaling mechanisms / algorithms will most likely improve over time, but I doubt anyone can say "how much exactly will things improve and when exactly".
First of all, it doesn't have to do that. It's enough to just redraw/rescale the dirty rects (the parts of the screen that changed). The only time you need to do this for the whole screen every frame is when you have fast full-screen animation, as with Mission Control or full-screen movie watching, and the HD 4000 has plenty of bandwidth to do that well over 100 times a second (something you will never need). Note that there is an initial lag when initialising Mission Control, because the graphics content of all applications must be collected first, but the animation itself is smooth afterwards. This already shows that full-screen scaling is not a performance problem at all. The only scenario where I see it becoming a problem is gaming. Also, please refer to my post about this very topic here:
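A rough sanity check on that bandwidth claim (my own assumptions, not figures from the thread: 4 bytes per pixel, and dual-channel DDR3-1600 shared memory peaking around 25 GB/s):

    // Back-of-the-envelope: cost of writing a full 2880x1800 frame 100 times a second
    let bytesPerFrame = 2880 * 1800 * 4                  // ~20.7 MB per full-screen pass
    let bytesPerSecond = Double(bytesPerFrame) * 100.0   // 100 full-screen passes per second
    print(bytesPerSecond / 1_000_000_000.0, "GB/s")      // ~2.1 GB/s, a small slice of ~25 GB/s

Even ignoring the reads, that's only a fraction of the memory bandwidth the HD 4000 has to work with.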
A desktop OS is designed to allow multitasking. It may contain an entire world of processes aimed at extracting as much horsepower as possible. iOS contains a small but critical part of OS X, but that doesn't necessarily mean the two share anything deeper than those roots.
Sure, but let's assume 25% of the screen is dirty (to be conservative). That's still a sizable amount of scaling/rescaling at a rather high pixel density. I agree with you, but I also think it is _a_ factor leading to a (slight) performance decrease. Also, given that each window in OS X is rendered as a texture for smooth animations, that's a rather large texture per window that needs to be cached in memory.
Oh, don't get me wrong, I would never argue that HiDPI rendering is less demanding. It's pretty clear that there is lots of additional workload involved. I'm just trying to make the point that the HD 4000 is by no means too slow to deal with that workload. By the way, 25% of even the worst-case scenario (the highest-res mode of the 15" retina) is only on the order of a couple of 1024x1024 textures. Modern games apply thousands of 512x512 textures per second, and modern GPUs like the HD 4000 eat stuff like 1024x1024 for breakfast.

As for the video memory requirement, it's also quite clear that HiDPI mode will require significantly more RAM for textures. However, it's not that much: a full-screen 2880x1800 texture is about 20MB. If we assume that the active cache consists of 20x that much data, we still need 'only' around 400MB. My WindowServer currently occupies around 800MB (retina 15", HiDPI 1680x1050 mode). And for the HD 4000, it doesn't matter whether a texture resides in VRAM or system RAM; the two are the same thing anyway.
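For anyone who wants to check the arithmetic, here's the same estimate written out (assuming 4 bytes per pixel; the 20x cache figure is just the assumption from above, not a measured value):

    // Checking the memory estimate: one full-screen texture and the assumed cache
    let fullscreenBytes = 2880 * 1800 * 4          // = 20,736,000 bytes, ≈ 19.8 MB
    let cacheBytes = fullscreenBytes * 20          // the "20x active cache" assumption
    print(Double(fullscreenBytes) / 1_048_576.0, "MB per full-screen texture")
    print(Double(cacheBytes) / 1_048_576.0, "MB for the assumed cache")   // ≈ 395 MB

Which lines up with the "around 400MB" figure above.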