We are talking about a 1920 x 1200 retina display. That is 3840 x 2400 physical pixels, or 9.2 million pixels. Times 200 fps, that's 1.84 billion pixels per second; at 3 bytes per pixel, that's 5.5 GByte per second, or 44.2 GBit per second. Good luck. "Moving pixels around" means you need to both read and write the pixels. That's 88.4 GBit per second.
2880x1800x4 (8-bit RGBA) is approx. 20 MByte.
Try to draw that at 60 fps and you are at 1200 MByte/s = 1.2 GByte/sec.

Now think about available memory bandwidth.
Both VRAM and RAM, because you have to transfer new data if you scroll.

This machine is maxed out because of memory bandwidth.
No driver update, CUDA, OpenCL or anything else will help with that.
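The arithmetic in the two posts above can be sketched as below. These are naive upper bounds: they assume every pixel is rewritten every frame, with no compression or partial updates, which is exactly the assumption the later replies push back on.

```python
# Back-of-envelope framebuffer bandwidth, mirroring the posts above.

def framebuffer_bandwidth(width, height, fps, bytes_per_pixel):
    """Bytes per second needed to write every pixel of every frame."""
    return width * height * fps * bytes_per_pixel

# 1920x1200 retina = 3840x2400 physical pixels, 200 fps, 3 bytes/pixel
write_bw = framebuffer_bandwidth(3840, 2400, 200, 3)
print(write_bw / 1e9)          # ~5.53 GByte/s, write-only
print(write_bw * 8 / 1e9)      # ~44.2 GBit/s
print(write_bw * 2 * 8 / 1e9)  # ~88.5 GBit/s for read + write

# 2880x1800 panel, 8-bit RGBA (4 bytes/pixel), 60 fps
print(framebuffer_bandwidth(2880, 1800, 60, 4) / 1e9)  # ~1.24 GByte/s
```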

Right idea to try the math but misses a few details about current architectures.

CPU and DRAM are connected by a fast bus. (CPU's onboard memory controller)
GPU and VRAM are connected by a fast bus. (In fact, it could be as high as 80 GByte/sec (bytes, not bits) according to here, although probably not in practice.)
CPU and GPU are connected by a much slower bus. (PCI Express)
The actual display port (DisplayPort) reads directly from VRAM independent of the GPU.

The calculations you guys posted assume that each frame has to be rendered by the CPU, pushed to DRAM, copied via PCIe DMA to VRAM, and then pushed to the screen. That used to be the case in the 1990s.

Quartz relies on a compositing engine to put together each frame. The GPU is fed small textures (pieces of pre-rendered graphics) and then told to draw them in some z-order into VRAM while applying whatever transforms (scaling) the compositing engine requests. This means that the GPU does most of the work of putting the image together, assuming whoever wrote the app understood how to separate out layers/buffers. Heck, OS X automates this for most coders these days because each NSView/UIView pretty much becomes its own texture.

It also means that when you want to scroll something, whatever can't be 3D-accelerated is rendered on the CPU and copied to the GPU; the GPU finishes the frame and saves it to VRAM; and from there, a chunk may be copied to the actual framebuffer that the DisplayPort reads from.

In short, when doing a scrolling operation, you only need to ship the textures over once; most of them are saved, and as long as something has been shipped over as a texture, it can be drawn to the screen practically for free. Even when the GPU has to do work, it can cache the data in VRAM and reuse it for the next frame of the scroll without redoing most of the work.

There are limitations to this, but in general, outside of games (which do many more transforms), most GPUs can handle compositing a scrolling operation quite easily at well over 100 fps without being anything close to a GT 650M, nor needing to continually saturate PCIe once the necessary textures have been prepped.
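The scroll-caching idea above can be sketched as a toy model. This is purely illustrative Python, not Quartz's actual API: the `Compositor` class, its methods, and the row-based "textures" are all invented here.

```python
# Toy model of the compositing flow described above: views are rendered
# and uploaded once, and a scroll just re-blits the cached texture.

class Compositor:
    def __init__(self):
        self.texture_cache = {}  # view id -> cached rendered texture
        self.uploads = 0         # counts expensive CPU -> GPU transfers

    def upload(self, view_id, rows):
        """One-time CPU render plus PCIe upload of a view's texture."""
        if view_id not in self.texture_cache:
            self.texture_cache[view_id] = rows
            self.uploads += 1

    def composite(self, view_id, scroll_offset, viewport_rows):
        """'GPU-side' work: slice the cached texture at a new offset."""
        tex = self.texture_cache[view_id]
        return tex[scroll_offset:scroll_offset + viewport_rows]

c = Compositor()
page = [f"row {i}" for i in range(100)]
c.upload("webview", page)               # expensive, happens once
frame1 = c.composite("webview", 0, 10)  # initial frame
frame2 = c.composite("webview", 5, 10)  # scrolled frame, no re-upload
print(c.uploads)  # prints 1; the scroll reused the cached texture
```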

As an aside for those thinking Apple should have put in 2GB of VRAM: VRAM does not make the graphics system faster unless you're feeding in more textures than fit in the onboard VRAM for a single frame. If you can show a benchmark demonstrating that we're using up all 1GB of VRAM and paging in more textures that didn't fit while scrolling facebook.com, then it would help. Otherwise, it'd just raise the cost and leave performance equal to now.
 
I presume that for iOS devices, Facebook or Apple is doing some server-side processing to reduce the bandwidth sent to the device. Why not do that for the top ten identifiable end-user PCs and other devices as well?

If I'm wrong and they aren't, they should be.

Anandtech:
"The GPU has an easy time with its part of the process but the CPU’s workload is borderline too much for a single core to handle. Throw a more complex website at it and things get bad quickly. Facebook combines a lot of compressed images with text - every single image is decompressed on the CPU before being handed off to the GPU. Combine that with other elements that are processed on the CPU and you get a recipe for choppy scrolling."

I also think WebKit, and maybe Safari, should be modified so that the dual- and quad-core systems we have now and in the future can distribute the load. Between these two schemes, graphics responsiveness should improve, compatibility with older and underpowered devices should improve, and bigger and better things should become practical on non-desktop units.

Rocketman
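The idea above, spreading per-resource work across cores, can be sketched as below. `decode_image` is a made-up stand-in for a real JPEG/PNG decoder, and a real engine would use native threads or helper processes rather than Python's `ThreadPoolExecutor`; this is only an illustration of fanning image decodes out to workers instead of doing them all on the core that also runs layout.

```python
# Sketch: hand each image's decompression to a worker pool instead of
# decoding everything serially on one core.
from concurrent.futures import ThreadPoolExecutor

def decode_image(compressed):
    # Pretend-decode: invert every byte; imagine this being expensive.
    return bytes(b ^ 0xFF for b in compressed)

def decode_all_serial(images):
    # Today's model per the Anandtech quote: one core does them all.
    return [decode_image(img) for img in images]

def decode_all_parallel(images, workers=4):
    # Proposed model: fan the images out across workers.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(decode_image, images))

images = [bytes([i % 256]) * 64 for i in range(8)]
assert decode_all_serial(images) == decode_all_parallel(images)
```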

Or... we could yell and scream at Facebook for having an atrocious web page that tries to refresh all sorts of stuff and keeps creating a larger and larger page to re-lay out as you scroll.
 
I just think it's a pity there's no MacBook Pro Retina without Retina.

The machine is lighter, much more powerful, and more stylish, all of which I value.

However I personally prefer performance and battery life over a retina display, at least at this early stage.

I don't know how much retina affects battery life, and this is my personal opinion based on my needs; I know it won't match everyone's.

I travel a lot, and while taking 12-24 hour train trips I would love to be able to spend more of that time working on projects; until then, it's back to my Kindle (black and white, 30-day battery). Taking all that into consideration, I also need power for when I'm working on larger projects, often away from where I usually live.
 
Nah, it just means screen capture is poorly coded. Now, since you have DP4: if you or someone else could show me how AirPlay Mirroring works, showing the performance on the TV and on the computer, that I'd be interested in. If performance sucks there, then I'd be disappointed; if not, the problem lies with the way screen capture and QuickTime X work, nothing else.

Airplay mirroring and screen capture will both suck. It's simply that encoding a video while doing whatever it is you're doing takes up a lot of CPU resources.
It's been this way for Windows, for Linux, for OSX. And it only gets worse with more pixels. Fact of life.
 
Anandtech claims otherwise. Also, people with Mac Pros and far stronger machines also experience slow UI, scrolling, etc., despite having significantly stronger GPUs; nor does turning on the dGPU make any difference. That suggests this is primarily a CPU/software issue.

This tells me you have no clue how OS X utilizes the GPU for UI events. But someone above has explained some of it. Quartz is a technology Apple introduced years ago that offered hardware acceleration under OS X. It's how OS X offloads UI events to the GPU to free up the CPU so it can tend to core processing tasks in applications. Quartz was a big deal and really made the experience of OS X much better: UI events were finally much faster, more fluid, and animations became much smoother. Before Quartz, there was literally zero hardware acceleration with early versions of OS X and you could tell.

So right there your CPU being the culprit theory isn't holding weight.

Second, you'll notice on Cinebench that the rMBP scored significantly better on the CPU test compared to the MacBook Pro 2011 (6+ vs. 4.9). But the MacBook Pro 2011 scored better by a few extra frames on the GPU test compared to the rMBP. Therefore, the only conclusion that can be derived from this test is that the older MBP is handling UI events better. An understanding of OS X's hardware acceleration and knowledge of how many more pixels there are on the rMBP points to the culprit being the GPU and how it's managing all those pixels.

----------

As an aside for those thinking Apple should have put in 2GB of VRAM: VRAM does not make the graphics system faster unless you're feeding in more textures than fit in the onboard VRAM for a single frame. If you can show a benchmark demonstrating that we're using up all 1GB of VRAM and paging in more textures that didn't fit while scrolling facebook.com, then it would help. Otherwise, it'd just raise the cost and leave performance equal to now.

And this is exactly what I thought. Basically, it's not so much about the amount of VRAM (although that's important) as about how the GPU and OS X interact with each other. Right now the high pixel count is stressing the GPU to the point where there seems to be too much data to deal with at one time.

I'm afraid a better GPU driver would make a rather trivial difference here, given how much more optimization might need to happen at various levels of the OS.
 
Somehow I don't even believe you have one. You're simply saying you have one and that it's rubbish just to make your point seem more valid. Anyone who has one has reported it as being a beautiful device to work on, yet you think it's a gimmick. Surely if you really did think it was a gimmick you wouldn't have bought one in the first place. You just sound like a hater disguised as a customer. I'm not saying it's a perfect machine, and I'm not defending its true shortcomings, but I can't believe anyone would spend so much money on a rMBP and then refer to it as a decoration for an interior. A true customer would simply state that it wasn't for them, state their reasons why, and be more objective in their opinion.

No kid, I buy nothing including her until I have a good ride and I like it enough to keep it. Try not to overuse words like "would" and "I" too much.
 
What I do not understand in this context (which means I do not disagree so far) is how the retina iPad (iPad 3) handles all this workload with a much weaker GPU/CPU.
:confused:

Could there be something OS X could learn from its little brother?
cheers
/Karl
 
No kid, I buy nothing including her until I have a good ride and I like it enough to keep it. Try not to overuse words like "would" and "I" too much.


Attacking grammar is a sure sign of defeat when arguing a point. You're lying, plain and simple. Plus, have you read other people's posts? Many of them use the word "I" often and repeatedly. This isn't an essay, it's a post on a forum; people write this way sometimes. Your post contains grammatical errors which are quite basic; perhaps polishing up your English might help with the blemishes in an otherwise flawless piece of modern literature.

And don't call me kid; you have no idea of my age, regardless of incorrect grammar.
 
Second, you'll notice on Cinebench that the rMBP scored significantly better on the CPU test compared to the MacBook Pro 2011 (6+ vs. 4.9). But the MacBook Pro 2011 scored better by a few extra frames on the GPU test compared to the rMBP. Therefore, the only conclusion that can be derived from this test is that the older MBP is handling UI events better. An understanding of OS X's hardware acceleration and knowledge of how many more pixels there are on the rMBP points to the culprit being the GPU and how it's managing all those pixels.

Actually, Cinebench is known to perform better on slower GPUs; this has happened many times before, and it basically made Cinebench's OpenGL test obsolete. So it's a Cinebench issue. The rMBP's gaming performance at full resolution is better than the 2011 MBP's, so the GPU is more than capable of handling the full resolution at better framerates. Hence this is entirely about the CPU. If it were about the GPU, I would be seeing better framerates scrolling Facebook on my Mac Pro with a 5870, which is 3.5 times faster than the GT650M and is pushing even fewer pixels on my 30".
 
Oh, and, the Cinebench score for the old MBP is:

36.36 fps.

Guess what it is for the rMBP?

34.27 fps.

Both machines running ML DP4. How do you like them Apples?


Cinebench score for my Mac Pro with 5870 is (GUESS)

33.52, slower than both MBPs, yet the 5870 is 3.5 times more powerful than the GT650M. Holy Moses, how can this be?

Maybe it's because Cinebench is nonsense and cannot really show the power of the GPU, which can only be demonstrated by actual gaming results. If you compare the 5870 to the GT650M in games, the 5870 is much, much faster, as expected.

----------

On the desktop front you are looking at a GTX 670/680 for this kind of power in 3D. I know this is on the 2D/desktop rendering side, but going from 1356 shaders down to 384, and at lower clocks, is going to have a large impact.

If what you mean is Facebook scrolling, it won't. Facebook scrolling is crap on many Macs, including mine, which has a much more powerful GPU than the rMBP or any mobile GPU out there right now.

So the results say that if this thing is GPU-related, and GPU-related ONLY, then even after 4 more generations the rMBP may still not have smooth Facebook scrolling, seeing that a 5870 can't deliver it even on a much lower-resolution display. That is because even after 4 generations, the GPU in a MacBook won't be as fast as a 5870. This may change if there are some breakthroughs down the road, but the trend has been that each new MBP's GPU is 50-60% faster than the previous generation's.

But I bet this isn't really a GPU issue, and with some better coding on Apple's side it'll be fixed sooner or later.
 
I hope they fixed the bugs with display card switching

This is why I feel like waiting for the 2nd revision really is a good idea.

If they still have 2 cards in these machines, I hope they fixed the issue that is plaguing so many machines running OS X all over the world, causing them to freeze while switching displays. I have to say that my love for my MBP has been drained significantly by the number of freezes I get if I don't turn that switching off.

I thought this only affected mid-2010 machines, but a quick search on the internet shows that almost every model is affected, and that's really sad.
 
I am curious...

My rMBP 2.3/16/256 will arrive in 3 weeks; I think it will already ship with ML.

I was curious about the graphics performance of my current machine (5.5-year-old MBP, 2.13 C2D, Radeon X1600, 4GB RAM, Snow Leopard), so I tested fps rates on several sites using Quartz Debug.

Scrolling on most websites gives me 30-40 fps in Safari (the current version for SL), including websites with many big graphics like engadget.com, macrumors.com, etc. Facebook timeline pages result in 20-25 fps.

Using Firefox (current version) I get 50-60 fps on most websites, including Facebook timeline pages.

Not bad for the good old machine :)

I know the rMBP is a VERY big update over my current machine in many ways. But if the graphics performance is going to feel uncomfortable compared to my 5.5-year-old machine, I will return it and wait another year for better graphics chips.

The only thing is that I would like to get ML (I didn't like Lion much), but my Radeon X1600 is not supported anymore.
 
Hi everyone,

Do any of you have a 2.7GHz / 16GB RAM / 512 SSD setup? And if so, how is it handling the UI and such? I get the feeling that most (not all) complaints (in some cases clear frustrations :p) come from the base models....

Any feedback on the heavier version(s) of the rMBP?

kind regards,
Maurice
 
Hi everyone,

Do any of you have a 2.7GHz / 16GB RAM / 512 SSD setup? And if so, how is it handling the UI and such? I get the feeling that most (not all) complaints (in some cases clear frustrations :p) come from the base models....

Any feedback on the heavier version(s) of the rMBP?

kind regards,
Maurice

The base models aren't somehow drastically less powerful. You're getting a clock bump, and the extra RAM and storage are irrelevant to normal use, which is what the complaints are about. The GPU is exactly the same. I would hardly call it a "heavier" rMBP; the base is already "heavy".
 
Attacking grammar is a sure sign of defeat when arguing a point. You're lying, plain and simple. Plus, have you read other people's posts? Many of them use the word "I" often and repeatedly. This isn't an essay, it's a post on a forum; people write this way sometimes. Your post contains grammatical errors which are quite basic; perhaps polishing up your English might help with the blemishes in an otherwise flawless piece of modern literature.

And don't call me kid; you have no idea of my age, regardless of incorrect grammar.

You are a cute kid. It is not your grammar I am attacking. Frankly, I am not attacking anything, as that requires effort. The idea was to curb your fanatical fanboy enthusiasm over a nice-looking but disposable and limited-use gadget.
 
Airplay mirroring and screen capture will both suck. It's simply that encoding a video while doing whatever it is you're doing takes up a lot of CPU resources.
It's been this way for Windows, for Linux, for OSX. And it only gets worse with more pixels. Fact of life.

Doesn't intel quicksync help mitigate this?

----------

This tells me you have no clue how OS X utilizes the GPU for UI events. But someone above has explained some of it. Quartz is a technology Apple introduced years ago that offered hardware acceleration under OS X. It's how OS X offloads UI events to the GPU to free up the CPU so it can tend to core processing tasks in applications. Quartz was a big deal and really made the experience of OS X much better: UI events were finally much faster, more fluid, and animations became much smoother. Before Quartz, there was literally zero hardware acceleration with early versions of OS X and you could tell.

So right there your CPU being the culprit theory isn't holding weight.

So how do you explain what iBug2 has explained to us multiple times already? It's convenient you always ignore the facts that undermine your theories.
 
You are a cute kid. It is not your grammar I am attacking. Frankly, I am not attacking anything, as that requires effort. The idea was to curb your fanatical fanboy enthusiasm over a nice-looking but disposable and limited-use gadget.

Fanboy, you say? Considering I've built my own PC systems for over 15 years, it's not accurate to call me a fanboy. In fact, this is only the second Apple product I've ever bought; hardly a hardcore fan. I'm not defending the product's flaws. It's just that a** holes like you don't help by giving people your false opinions on a product you've never even tried. It's obvious from your post that you simply hate the idea of this product and just wanted to troll it, pretending to own one to cement your views, since somebody who is a genuine owner is taken more seriously. That sounds better than being a person who hates the product based on what he's read from other people.
 
Comparing a desktop class GPU to a mobile one, that's your idea of intelligent conversation?

Heard of sarcasm? ;)

Though if I were gaming at the native res of the Retina display I would need a GTX 690. Get it?
 
I just think it's a pity there's no MacBook Pro Retina without Retina.

The machine is lighter, much more powerful, and more stylish, all of which I value.

However I personally prefer performance and battery life over a retina display, at least at this early stage.

I don't know how much retina affects battery life, and this is my personal opinion based on my needs; I know it won't match everyone's.

I travel a lot, and while taking 12-24 hour train trips I would love to be able to spend more of that time working on projects; until then, it's back to my Kindle (black and white, 30-day battery). Taking all that into consideration, I also need power for when I'm working on larger projects, often away from where I usually live.

Don't run the screen at the higher resolution and, voila, an MBP Retina without Retina (more or less).
 
Yeah, me too. I was reading this and said, wait a second, I was reading this a week ago at the lalo shrimp page....
Yup, same here... Oh well, I suppose not everyone reads Anandtech compulsively like I do. :)
 
Fanboy, you say? Considering I've built my own PC systems for over 15 years, it's not accurate to call me a fanboy. In fact, this is only the second Apple product I've ever bought; hardly a hardcore fan. I'm not defending the product's flaws. It's just that a** holes like you don't help by giving people your false opinions on a product you've never even tried. It's obvious from your post that you simply hate the idea of this product and just wanted to troll it, pretending to own one to cement your views, since somebody who is a genuine owner is taken more seriously. That sounds better than being a person who hates the product based on what he's read from other people.

LOL. No, I never built a PC, nor any other computer, from parts. I suppose I was a bit busy exploring the world over the last few decades to dedicate my time to tinkering in solitude with those exciting appliances. You see, my good man, just as I don't believe in sex after marriage, the sheer act of animal-like copulation never carried any contract indicating permanent acquisition. Then, yes, I lied, and the only excuse for the misunderstanding was her gender, where you both share a very illogical but amusing perception of... usage.
 
1) JavaScript was designed around executing all events as if on one thread; no parallelism is expected. Therefore there is no easy way for a browser's JavaScript engine to make full use of multiple cores for one web page's scripts. (Independent pages or tabs can run on separate cores, but this doesn't affect the Facebook scrolling benchmark, because it's just one page you're working with. It's functionally equivalent to launching a separate browser per page, which is what Chrome does on OS X and Windows.)

2) Packets are simply chunks of data, and not even complete chunks at that. I'm not sure what you mean by "treat packets as threads"; chunks of data are simply a bunch of numbers that may or may not have meaning. Can you clarify?
Hmmm. I didn't really mean to say packets, but objects: the HTML, each image, any live feeds. If WebKit were to run 3-4 instances of the JavaScript engine, on the basis that if you have 4 processors you probably also have a bit more memory and storage to go with that hardware, it could spread that work out.

FB code is far from optimized for resources, and yes, it is annoying that it keeps feeding down a single page of "infinite scroll". Recently I was on a FB page and wanted to click on one of the items at the very bottom of the page. It was hard to catch! That is a UI error.

When I am on finance.google.com watching a stock live, it often hangs and fails to refresh after I work with a variety of other browser tabs or windows. Process sharing and threading in browsers to date is weak.

I do think having the browser start more than one instance of the JavaScript engine (or, gag, Flash) could speed things up quite a bit.

Rocketman
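The single-threaded constraint described in point 1 above can be sketched as below. The `run_event_loop` function and the page dict are invented for illustration and are not any real browser API; the point is only that all handlers for one page run strictly one after another.

```python
# One page's interpreter: handlers never overlap or reorder, which is
# why one page's script can't trivially use multiple cores.

def run_event_loop(page_state, events):
    """Run each event handler to completion, one at a time."""
    for handler in events:
        handler(page_state)
    return page_state

# Five scroll events, each moving the page down 10 px, in order.
page = {"scroll_y": 0}
events = [lambda s: s.update(scroll_y=s["scroll_y"] + 10) for _ in range(5)]
run_event_loop(page, events)
print(page["scroll_y"])  # prints 50
```

Separate pages, by contrast, share no state, so a browser can give each its own interpreter (or whole process, as Chrome does) without breaking this ordering guarantee.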
 