We are talking about a 1920 x 1200 Retina display. That is 3840 x 2400 pixels, i.e. 9.2 million pixels. Times 200 fps at 3 bytes per pixel, that's 1.84 billion pixels per second, or 5.5 GByte per second, or 44.2 GBit per second. Good luck. "Moving pixels around" means you need to both read and write the pixels. That's 88.4 GBit per second.
2880 x 1800 x 4 (8-bit RGBA) is approx. 20 MByte.
Try to draw that at 60 fps and you are at 1200 MByte/s = 1.2 GByte/s.
Now think about available memory bandwidth.
Both VRAM and RAM, because you have to transfer new data if you scroll.
This machine is maxed out because of memory bandwidth.
No driver update, CUDA, OpenCL or anything else will help with that.
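The figures in the two posts above are easy to sanity-check. A minimal sketch, assuming 3 bytes per pixel for the 200 fps case (as that post implies) and 4-byte RGBA for the 60 fps case:

```python
# Back-of-the-envelope framebuffer bandwidth, write-only.
# Double the result for read+write ("moving pixels around").

def bandwidth_gbytes_per_sec(width, height, bytes_per_pixel, fps):
    """Bytes per second needed to refresh the full framebuffer."""
    return width * height * bytes_per_pixel * fps / 1e9

# 3840x2400 at 200 fps, 24-bit color:
print(f"{bandwidth_gbytes_per_sec(3840, 2400, 3, 200):.1f} GB/s")  # 5.5 GB/s

# 2880x1800 at 60 fps, 32-bit RGBA:
print(f"{bandwidth_gbytes_per_sec(2880, 1800, 4, 60):.2f} GB/s")   # 1.24 GB/s
```

Both numbers match the posts, which is why memory bandwidth, not shader count, is the limit being argued here.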
I presume that with iOS devices, Facebook or Apple is doing some server-side processing to reduce the bandwidth sent to devices. Why not do that for the top 10 identifiable end-user PCs and other devices as well?
If I am wrong and they are not already doing this, they should.
Anandtech:
"The GPU has an easy time with its part of the process but the CPU’s workload is borderline too much for a single core to handle. Throw a more complex website at it and things get bad quickly. Facebook combines a lot of compressed images with text - every single image is decompressed on the CPU before being handed off to the GPU. Combine that with other elements that are processed on the CPU and you get a recipe for choppy scrolling."
I also think WebKit, and maybe Safari, should be modified to let the dual- and quad-core systems we have now and in the future distribute the load. Between these two schemes, graphics responsiveness should improve, compatibility with older and underpowered devices should improve, and bigger and better things should become practical on a non-desktop unit in the future.
Rocketman
Nah, it just means screen capture is poorly coded. Now, since you have DP4, if you or someone else could show me how AirPlay Mirroring works, showing the performance on both the TV and the computer, I'd be interested in that. If performance sucks there, then I would be disappointed; if not, the problem is with the way screen capture and QuickTime X work, nothing else.
Anandtech claims otherwise. Also, people with Mac Pros and far stronger machines also experience slower UI, scrolling, etc., even though they have significantly stronger GPUs, and turning on the dGPU makes no difference either, which suggests this is primarily a CPU/software issue.
As an aside for those thinking Apple should have put in 2GB of VRAM: VRAM does not make the graphics system faster unless you're feeding in more textures than fit in the onboard VRAM for a single frame. If you can show a benchmark saying that we're using up all 1GB of VRAM and paging more textures in that didn't fit while scrolling facebook.com, then it would help. Otherwise, it'd just make the cost higher and the performance equal to now.
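For scale, a quick calculation of how many full-screen 32-bit surfaces fit in 1 GB of VRAM (ignoring driver overhead and texture compression):

```python
# How many 2880x1800 8-bit RGBA surfaces fit in 1 GB of VRAM?
# Ignores driver/OS overhead, so the real number is somewhat lower.

frame_bytes = 2880 * 1800 * 4    # one full-screen RGBA buffer, ~20.7 MB
vram = 1 * 1024**3               # 1 GB

print(vram // frame_bytes)       # 51
```

Roughly fifty full-screen surfaces fit before anything has to page, which is why scrolling one web page is unlikely to be VRAM-capacity bound.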
Somehow I don't even believe you have one. You're simply saying you have one and that it's rubbish just to make your point seem more valid. Anyone who has one has reported it as being a beautiful device to work on, yet you think it's a gimmick. Surely if you really did think it was a gimmick you wouldn't have bought one in the first place. You just sound like a hater disguised as a customer. I'm not saying it's a perfect machine, and I'm not defending its true shortcomings, but I can't believe anyone would spend so much money on a rMBP and then refer to it as a decoration for an interior. A true customer would simply state it wasn't for them, state their reasons why, and be more objective in their opinion.
I travel a lot, and while taking 12-24 hour train trips I would love to be able to spend more of that time working on projects; until then, it's back to my Kindle (black and white, 30-day battery). Taking all that into consideration, I also need power for when I'm working on larger projects, often away from where I usually live.
No kid, I buy nothing, including her, until I have a good ride and I like it enough to keep it. Try not to overuse words like "would" and "I".
Second, you'll notice on Cinebench that the rMBP scored significantly better on the CPU test compared to the MacBook Pro 2011 (6+ vs. 4.9). But the MacBook Pro 2011 scored better by a few extra frames on the GPU test compared to the rMBP. Therefore, the only conclusion that can be derived from this test is that the older MBP is handling UI events better. An understanding of OS X's hardware acceleration and knowledge of how many more pixels there are on the rMBP points to the culprit being the GPU and how it's managing all those pixels.
Oh, and, the Cinebench score for the old MBP is:
36.36 fps.
Guess what it is for the rMBP?
34.27 fps.
Both machines running ML DP4. How do you like them Apples?
On the desktop front you are looking at a GTX 670/680 for this kind of power in 3D. I know this is on the 2D/desktop-rendering side, but going from 1356 shaders down to 384, and at lower clocks, is going to have a large impact.
This is why I feel like waiting for the 2nd revision really is a good idea.
Hi everyone,
Do any of you have a 2.7 GHz / 16 GB RAM / 512 GB SSD setup? If so, how is it handling the UI and such? I get the feeling that most (not all) complaints (in some cases clear frustrations) come from the base models.
Any feedback on the heavier version(s) of the rMBP?
kind regards,
Maurice
Attacking grammar is a sure sign of defeat when arguing a point. You're lying, plain and simple. Plus, have you read other people's posts? Many of them use the word "I" often and repeatedly; this isn't an essay, it's a post on a forum, and people write this way sometimes. Your post contains grammatical errors which are quite basic; perhaps polishing up your English might help with the blemishes in an otherwise flawless piece of modern literature.
And don't call me kid; you have no idea of my age, regardless of incorrect grammar.
AirPlay Mirroring and screen capture will both suck. It's simply that encoding a video while doing whatever it is you're doing takes up a lot of CPU resources.
It's been this way on Windows, on Linux, on OS X. And it only gets worse with more pixels. Fact of life.
This tells me you have no clue how OS X utilizes the GPU for UI events. But someone above has explained some of it. Quartz is a technology Apple introduced years ago that offered hardware acceleration under OS X. It's how OS X offloads UI events to the GPU to free up the CPU so it can tend to core processing tasks in applications. Quartz was a big deal and really made the experience of OS X much better: UI events were finally much faster, more fluid, and animations became much smoother. Before Quartz, there was literally zero hardware acceleration with early versions of OS X and you could tell.
So right there your CPU being the culprit theory isn't holding weight.
You are a cute kid. It is not your grammar I am attacking. Frankly, I am not attacking anything, as that requires effort. Curbing your fanatical fanboy enthusiasm over a gadget that is nice looking but disposable and limited in use was the idea.
Comparing a desktop class GPU to a mobile one, that's your idea of intelligent conversation?
I just think it's a pity there's no MacBook Pro Retina without Retina.
The machine is lighter, much more powerful, and more stylish, all of which I value.
However, I personally prefer performance and battery life over a Retina display, at least at this early stage.
I don't know how much Retina affects battery life, and this is my personal opinion based on my needs; I know it won't match everyone's.
Yup, same here... Oh well, I suppose not everyone reads Anandtech compulsively like I do. Yeah, me too; I was reading this and I said, wait a second, I was reading this a week ago at the lalo shrimp page...
Fanboy, you say? Considering I've built my own PC systems for over 15 years, it's not accurate to call me a fanboy. In fact, this is only the second Apple product I've ever bought. Hardly a hardcore fan. I'm not defending the product's flaws. It's just that a** holes like you don't help by giving people your false opinions on a product you've never even tried. It's so obvious from your post that you simply hate the idea of this product and just wanted to troll on it, pretending to own one so your views sound like those of a genuine owner and get taken more seriously. That sounds better than being someone who hates the product based on what he's read from other people.
Hmmm. I didn't really mean to say packets, but objects: the HTML, each image, any live feeds. WebKit could run 3-4 instances of JavaScript on the basis that if you have 4 processors, you probably also have a bit more memory and storage associated with that hardware.

1) JavaScript was designed around executing all events as if they were on one thread. There is no parallelism expected, and therefore there is no easy way to make a browser's JavaScript engine make full use of multiple cores for one web page's JavaScript interpreter. (Independent pages or tabs can be run on separate cores, but this doesn't affect the Facebook scrolling benchmark because it's just one page you're working with. It's functionally equivalent to launching a separate browser per page, which is what Chrome does on OS X and Windows.)
2) Packets are simply chunks of data. Not even complete chunks of data. I'm not sure what you mean by "treat packets as threads", but chunks of data are simply a bunch of numbers that may or may not have meaning. Can you clarify?
Why buy the old clunky MacBook??? Apple made the perfect thin notebook with Retina. If some say it is slow, they lie. Anandtech is a known troll.