Modern graphics cards don't saturate a full 16x PCIe bus. Sony already ships a laptop that does this. People have been doing it for a while over ExpressCard, a much slower connection, with great results.

ExpressCard video solutions, as well as Sony's, aren't very powerful: nothing like the cards that are currently embedded.
 
Here's what I got from reading this thread. People want:

Thinner MBP
Retina Display
Dedicated GPU

It's unlikely we'll get the first two from Apple without the third. But it's also unlikely they'll be able to fit a dedicated GPU capable of driving a Retina display into a thinner package, due to thermal constraints (I could be wrong). There has to be a compromise somewhere. Apple could also release a thinner model with integrated graphics only. As much as I'd like to see all three of these in the next update (thinner matters least to me), if Apple plans on going integrated-only, which I doubt they would ever do for the 15" and 17" models, then we're going to have some pretty serious performance drawbacks that would make many people on this forum very unhappy.
 
In fact, if you really believe it's done like that, you first need to look at 3D games from the 90s running on 8 MB Voodoo 2 cards or less. You'd be surprised how much was being done on those GPUs... Next you need to read up on what compositing actually is, to see how much more it is than just a bunch of textures on quads.

Finally, my 9400M again showed no sloppy animations or lag pushing as many pixels as the Retina iPad does, and the iPad's GPU is quite a bit weaker than a 9400M anyhow.

OK then, open 30 Safari windows, trigger Exposé on a 30-inch Cinema Display, and tell me that runs at a perfectly smooth 60 fps.

Next you need to read up on what compositing actually is, to see how much more it is than just a bunch of textures on quads.
So I guess those engineers at Apple lied to me when they said "it's just a bunch of OpenGL textures"?! OMG, Apple gypped me; they should refund my WWDC ticket!!

I don't like citing Wikipedia, but: http://en.wikipedia.org/wiki/Quartz_Compositor

"Quartz Compositor runs using the graphics processor (GPU) by encapsulating each rendered backing store in an OpenGL texture map or surface. The GPU then composes the surfaces and maps to provide the final image, which is delivered to its frame buffer. Quartz Extreme only uses OpenGL commands..."
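
For anyone curious what that Wikipedia description looks like at the OpenGL level, here's a minimal sketch of the idea: a window's backing store uploaded as a texture and composited to screen as a single textured quad. This is illustrative only, not Apple's code; it assumes legacy fixed-function GL and freeglut.

```c
/* Sketch of "backing store as an OpenGL texture on a quad".
 * Not Apple's code; assumes legacy fixed-function GL + freeglut.
 * Build: cc composite.c -lGL -lglut */
#include <GL/glut.h>

#define W 256
#define H 256

static GLuint tex;
static unsigned char backing[H][W][4]; /* stand-in for a window's backing store */

static void fill_backing(void) {
    /* Pretend the app drew its window contents here (a simple gradient). */
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            backing[y][x][0] = (unsigned char)x;   /* R */
            backing[y][x][1] = (unsigned char)y;   /* G */
            backing[y][x][2] = 128;                /* B */
            backing[y][x][3] = 255;                /* A */
        }
}

static void display(void) {
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_QUADS);                 /* one "window" = one textured quad */
    glTexCoord2f(0, 0); glVertex2f(-0.8f, -0.8f);
    glTexCoord2f(1, 0); glVertex2f( 0.8f, -0.8f);
    glTexCoord2f(1, 1); glVertex2f( 0.8f,  0.8f);
    glTexCoord2f(0, 1); glVertex2f(-0.8f,  0.8f);
    glEnd();
    glutSwapBuffers();                 /* flip the composited frame to screen */
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(512, 512);
    glutCreateWindow("compositor sketch");
    fill_backing();
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, W, H, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, backing);
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```

The counterpoint in this thread still stands, of course: a real window server does far more than this (damage tracking, clipping, shadows, and so on), which is what makes "just a bunch of textures" an oversimplification.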
 
My main Hackintosh machine is an Atom 330 CPU with an Nvidia ION (9400M) GPU.

Everything is silky smooth, and YouTube 1080p full HD plays smoothly with low CPU usage, thanks to Nvidia.

Animations are smooth too. Why wouldn't they be? Animations in OS X are smooth even on PowerPC machines, and their GPUs are really slow.

And why in the hell would someone open 30 Safari windows at once? Some kind of test? That's no test; that's just YouTube show-off stuff.

----------

Not so sure it is. For 90% of the usage a laptop gets (presumably on battery), discrete GPUs kind of suck. They don't play games anywhere near as well as a desktop, and they kill the battery, badly.

Why would anyone play games on a laptop?
Why not throw it in the fire? It will burn a little faster.
 
ExpressCard video solutions, as well as Sony's, aren't very powerful: nothing like the cards that are currently embedded.

What do you call not powerful? I have only used the ViDock, with an ATI 5770. I was able to play desktop games like Crysis, Call of Duty, Devil May Cry, and Dirt. My benchmarks came in almost the same as with the card installed in my desktop.

This is with no official support or native drivers. Granted, a desktop ATI 5770 isn't going to win any benchmark wars, but it does better than most mobile GPUs.

I am waiting for the Thunderbolt edition, which will be much faster. I know Sony's isn't very fast, but that's because they used a mobile GPU. Mobile GPUs are mostly weak and have horrible driver support. Desktop break-out boxes are the future.
 

Most 13" macbook pros are only used for email and updating facebook. No biggie really. If you really are a pro you wont have a 13" macbook.

Cool... I'll make sure I consult with you next time I buy a computer since you seem to know exactly what everyone needs.
 
Everything but the 15" and 17" MBPs will get the integrated Intel GPUs, which will be better than the last Nvidia 320M chips from 2010.

I wish this were true...

[Not saying it isn't, as I don't know... but if a little MBA could handle what my 2010 MBP can, I'd pick one up real quick. Here's to hoping :)]
 
I am sorry, but that article is just stupid. Apple has ordered CPUs with 16 shader cores because all fast IB mobile CPUs have 16 shader cores. And anyway, I have never heard of an IB CPU with 6 shader cores! It's 8 or 16, with most mobile CPUs having 16 by default. The integrated HD 4000 will be enough for 13" notebooks; the 15" and up will almost surely get a dedicated GPU. Even if Apple redesigns the MBP to make it thinner, that will most certainly mean losing the optical drive, and the reclaimed space can be used to accommodate the GPU.

So I wouldn't worry. And I certainly wouldn't take anything Charlie writes as truth; he is known for writing provocative FUD to boost his website.
 
No offense to you personally, but I love how many experts we have on this forum who claim Apple should add this or that option... I think these people fail to see the long-term impact of their countless recommendations.

When did I claim I was an expert? All I stated was a more than reasonable option that I think Apple should provide.
 
This story sounds like FUD. Think about it: it's from Semi-Accurate, and the name itself says the information isn't accurate. Semi-Accurate is also known to be pretty much anti-Intel, anti-Nvidia, and anti-Apple, and pro-ARM and pro-AMD.

I believe these are the crazy people who came up with the wild and stupid idea of an ARM-based MacBook Air. That idea alone shows they're anti-Intel, anti-Apple, and pro-ARM.
 
Can we PLEASE stop using Semiaccurate and that blathering twit Charlie Demerjian as ANY kind of source unless we're looking for laughs? The guy pulls most of this stuff out of his butt.
 
I know the fanboys are going to rate me down, but this is the Apple of today and tomorrow: now that they've hit the mass market of stupid, uninformed sheep, they're still going to make you pay premium prices, but without the quality or even the innovation anymore.
 
I do occasionally game on my MBP (top-end 2011 i7), so a truly awful Intel chipset without discrete graphics would leave me looking elsewhere for a new 17" laptop come update time. I don't think it's likely, though. Not worrying about it yet, that's for sure... another year at least before my Pro needs to be replaced.
 

That's not a lack of a double-buffered window. That's a window not doing its redrawing and swapping its buffer to screen because it is frozen and thus not processing its update message, or any other message in its run loop. You'd know this if you had read Petzold and done Win32 programming.
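
For anyone who hasn't done Win32, here's a minimal, illustrative sketch of that point (the class name and the deliberate Sleep are mine, purely for demonstration): WM_PAINT is only processed when the window's thread pumps its message loop, so a handler that blocks freezes all redrawing.

```c
/* Illustrative Win32 sketch: a window only redraws when its thread
 * pumps messages. Click the window and it "freezes" for 10 s, because
 * no WM_PAINT (or anything else) gets processed while the handler blocks. */
#include <windows.h>

static LRESULT CALLBACK WndProc(HWND h, UINT m, WPARAM w, LPARAM l) {
    switch (m) {
    case WM_PAINT: {
        PAINTSTRUCT ps;
        HDC dc = BeginPaint(h, &ps);   /* redrawing happens here */
        TextOutA(dc, 10, 10, "alive", 5);
        EndPaint(h, &ps);
        return 0;
    }
    case WM_LBUTTONDOWN:
        Sleep(10000);                  /* blocked handler: the run loop is */
        return 0;                      /* stuck, so no repaint for 10 s    */
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProcA(h, m, w, l);
}

int WINAPI WinMain(HINSTANCE inst, HINSTANCE prev, LPSTR cmd, int show) {
    WNDCLASSA wc = {0};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = inst;
    wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
    wc.hbrBackground = (HBRUSH)(COLOR_WINDOW + 1);
    wc.lpszClassName = "runloopdemo";
    RegisterClassA(&wc);
    HWND h = CreateWindowA("runloopdemo", "run loop demo",
                           WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT,
                           320, 200, NULL, NULL, inst, NULL);
    ShowWindow(h, show);
    UpdateWindow(h);
    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0)) { /* the run loop in question */
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return (int)msg.wParam;
}
```

Double buffering doesn't help here; the swap never happens because the code that would perform it never runs.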

----------

OK then, open 30 Safari windows, trigger Exposé on a 30-inch Cinema Display, and tell me that runs at a perfectly smooth 60 fps.

Yeah, let's also just try to run Crysis on it while we're at it. :rolleyes:

So I guess those engineers at Apple lied to me when they said "it's just a bunch of OpenGL textures"?! OMG, Apple gypped me; they should refund my WWDC ticket!!

An oversimplification of the process for the sort of people who visit this forum, it seems.
 
The report is very unclear; I'd assume Apple will continue with the current setup: a dedicated graphics chip (from AMD?) in the 15- and 17-inch MBPs, along with integrated Intel graphics to save battery life in less demanding applications, while the 13-inch is left with integrated graphics only. I just can't see Apple ditching the dedicated graphics card on the MBP. If they did, they'd call it the MacBook Air.
 
I think what is most interesting about the article is the idea that Apple was planning, or still plans, to put an Ivy Bridge with half the EUs (8 vs. 16) in some of the MacBook Pros. The 8-EU GPU would be a custom part, as I believe Intel only offers the full "GT2" on mobile; there is no GT1 for mobile chips.

That's a bit unsettling.
 
The report is very unclear; I'd assume Apple will continue with the current setup: a dedicated graphics chip (from AMD?) in the 15- and 17-inch MBPs, along with integrated Intel graphics to save battery life in less demanding applications, while the 13-inch is left with integrated graphics only. I just can't see Apple ditching the dedicated graphics card on the MBP. If they did, they'd call it the MacBook Air.

Heck, I'd like to see the Optimus setup in the Air and in the 13" laptops, if anything. But yes, like you, I think this article is a load of crap.
 
Heck, I'd like to see the transformer setup in the Air and in the 13" laptops, if anything. But yes, like you, I think this article is a load of crap.

This.

You have the MBA's screen as a tablet with an iOS-style interface, and when you dock it to the keyboard-battery combo, voilà, the OS X Dock appears from the bottom, the mouse cursor comes alive, and trackpad gestures work as usual.

And if they could do all this purely on ARM chips, with a full-blown ARM version of OS X such as MS Win8 will be, we could be talking about 20+ hours of battery life on an Apple MacBook Transformer ARM.
 
This.

You have the MBA's screen as a tablet ...

You misunderstand (I goofed, though; it's called Optimus, not Transformer, so it's my fault). I didn't mean the Asus Eee Pad Transformer; I meant the Nvidia GPU technology, where the integrated graphics are used when a task doesn't require hardware acceleration and the dedicated card kicks in when an application performs certain operations.
 
But it doesn't make graphics chips; they are on the CPU die, which is not the same thing.

What the GPU is attached to is immaterial.

Who sells the most GPUs?
Who generates the most revenue selling GPUs?


Selling GPUs is a business. Those selling the most (not the fastest) have leverage.

ATI and Nvidia tried to leverage coupling GPUs to memory controllers into dominating the chipset business and selling more GPUs. Intel and AMD absorbed the memory controllers into the CPU die (for additional reasons having to do with decreasing latency and increasing integration) and took away that leverage (a happy, for them, side effect).

In the distant past, the classic PCIe-slot graphics card was key to the graphics market. That era is over. Embedded graphics, whether integrated into the chipset/CPU package or placed directly on the motherboard, is the dominant market now.
 
I know the fanboys are going to rate me down, but this is the Apple of today and tomorrow: now that they've hit the mass market of stupid, uninformed sheep, they're still going to make you pay premium prices, but without the quality or even the innovation anymore.

You don't have to be a "fanboy" to rate down such uninformed nonsense. And what a sly trick this is, pre-emptively berating anyone who doesn't agree with you. It doesn't work, though. Nor does berating Apple's customers (the "stupid, uninformed sheep"). I think there are people who deserve to be called "stupid" and "uninformed" a lot more than Apple's customers do.
 