Isn't something supposed to come out today? Yesterday that MacRumors post was saying Apple is releasing something new on the 19th. The day is young, I guess, over here on the West Coast.

RAWR @ APPLE

Please get these things out the door.
 
I always hoped some driver update would make this possible... I guess I'll have to wait till my next MBP for this functionality...

Indeed. It is somewhat disappointing that this could not be implemented through a software update, as I thought it would be. Oh well.
 
Yeah, nVidia Optimus relies on software to tell it which GPU is needed, not hardware demand.

PCs that already have this tech have seen problems where the software picks the wrong GPU to use.

I assume we haven't seen this in the current-generation MBPs because the 9600M is too old to be hot-switchable, or maybe it's the 9400M, but something like that.
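To make that concrete, here's a minimal sketch of the driver-side, profile-based selection Optimus is described as using; the profile table and app names are hypothetical illustrations, not nVidia's actual driver logic.

# Minimal sketch of profile-based GPU selection (the approach
# described above). The profile table and app names here are
# hypothetical, for illustration only.

# Apps known (to this hypothetical driver) to need the discrete GPU.
DISCRETE_PROFILES = {"game.exe", "cad.exe", "video_encoder.exe"}

def pick_gpu(app_name: str) -> str:
    """Choose a GPU from a static profile list, not from live hardware load."""
    if app_name in DISCRETE_PROFILES:
        return "discrete"
    # Anything unlisted falls back to the integrated GPU. This is exactly
    # where "software picks the wrong GPU" comes from: a demanding app
    # missing from the profile list silently gets the IGP.
    return "integrated"

print(pick_gpu("game.exe"))        # -> discrete
print(pick_gpu("new_3d_app.exe"))  # -> integrated (wrong for a 3D app)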
 
It would be nice to have! Having to log off to change graphics cards isn't optimal!!

Two graphics cards would be perfect if this tech works well, and I think OpenCL can use both at the same time, as it does now.
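A minimal sketch of what that looks like from OpenCL's side, assuming the pyopencl binding (my choice of binding, nothing the thread specifies): each GPU simply shows up as its own compute device, so both chips can take work at once.

# Enumerate every GPU OpenCL can see; an integrated and a discrete
# part would each appear and can each take compute work.
import pyopencl as cl

for platform in cl.get_platforms():
    for dev in platform.get_devices(device_type=cl.device_type.GPU):
        # Each device can get its own context and queue, so kernels
        # can run on the integrated and discrete GPUs in parallel.
        print(platform.name, "->", dev.name)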
 
Powerful equals battery-draining.

Furthermore, stupid Intel insists on packaging a weak-ass GPU in its chip packages, whether you want it or not.

Isn't this Intel behavior very similar to Apple's insistence on having a very limited range of laptops? :D

I am not sure the presence of an IGP in the package is a problem at all. I have not seen any data on this GPU's power consumption at idle (when the discrete GPU is in use). For all we know it might be negligibly low.
 
With the discrete graphics I barely get two hours of battery time, but with the integrated graphics I can't even scroll an image in Photoshop without it looking like it's being done on a ten-year-old PC.

Are you sure you have enough RAM, and that you're not using an old PPC version of Photoshop?
Consider that Photoshop uses the GPU very little anyway.
On my 13" I use many graphics programs like Pixelmator, and it's damn fast.
 
Doesn't ATI/AMD have "ATI Switchable Graphics", which also switches between the integrated and discrete GPU on the fly? It would have to be updated to work on OS X, I think, but then again, technically so does Optimus, as it officially doesn't support OS X right now.

ATi indeed does: ATi Hybrid Graphics Technology.

I reckon the new MacBooks will have the HD 5xxx series, just because it has better power management than anything nVidia has to offer, most likely including the new Fermi platform (if it's ever released).

http://www.amd.com/us/products/technologies/ati-hybrid-graphics/Pages/ati-hybrid-graphics.aspx

IMO, ATi's implementation is better. It doesn't re-route the graphics through the motherboard, and it has PowerXpress, designed specifically for switching between integrated and discrete graphics in notebooks.
 
If this is true it doesn't mean anything, because memory controller technology has been nailed down for years.
Intel only started integrating the memory controller into the CPU with the Core i7 series.

The thought of Intel audio has my ears cringing in pain. My ears could be wrong of course. As to video you still have the question of Intel quality here, besides the fact that other GPUs have also been accelerating video for some time.
This is nothing but anti-Intel bias. The audio is output digitally to the Mini DisplayPort/HDMI port; you have nVidia to thank for the current Pros not being able to output audio through the Mini DisplayPort. It's all digital, no conversions are taking place, so there's no loss of quality.

I don't get the issue here, since there will be an extra GPU to help out when the Intel IGP doesn't have the power. I can only hope they take out the optical drive to make room for a beefier GPU and cooling.
 
You're not really predicting the future here... nVidia announces new technology... Apple adopts... how is this so surprising?

IMHO, it would be a HUGE disappointment if this were NOT the case.

It's an honor to quote the founder of the (in)famous Waiting for Arrandale thread. For those thinking the Arrandale MBPs will be a step backwards from current models, it would be worth checking out the Geekbench results posted a couple of weeks ago for an apparent test-model MBP with an i7 Arrandale: it was substantially faster in every metric. If you suspect those results were fake, check out scores for competitors' PCs. Every one of them had the Arrandale IGP as part of the package, and it's just better. And the IGP is reportedly similar to the current 9400M, far better than Intel's previous efforts and more than adequate for everyday tasks, including watching HD video.

It will almost certainly be the case that whatever discrete GPU Apple might reasonably include will do better than the current models at games or work. As to whether Optimus's discreet management (i.e., you won't notice it) of the Intel integrated GPU and a 300-series nVidia will be a better solution than some of the best ATI GPU options, I am far less certain. It's not surprising to see Apple try the Optimus route, but I wonder if that was the only way to end the kludgey manual switching of GPUs. Either way, they will be measurably better at everything than today's MBPs, but I wonder if we are losing something in the way of battery life, total processing power, and gaming with the nVidia approach.
 
Don't believe benchmarks of rumored hardware. That's how the big boys keep on beating down ATi/AMD.
 
And that's Intel's design mistake. The memory controller should be on the main die.

I'm sure you know better than the engineers at Intel. Arrandale is the first 32nm CPU, while the IGP and memory controller are at 45nm. They need to use those 45nm fabs for something and the new 32nm process doesn't have a lot of capacity just yet, because it's cutting edge. This is the best compromise they could make for the current fab conditions.
 
Nothing scientific to this comment, but... doesn't Apple generally like to claim up to 5x faster performance?
I think what we end up with will be pretty fricken acceptable! :D
 

"Also note that a dual-link DVI connection is not included in its spec, so it's limited to 1920x1200 via DVI and by extension, HDMI. 2,560 x 1,600 is only available via DisplayPort, which means GMA HD is likely incompatible with almost all 30in displays currently and previously available on the market."
 
You do know that Cmaier used to design chips at AMD and was part of the teams that kicked Intel's ass?
 
Let's not go Nvidia; really, they have nothing great to offer. I want to see the 5830 in the high-end MBPs and the 5650 in the low-end MBPs. If they go with the 300 series I will be very disappointed, but I guess I am not in the market for at least another two years, so go ahead Apple, botch this update!
 
Given nVidia's track record, the Fermi series will be very power hungry.
 
I'm not predicting the future; I didn't say that. I wish I could, though; it might come in handy. Thanks for the idea, I'll have to work on it...

See, if you had read the Engadget article, which you didn't, you'd know that what I was placing my bet on was primarily the release date.

The Best Buy SKUs, if real, are for products that should be in their system by March 14th. CeBIT is March 2-6, and nVidia is supposed to announce Optimus at that time. Acer, which is releasing the 'first' laptop to use Optimus, was mum on what GPU was going into it. Maybe Apple is waiting until nVidia makes its official announcement then, just much more discretely (get it? clever, I know...) than Acer. I just think the timing of it all matches up; there's some loose association in there between everything, but I guess that's part of what makes up these rumors in the first place. Happy trolling!
 
Oh wow, another graphics idea stolen from AMD.

First CPU/GPU on die, now driver-based GPU switching for laptops.
 
I wonder if Apple leaked this information deliberately, to throw a bone to the ravenous, frenzied anticipators on this forum and others.
 
No matter what video solution they come up with, will it run Crysis efficiently? I know a laptop will never play it at full power, but at least do something. I actually use Boot Camp to play a few games, and they tend to run well on the 8600M nVidia chip in my Santa Rosa MBP. Crysis? You have a better chance of winning over Mike Pence to vote for the current healthcare reform bill.
 
My dream (it won't happen, I know) would be to have one rockin' GPU, one mid-range GPU, and the Intel integrated one. Then the system could mix 'n' match, scaling the three GPUs as needed for the tasks at hand. Of course, I'd also love to see a way to run two external displays and the built-in display all at once. Again, it won't happen, but you can't blame me for dreaming.
 