
View Full Version : nVidia's new switching graphics: perfect for Air?




Anonymous Freak
Feb 9, 2010, 11:09 AM
AnandTech just posted an article (http://www.anandtech.com/mobile/showdoc.aspx?i=3737) about nVidia's new "Optimus" switchable graphics platform.

This takes all of the work of switching between integrated graphics and discrete and moves it from hardware into software. Obviously, it would require a re-write of the drivers in OS X, but hopefully nVidia has been working with Apple on this, as they have now implemented it on Windows.

In short: Previous 'switching graphics' systems have required a physical electronic switch on the motherboard to handle it, and a hard switch in graphics drivers, requiring a logout in OS X, and similar in Windows. (At best on Windows, you could do it when launching a 3D application, but it required disabling the currently active video driver, which meant you couldn't already be running another 3D app at the same time.) This makes it so that instead of the discrete GPU completely taking over control of the display, it just does the rendering 'blind', and pushes the finished frames over to the integrated GPU for display, using 'overlay' techniques that are already available. (In short, "DUH! Why didn't we think of this earlier?" I had a Matrox m3D back in 1998 that did this.)

The major advantage is that you get all the power savings of the integrated graphics 95% of the time, but when you launch an app that would benefit from using a discrete GPU, (the nVidia drivers monitor for this,) it would load just that app onto the GPU. (Or multiple apps, as required, just like if it was running 'native'.)

According to nVidia, there is very little overhead for doing it this way, as the major extra bandwidth is GPU-to-system, which is usually low traffic anyway. (And PCIe is bi-directional, so this 'extra' data flow doesn't take away from the system-to-GPU bandwidth.)
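The render-offload flow described above can be sketched in miniature. This is a conceptual model only, not real driver code; all class and function names here are illustrative:

```python
# Toy model of Optimus-style render offload: the integrated GPU (IGP)
# always owns the display; the discrete GPU (dGPU) renders "blind"
# and its finished frames are copied over PCIe for the IGP to present.

class IntegratedGPU:
    def present(self, frame):
        # The IGP scans out whatever frame it is handed.
        return f"displayed {frame}"

class DiscreteGPU:
    def __init__(self):
        self.powered = False  # fully off until a demanding app appears

    def render(self, draw_calls):
        self.powered = True   # the driver powers it up on demand
        return f"dgpu-frame({draw_calls})"

def composite(app_needs_dgpu, draw_calls, igp, dgpu):
    """Route one frame: demanding apps render off-screen on the dGPU,
    then the finished frame is pushed to the IGP's framebuffer as an
    overlay; everything else renders on the IGP directly."""
    if app_needs_dgpu:
        frame = dgpu.render(draw_calls)     # "blind" off-screen render
    else:
        frame = f"igp-frame({draw_calls})"  # IGP renders directly
    return igp.present(frame)               # display path never switches
```

The point the post makes is visible in `composite`: the display path never changes hands, so there is no mode switch, logout, or driver swap, and the dGPU stays powered down until something actually needs it.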

While at present I greatly prefer ATI's mobile GPU offerings (and it doesn't look like nVidia will have anything based on a new architecture for a while in the mobile space), even nVidia's current offerings would be plenty for the Air (and 'plain' MacBook) when used this way. They could throw in a higher-wattage GPU, knowing it wouldn't be used as often. (Heck, they could even make it so that on battery, it uses integrated graphics for more things that it would switch to the discrete GPU for under wall power, or have a slider in energy settings to let you pick how often the discrete GPU comes on.)

For the Pro line, I'd still prefer to see solely discrete GPUs, preferably from ATI.



deconstruct60
Feb 9, 2010, 07:30 PM
AnandTech just posted an article (http://www.anandtech.com/mobile/showdoc.aspx?i=3737) about nVidia's new "Optimus" switchable graphics platform.


But the MacBook Air doesn't even have a discrete option mode now.
If the discrete chip option was left out for space/heat&power/price issues, that doesn't change at all with Optimus. There will still not be enough space/thermal budget to put it in. (I don't think the cost savings of the video switching hardware was the price blocker. The increase in cost is buying the additional discrete processor and the memory it needs for its framebuffer.)

If the discrete chip turns on at all that will only decrease the battery life folks are getting with the MacBook Air now.

The MacBook Air seems even more destined to be the laptop of choice for those who really don't care about high-end graphics at all (and stuck with an i3 Arrandale, with a drop in top-end graphics performance from the 9400M).
Either that, or Apple pushes it down in price substantially and sticks with Core 2 Duo/9400M.

Optimus is better for the machines that already have an optional discrete mode. If some folks have been running those laptops in discrete mode most of the time, they could see increased battery life. However, anyone who has been running on the 9400M for 90+% of their laptop usage may see battery life go down (not up) if they flip to Optimus.


The major advantage is that you get all the power savings of the integrated graphics 95% of the time, but when you launch an app that would benefit from using a discrete GPU, (the nVidia drivers monitor for this,) it would load just that app onto the GPU. (Or multiple apps, as required, just like if it was running 'native'.)


On an MBAir, is that even 5% of the time on a discrete GPU??? The new Arrandale GPU is slower than a 9400M, but hugely better than the previous Intel options. It will do video decoding, and may even handle the Flash 10.1 load if the correct Intel drivers eventually get deployed. So, as the article points out at the end: what is really left for the discrete chip? Higher-end games? Is Apple going to juice up a MacBook Air so that it is a better gamer machine?

I don't think that is the market they are aiming at. I think a chunk of the market they are aiming at might be looking at an iPad instead, and the MacBook Air sales numbers will drop even more.

Anonymous Freak
Feb 9, 2010, 08:42 PM
But the MacBook Air doesn't even have a discrete option mode now.
If the discrete chip option was left out for space/heat&power/price issues, that doesn't change at all with Optimus. There will still not be enough space/thermal budget to put it in. (I don't think the cost savings of the video switching hardware was the price blocker.

Ah, just wait for the Intel haters to chime in on your claims regarding the IGP.......

The difference is that the Air *DOES* have a "discrete" GPU. It's called the GeForce 9400M. It just happens to have a chipset integrated into it. (Northbridge+GPU+Southbridge)

The Arrandale CPU has a chipset integrated into it as well (Northbridge+GPU on the CPU), plus a low-power separate Southbridge. This means that Arrandale plus the HM55 chipset will draw noticeably less power than Core 2 Duo + 9400M.

The first MacBook Air used a 25W CPU plus a 7W Northbridge and a 3.3W Southbridge. The second MacBook Air used an 18W CPU plus 13.3W Northbridge, and same 3.3W Southbridge. An Arrandale system would use a 25W CPU (with onboard Northbridge,) plus a 3.5W Southbridge.

I can't find any hard numbers on power usage of just the 9400M, but I have to imagine it was roughly the same as the 13.3W GM965+3.3W ICH of the previous generation Air. So total power consumption without GPU will drop.

Yes, adding the discrete GPU will cost more (but not as much as previous switching systems,) and when it's on, it will drain battery life. My point is that if the user desires, they can make the tradeoff, and it will only be used exactly when actually needed. Unlike existing switching techniques, like the current MBP's 9400/9600 switch, where you have to make a big effort to switch, and have "either good battery OR good performance", with no balanced option.

MattInOz
Feb 9, 2010, 09:42 PM
AnandTech just posted an article (http://www.anandtech.com/mobile/showdoc.aspx?i=3737) about nVidia's new "Optimus" switchable graphics platform.

In short: Previous 'switching graphics' systems have required a physical electronic switch on the motherboard to handle it, and a hard switch in graphics drivers, requiring a logout in OS X, and similar in Windows. (At best on Windows, you could do it when launching a 3D application, but it required disabling the currently active video driver, which meant you couldn't already be running another 3D app at the same time.) This makes it so that instead of the discrete GPU completely taking over control of the display, it just does the rendering 'blind', and pushes the finished frames over to the integrated GPU for display, using 'overlay' techniques that are already available. (In short, "DUH! Why didn't we think of this earlier?" I had a Matrox m3D back in 1998 that did this.)



Even more of a DUH! moment for Apple: the Quartz graphics engine already does this.

Quartz Composer can fire up a CPU-based OpenGL renderer when the GPU is lacking; it's used to cover feature shortfalls in some low-end Macs.
I never understood why it couldn't handle two GPUs in the same way. Maybe it's the physical switching issue.

I also thought it was this Quartz design that followed through to bring LLVM into the compiler chain to handle OpenGL-to-x86 compilation, and it seems to me to have been the inspiration for Grand Central Dispatch.

deconstruct60
Feb 9, 2010, 10:27 PM
Ah, just wait for the Intel haters to chime in on your claims regarding the IGP.......


If they go out and look at the benchmarks that have started to roll in, fine. What they will see is that the Arrandale IGP is much, much better than previous Intel offerings. If all you are doing is mostly 2D apps (word processing, editors, browsers, many graphics and audio apps, very plain OpenGL 2.0 games, etc.), it works just fine.

If they are just haters, that's somewhat inevitable on these forums.


The difference is that the Air *DOES* have a "discrete" GPU. It's called the GeForce 9400M. It just happens to have a chipset integrated in to it.


One of the most Orwellian things I have ever read here... and that is saying a lot. It's discrete but integrated? Pick a side. Those are opposites; it can't be both at the same time.

Integrated graphics means you reuse the same memory controller that is hooked to main memory and carve out a chunk of main memory as framebuffer space. You might add that it is lower power or economizes on power, but that is not part of the notion of "integrated". That's it. It has nothing to do with good/bad GPUs. (The power conservation has a bit to do with limiting horsepower.)





This means that the Arrandale plus HM55 chipset will draw noticeably less power than Core 2 Duo + 9400M.


Less power... more battery lifetime. It doesn't necessarily mean there is budget for a discrete GPU + RAM (+ possibly switching).



The second MacBook Air used an 18W CPU plus 13.3W Northbridge, and same 3.3W Southbridge. An Arrandale system would use a 25W CPU (with onboard Northbridge,) plus a 3.5W Southbridge.


18 + 13.3 + 3.3 = 34.6
25 + 3.5 = 28.5
That leaves 6.1W for a discrete GPU + high-speed RAM.

What kind of GPU + RAM are you going to get with 6W????
See anything here with a single-digit TDP budget?
http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units#GeForce_200M_.282xxM.29_series
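The arithmetic above, as a quick sanity check. The figures are the poster's TDP estimates from earlier in the thread, not official spec-sheet numbers:

```python
# Back-of-the-envelope TDP comparison (watts), using the numbers
# quoted in this thread; treat them as estimates.
core2_air = 18 + 13.3 + 3.3   # CPU + northbridge + southbridge
arrandale_air = 25 + 3.5      # CPU (IGP/northbridge on-package) + PCH

headroom = core2_air - arrandale_air
print(f"{core2_air:.1f}W -> {arrandale_air:.1f}W "
      f"leaves ~{headroom:.1f}W for a discrete GPU + VRAM")
```

About 6W of headroom, which is the crux of the argument: no contemporary discrete mobile GPU plus its VRAM fits in a single-digit-watt budget.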

Never mind that I really haven't addressed the space argument. The current solution is a two-chip-package solution versus this new three-chip-package + additional RAM chip(s) solution.

I suspect the user would be happier if they scrounged up a 10W budget to get a 35W i3 in there with a decent clock speed, as opposed to some underclocked (or more expensive?) i3 at 25W.

The current i3 mobiles are 35W.
http://processorfinder.intel.com/List.aspx?ParentRadio=All&ProcFam=3222&SearchKey=



My point is that if the user desires, they can make the tradeoff, and it will only be used exactly when actually needed.


The point is that the user can make the trade-off when buying the system.
If the user really needs it, then they will buy the more well-rounded laptop. The MacBook Air is a niche laptop. How does expanding the niche laptop so that it overlaps the more mainstream product make sense??? You are increasing the overlap in capabilities. A "power gamer" GPU doesn't really help it address the niche it is aimed at.

Anonymous Freak
Feb 10, 2010, 12:10 AM
One of the most Orwellian things I have ever read here... and that is saying a lot. It's discrete but integrated? Pick a side. Those are opposites; it can't be both at the same time.

Yeah, it didn't quite come out the way I meant it. What I meant is that the 9400M is a low-power, but otherwise full-featured, GPU with a chipset crammed in; whereas Intel's IGP chipsets are a chipset with a feature-limited GPU tacked on. The 9400M may not be great by "discrete GPU" standards, but it is based on discrete GPUs.

Less power... more battery lifetime. Doesn't necessarily mean have budget for a discrete GPU + RAM ( + possibly switching) .
That's the entire point of this new idea from nVidia; there is no switch, and the GPU and RAM are completely powered off when not in use. (Or, at worst, draw a fraction of a Watt.)

What kind of GPU + RAM you going to get with 6W ????

The point is that with this arrangement, you can use something that uses more than just those 6W. You can go with the 14W G210M, or, if this can work with ATI, a 15W 5650, which would absolutely kick ass, figuratively.

Never mind really haven't addressed the space argument. The current solution is a two chip package solution verus this new 3 chip package + additional RAM chip(s) solution.

The first Air was a 3-chip, and you can do 256 MB of VRAM in one chip now. The HP Envy 13 is basically a MacBook Air clone, and it uses a discrete GPU with an Intel three-chip setup! (Core 2 Duo plus PM45 Northbridge plus ICH Southbridge plus ATI 4330 plus 512 MB VRAM.) HP even manages to put in an SD card reader! (Admittedly, HP doesn't go with the silly tapered edge, they 'square off' the edges like the MacBook Pro. But those tapered edges are just for show anyway, I wish Apple would cut them off, and make the footprint of the Air a little smaller as a result. Get rid of that ridiculous screen bezel.)

I suspect the user would be happier if they scrounged up a 10W budget to get a 35W i3 in there with a decent clock speed, as opposed to some underclocked (or more expensive?) i3 at 25W.

The Core i7 at 25W is a much better solution than an i3 at 35W. The fastest mobile i3 is 2.26 GHz, without Turbo Boost; while the best 25W i7 is 2.13 GHz, with a two-core turbo of 2.66 GHz to whip the i3, and a one-core turbo up to 2.93 GHz. And the i7 models (as well as the i5 models) have graphics turbo as well, so the total 25W (or 35W for the i5) power budget can be split between CPU and GPU essentially as needed for the load. That's useful for making a potential discrete GPU be used less, and even MORE useful if there is no discrete GPU. (It also means that when the discrete GPU is in use, the CPU gets more power available to it, since the integrated GPU will be using less: near desktop-level performance when plugged in.)
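The clock trade-off being argued here can be tabulated. The figures below are the ones cited in this thread from Intel ARK, not independently verified; worth double-checking against the spec pages linked later:

```python
# Mobile CPU comparison as quoted in the post (GHz / watts).
# The i3-350M has no Turbo Boost, so all its entries equal its base clock.
i3_350m  = {"tdp": 35, "base": 2.26, "turbo_2core": 2.26, "turbo_1core": 2.26}
i7_640lm = {"tdp": 25, "base": 2.13, "turbo_2core": 2.66, "turbo_1core": 2.93}

# Under single- or dual-core load with turbo headroom, the 25W i7
# out-clocks the 35W i3 despite the lower base frequency:
assert i7_640lm["base"] < i3_350m["base"]
assert i7_640lm["turbo_2core"] > i3_350m["base"]
gap_ghz = round(i7_640lm["turbo_1core"] - i3_350m["base"], 2)
print(f"one-core turbo advantage: {gap_ghz} GHz "
      f"at {i3_350m['tdp'] - i7_640lm['tdp']}W lower TDP")
```

The catch, raised later in the thread, is that turbo only engages when there is thermal headroom; under a sustained all-core load the i7 falls back to its lower base clock.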

The Air is targeted as a 'premium' device. It shouldn't skimp on components.

If Apple releases an i3-integrated-graphics-only Air at $499, then I'll happily accept it as my next notebook. But if they keep the Air as a premium device, it should not have the i3. I would grudgingly accept it with i7 integrated graphics. (Even then, they'd have to lower the price a little.) But with this new tech from nVidia (which ATI should have no problem duplicating, as it's already based on standard technologies,) there is no reason for the Air to not have a discrete GPU now. (And just a week ago, I was arguing that there was no major selling point for a discrete GPU on the Air!)

ayeying
Feb 10, 2010, 12:21 AM
The difference is that the Air *DOES* have a "discrete" GPU. It's called the GeForce 9400M. It just happens to have a chipset integrated into it. (Northbridge+GPU+Southbridge)

The Arrandale CPU has a chipset integrated into it as well (Northbridge+GPU on the CPU), plus a low-power separate Southbridge. This means that Arrandale plus the HM55 chipset will draw noticeably less power than Core 2 Duo + 9400M.

The first MacBook Air used a 25W CPU plus a 7W Northbridge and a 3.3W Southbridge. The second MacBook Air used an 18W CPU plus 13.3W Northbridge, and same 3.3W Southbridge. An Arrandale system would use a 25W CPU (with onboard Northbridge,) plus a 3.5W Southbridge

1. The Rev A MacBook Air had a 20W CPU.
2. The 2nd MacBook Air is basically what we're at right now, but with slower processors. They don't have a Southbridge. They use 17W CPU.
3. The 9400M is an integrated GPU, according to nVidia.

NVIDIA® GeForce® 9400M G motherboard GPU redefines the notebook architecture by combining a mainstream GPU, system memory controller, and system I/O into a single chip for the smallest, most power-efficient visual computing experience ever available in notebooks.

Current MacBook Air uses 17W CPU + 12W GPU/Chipset = 29W
Older Rev A MacBook Air uses 20W CPU + 10W for Northbridge + 3W for Southbridge = 33W

Anonymous Freak
Feb 10, 2010, 12:37 AM
1. The Rev A MacBook Air had a 20W CPU.
2. The 2nd MacBook Air is basically what we're at right now, but with slower processors. They don't have a Southbridge. They use 17W CPU.

D-OH!

I was confusing the Air's GPU history with the plain MacBook, which went from GMA 950 to GMA X3100 to GeForce 9400M. For some reason, I thought the Air had launched with GMA 950 and done the same transition. You're right, it started at GMA X3100.

(I also thought that they had moved to 17W CPU at the same time as my mistaken 'second generation Intel graphics' phase. Obviously, they moved to 17W CPU at the same time as moving to nVidia graphics...)

deconstruct60
Feb 10, 2010, 06:24 AM
Whereas Intel's IGP chipsets are a chipset with a feature-limited GPU tacked on. 9400M may not be great by "discrete GPU" standards, but it is based on discrete GPUs.

I think you are blowing off the differences in technology. Intel uses trailing-edge, much more mature processes for their chipsets. In part, the Intel GPUs lagged because they just didn't have as much "stuff", which is driven by a substantially smaller transistor budget (because it's on old tech, and must be inexpensive).



That's the entire point of this new idea from nVidia; there is no switch, and the GPU and RAM are completely powered off when not in use.

Powered off doesn't matter if it doesn't fit within the fixed watt-dissipation cap you have to work with.




The point is that with this arrangement, you can use something that uses more than just those 6W.


That is just exchanging one set of container and thermal constraints for another. Making a MacBook Air that is hotter than the previous gen isn't a solution that is going to fly. You either have to exactly exchange/trade off heat sources or come in with a lower thermal budget. It is a zero-sum game.



(Admittedly, HP doesn't go with the silly tapered edge, they 'square off' the edges like the MacBook Pro. But those tapered edges are just for show anyway, I wish Apple would cut them off, and make the footprint of the Air a little smaller as a result. Get rid of that ridiculous screen bezel.)


chuckle, you want to reduce the volume of the container still more and increase the thermal dissipation at the same time? Good luck with that.



The Core i7 at 25W is a much better solution than an i3 at 35W.


Only if your workload fits in the i7 cache. The 25W i7 parts achieve that, in part, by chopping 20% off the memory speed. At this point you lose economies of scale, because nothing else in your lineup is going to use 800 MHz memory.

Likewise... good luck with your graphics turbo when you have chopped 20% off your GPU memory pipeline too (and don't have a discrete GPU with better memory bandwidth to punt to).




The Air is targeted as a 'premium' device. It shouldn't skimp on components.

Whatever. I thought it was supposed to be the lightest-weight solution for those who are willing to trade dollars for reduced weight. Not a "Grey Poupon" device.

This device is also tapered when put in the extended battery
http://www.sonystyle.com/webapp/wcs/stores/servlet/CategoryDisplay?catalogId=10551&storeId=10151&langId=-1&categoryId=8198552921644667494&N=4294954366#features

The mania for "thinnest" typically throws out so much volume that it starts to throw battery life out the window too. However, this is also about 1/2 the weight of the Air.

Those i7 *UM 25W processors you are pointing to cost $300. Out of a $1,500 price point, that is 20% of the total system cost just on the CPU/IGP.
For a $2,300 laptop, that is a more reasonable 13%.


If Apple releases an i3-integrated-graphics-only Air at $499, then I'll happily accept it as my next notebook.


Likewise, the i3s are in the $120 range. $120/$500... again 24% of total system cost just on the CPU/IGP. I don't think so.

Anonymous Freak
Feb 10, 2010, 09:14 AM
Only if your workload fits in the i7 cache. The 25W i7 parts achieve that, in part, by chopping 20% off the memory speed. At this point you lose economies of scale, because nothing else in your lineup is going to use 800 MHz memory.

I don't know which Core i7 you're looking at, but the i7-640LM (http://ark.intel.com/Product.aspx?id=43563&code=IntelŽ+Core™+i7-640LM+Processor+(4M+Cache%2c+2.13+GHz)) that I'm looking at supports 1066 MHz RAM just fine. And even if they did go with a chip that could only use 800 MHz RAM, it's not like it matters. They're having to buy individual RAM chips to solder to the motherboard as it is, whereas for all other computers they just buy commodity SO-DIMMs. (Other than the Mac Pro, of course.)

Likewise... good luck with your graphics turbo when you have chopped 20% off your GPU memory pipeline too. ( and don't have a discrete with better memory bandwidth to punt to).
Again, I don't know what you're looking at, but the Core i7-640LM is the chip I've been talking about, the 25W Core i7, which has the same RAM speed as the Core i3, plus 25% more cache.


Those i7 *UM 25W processors you are pointing to cost $300. Of out a $1,500 price point that is 20% of the total system cost just on the CPU/IGP.
For a $2,300 laptop that is a more reasonable 13%.

The Core 2 Duo SL9600 that's in the current MacBook Air is also a $300 processor. But, you are right. Even if they put an i3 in, they likely couldn't hit $500. But Dell and Acer both seem to be able to hit $550, so hopefully Apple could at least hit $600, if they chose to put an i3 in the new 'low-end' system. (I still think they'll keep the Air in the 'slight price premium' category. They've already positioned it that way just by insisting on using non-ultra-low-voltage parts and with the move to the GeForce 9400M, keeping a moderately decent GPU. If it really was all about saving as much weight as possible for only a small tradeoff in price, it would have had a much slower ultra-low-voltage CPU all along; or it would have had an absolutely dismal battery life.)

Scottsdale
Feb 10, 2010, 04:35 PM
What I still haven't figured out is the TDP W required if the IGP is turned off. I have read several places that state the IGP can be turned off, but I haven't read whether turning it off saves power? ANYONE KNOW A FACTUAL ANSWER? The Core i7 25W CPU is the low voltage replacement of the 17W C2D SL9x00 CPU.

Also, right now, the Nvidia GPU uses 12W. So right now we're at 17W + 12W + a couple more for the bridges = 29W+. The Core i7 with its IGP is 25W total. It's definitely saving power over the Nvidia solution.

I happily report that I am an Intel HATER! I believe Intel sells not only inferior graphics solutions, but also that it couldn't compete in a fair market so it acts anti-competitively and forces the competition out. In the end, Nvidia will be able to sell chipsets again. I am disgusted by Intel's actions, and I don't believe anyone should reward Intel for acting the way it did.

I would gladly prefer Apple axing Intel right out of the picture. I don't care about Intel's Core series CPUs. The bottom line is that CPUs are NOT the problem with today's Mac computers! The graphics, drive speeds, and software are the biggest things hindering performance and a great user experience.

I will be happy to see another C2D CPU in the MBA with an Nvidia GPU. I believe we can get a better MBA by including 4GB to 8GB of RAM, improving the drive speed, and having software that takes advantage of all of the power of the computer it already isn't using. That is why Apple wrote OpenCL and Grand Central Dispatch into OS X. All we need is software vendors to capitalize on it.

Lastly, I will not believe that Intel's IGP is comparable to the 15-month-old Nvidia GPU until I read a scientific study stating that it's such on the Mac OS X platform, OR I CAN USE IT MYSELF AND VERIFY IT MEETS MY NEEDS JUST AS MY CURRENT MBA FITS MY NEEDS. I don't care what is reported, if it isn't scientific nor reported using the OS X platform. What happens in Windows doesn't happen the same in OS X. Just like OS X hits the CPU terribly when Flash is running and even simple video playback can hit the CPU hard. In Windows, video playback and even HD video playback is much less CPU intensive. We had problems playing video on the original MBA with its Intel GMA 3100, and I don't want a repeat of that. Whether Intel is better than it used to be is almost irrelevant. In the end, don't we all want an improvement in our next MBA? An improvement should be required over a 15-month-old graphics system.

deconstruct60
Feb 11, 2010, 09:22 PM
I don't know which Core i7 you're looking at, but the i7-640LM (http://ark.intel.com/Product.aspx?id=43563&code=IntelŽ+Core™+i7-640LM+Processor+(4M+Cache%2c+2.13+GHz)) that I'm looking at supports 1066 MHz RAM just fine.


Sorry, right. Was looking at the UMs
http://ark.intel.com/Product.aspx?id=47700

However this is better side-by-side on the chart here:
http://techreport.com/articles.x/18218

The issue with the LMs is that they run at the lowest speed of the i3s. You'd have to get Turbo Boost to turn on to run faster, and if you are pushing the MBA very hard with all cores active, it won't turn on. (Say GCD is doing its thing and spawns lots of nice concurrent threads.)


They've underclocked it. Specifically, look at the graphics frequency on the i7 LM versus the graphics frequency on the i3:

http://ark.intel.com/Product.aspx?id=43529&processor=i3-350M&spec-codes=SLBPK,SLBPL

The max graphics frequency on the LM is close to the base frequency on the i3. Head to head on graphics, the i3 will blow that setup out of the water. Likewise, running the graphics full blast will most likely turn off Turbo Boost (so the CPU is again clocked below an i3). That's where they are getting the reduction to 25W from.

Similarly, you may get more cache, but they have tossed clock speed down to that of the lowest i3. Unless you are striding through carefully aligned memory vectors or plowing through database buffers, you will be lucky to get back what you lost in the downclocking. A discrete GPU would still make a difference if you have space to add one, but you likely don't.

If somehow they eked out motherboard space and power from somewhere to allow a discrete GPU, perhaps an i7 + discrete would make a difference. However, I suspect an i5, no discrete, and more profit (or yet another price cut) will be the move. Not sure where they're going to get the extra power/thermal budget from, though. There is not much else to throw overboard on the MBA, nor any huge chip consolidations in other components. (Perhaps the RAM, but that may or may not be enough.)



And even if they did go with a chip that could only use 800 MHz RAM, it's not like it matters.


Volume matters in pricing. The more you buy, the better the price discounts you get.



The Core 2 Duo SL9600 that's in the current MacBook Air is also a $300 processor.


Then perhaps the margin on the MBA is much lower than I thought. I wouldn't be surprised to see them try to make it higher, though.



so hopefully Apple could at least hit $600, if they chose to put an i3 in the new 'low-end' system. (I still think they'll keep the Air in the 'slight price premium' category.


If they keep to past philosophy, Apple is extremely unlikely to introduce anything at the iPad price point (the same reason phones don't overlap the iPad, Mac Pros don't overlap the iMac, iMacs don't overlap the mini, etc.). Most likely, the laptops are stuck at the $1,000-and-up range. It would be more interesting if they let the Mac OS X devices compete, though.

deconstruct60
Feb 11, 2010, 10:18 PM
What I still haven't figured out is the TDP W required if the IGP is turned off. I have read several places that state the IGP can be turned off, but I haven't read whether turning it off saves power?


From a local perspective it may, but if you're solely going to run on discrete graphics, the system-wide TDP budget would have to go up. None of the discrete solutions are going to run as cool as what you just turned off. Second, you have just paid Intel for something that you are not going to use at all.



I don't care about Intel's Core series CPUs. The bottom line is that CPUs are NOT the problem with today's Mac computers! The graphics, drive speeds, and software are the biggest things hindering performance and a great user experience.


If only AMD could get their act together. But they are a bigger player in value-priced processors and haven't focused so much on mobile, so they're not likely to end up in an MBA.





We had problems playing video on the original MBA with its Intel GMA 3100, and I don't want a repeat of that. Whether Intel is better than it used to be is almost irrelevant. In the end, don't we all want an improvement in our next MBA? An improvement should be required over a 15-month-old graphics system.

Unless you are settling for Hackintosh benchmarks, only Apple Engineering has any "alternative universe" benchmarks. News flash: they aren't going to publish any of that info. Insisting on that means you really don't want an answer.


http://techreport.com/articles.x/18218/8

If you want to close your eyes because that isn't Mac OS... knock yourself out. The gap there between the Intel GMA X4500MHD IGP (battery-saving) option and the i5 (K42F) is so huge that, if a similar gap showed up on a Mac, it would more likely be Apple's fault for shipping crappy drivers and not integrating QuickTime with the IGP. That would not be Intel's fault, or at least not the hardware's fault.

Basically, shifting the memory controller (and PCIe controller?) over to the CPU die leaves the whole second die for graphics. With a larger transistor budget, Intel's graphics is significantly better.

If Apple cuts the price again, it will be slower just in top-end graphics, but also more affordable and substantially faster in everything but higher-end 3D graphics. If your objective is to buy an MBA to fiddle with higher-end 3D, perhaps you've selected the wrong machine.

If Flash 10.1 hooks into QuickTime and QuickTime hooks into the IGP for HD video playback... you shouldn't be able to see a noticeable difference in any video playback on this IGP solution versus the 9400M. Video sucked on the GMA 3100 because the CPU most likely was doing the video playback... not the IGP.

Scottsdale
Feb 12, 2010, 01:46 AM
From a local perspective it may, but if you're solely going to run on discrete graphics, the system-wide TDP budget would have to go up. None of the discrete solutions are going to run as cool as what you just turned off. Second, you have just paid Intel for something that you are not going to use at all.

Understand the business term "sunk cost"? I would consider the IGP a sunk cost, and I wouldn't use it unless it provided my users with the experience they pay for and expect in a Mac computer. If Intel will not sell Apple a Core i-series CPU without an IGP, it doesn't matter whether Apple pays for the IGP or not. Apple shouldn't use the IGP just because it already paid for it with each CPU; it's a sunk cost for each CPU. It doesn't matter how much they paid for the item: if it's broken or cannot provide the performance required, DO NOT USE IT, as it's nothing more than a sunk cost.

Understand that what Intel is doing is anti-competitive, and it will end up keeping Nvidia out of the chipset business... it's too bad for us that Intel is going to gain ground fast by forcing its business on computer users. I would absolutely LOVE a solution getting around Intel. As I said before, the CPU is not the problem in today's computers. The CPUs are far more capable than any other component in our machines today. Add better graphics, Intel; don't include them on the CPU to force your way in. Since Intel couldn't beat Nvidia, it simply shut Nvidia out. Thus the American way... screw over the little guy, because you shut him out until he's no longer relevant.

The CPUs I see Apple using in the MBA are:

Core i7-640LM 25W at 2.13 GHz boost to 2.93 GHz = $332

http://ark.intel.com/Product.aspx?id=43563&processor=i7-640LM&spec-codes=SLBMK

Core i7-620LM 25W at 2.0 GHz boost to 2.8 GHz = $300

http://ark.intel.com/Product.aspx?id=43559&processor=i7-620LM&spec-codes=SLBML

These are the replacement CPUs for the SL9x00 Low Voltage series. It will not be an i3 or an i5 in the MBA; it's as simple as the voltage. I assume that since we can turn off the IGP, we can save power? Forget the cost: Intel has effectively made the IGP free, as it's a sunk cost bundled into the CPU price, forced onto Apple rather than offered as a choice.

The Core i5-520UM will not be in the MBA.
There is not a Core i3 made that would even be in the same class as anything in an ultraportable... is there even a Core i3 mobile Low Voltage CPU yet?

The Core i3 you linked is a 35W TDP part. That's never going to fly, as it's too hot. We cannot go from a 17W CPU plus a 12W Nvidia chipset to 35W plus anything else... Maybe I am misinterpreting what you're implying, but I don't see this CPU going in an MBA. EVER.

http://ark.intel.com/Product.aspx?id=43529&processor=i3-350M&spec-codes=SLBPK,SLBPL

We are far better off with a Core i7-6x0LM Low Voltage CPU at 25W. We turn off the Intel IGP and include a dedicated graphics card. This is the MBA I foresee.
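To make the thermal argument concrete, here is a rough additive comparison of the TDP figures quoted in this thread. The numbers are the worst-case design figures from the post, not measurements, and the ~12W for a dedicated GPU in the last two configurations is my own placeholder assumption based on the 9400M figure; real draw depends on workload and binning.

```python
# Back-of-the-envelope TDP budgets using the wattages quoted in the thread.
# The 12W dedicated-GPU figure in the i7/i3 configs is an assumption for
# illustration, not a spec for any particular part.
configs = {
    "Current MBA: 17W SL9x00 + 12W Nvidia 9400M chipset": 17 + 12,
    "Proposed: 25W Core i7-6x0LM (IGP off) + ~12W dedicated GPU": 25 + 12,
    "Rejected: 35W Core i3-350M + ~12W anything else": 35 + 12,
}

# Print the configurations from coolest to hottest.
for name, watts in sorted(configs.items(), key=lambda kv: kv[1]):
    print(f"~{watts} W  {name}")
```

Even with the IGP disabled, the i7 configuration is ~8W over the current budget, which is why how much power an idle/disabled IGP actually draws matters so much to this argument.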

Or how about an Nvidia Optimus GT3x0 hybrid auto-switching solution? Like this one by Asus, which claims 12 hours of battery with a GeForce 310:

http://www.engadget.com/2010/01/11/asus-ul80jt-spotted-with-automatic-switchable-graphics-brags-12/

Or, even better, this one by Sony, which uses the Nvidia 330M:
http://www.engadget.com/2010/01/19/sony-vaio-z-brings-quad-ssd-drive-and-dynamic-graphics-switching/

I cannot find the information on ATI's hybrid auto-switching between an ATI dedicated card and Intel integrated graphics, but I am pretty sure one is already available.

The bottom line is Apple isn't going to use a 35W TDP Core i3 in the MBA. Even a Core i5 makes little sense. The same caliber/class/cost CPU as Apple is using in the MBA right now is the Core i7-6x0LM Low Voltage at 25W... it has 4MB of L3 and is the replacement for the SL9x00.

The problem Apple has with using Intel's Arrandale IGP is that Apple just told us all why we needed to upgrade to Snow Leopard, and away from Intel-based graphics, to get 5X the performance from the Nvidia 9400M. How does Apple turn around and say: forget all that we hoped for with OpenCL, we're back to Intel for graphics? Forget your 5X video charts, we're back to Intel. This doesn't make sense, and I am sure Apple agrees.

I believe the reason we're seeing a longer stretch for this update to both the MBP and MBA is that it's taking Apple some time to research and implement its solution. I mean, Nvidia decided to stop making chipsets for Nehalem/Arrandale CPUs just a few months ago. But Apple had to be prepared for the possibility that Nvidia would not be providing chipsets for Core i-series CPUs, as the legal battle had played out all over the web for six months. The question is what Apple does about it. Does it stay with Core 2 Duo for one more update to take advantage of Nvidia chipsets/GPUs? Does it find a dedicated solution? Is Intel's IGP really "acceptable" now? Or is a hybrid auto-switching graphics solution really the right move?

This is not just about power savings, sunk costs, or a decision for the MBA. Apple's entire computer line currently uses the Nvidia 9400M GPU/chipset at least in low-end models (except the Mac Pro). I expect this update round has taken so long because Apple has chosen to move forward with Intel's Arrandale CPUs. It went with Core i5/i7 in the iMac, but that didn't really mean anything, as Apple stuck with Nvidia in the low-end model. Even if we don't get an MBA update right away, I believe we will know the outcome for our beloved MBA once the MBP is updated. I expect Apple to use one set of graphics drivers and focus H.264 and OpenCL across its entire line, just as with the current Nvidia 9400M.

I would gladly take a Core i7-640LM at 2.13 GHz (boost to 2.93 GHz), 25W, 4MB L3, with the IGP turned off to hopefully yield 17W or less... add an ATI dedicated card or an Nvidia 330M, or even Optimus to use both the Arrandale IGP and the dedicated card. I don't know what the numbers will support, and I don't know whether turning off the IGP saves that much power. I sure wish I knew what turning off the IGP means for the TDP of the Arrandale CPU.

Of course, we aren't going to know Apple's decision on the Intel IGP included in Arrandale CPUs yet. But that's just my point. A hackintosh isn't the answer either, as the driver Apple uses is critical. My real point is that there's NO REASON to believe the IGP included with Intel's Arrandale CPUs is even equivalent to the 15-month-old Nvidia GPU Apple currently uses in almost all of its Macs. How does Apple sell a loss in performance to us? After everything it did with Nvidia, OpenCL, Grand Central Dispatch, and OS X Snow Leopard, how does Apple just accept Intel's IGP? I just don't see Intel providing the right solution with its Arrandale IGP.

Like I said before, I am not going to accept that Intel's IGP is "comparable" in any way to the 15-month-old Nvidia GPU until I see a study using a scientific method stating as much within the OS X environment, or until I can use one myself. People often assume that what is true on one platform is true on another, but that's not how computer software works; different platforms have different weaknesses. I think it's very fair to insist on a scientific study. I can publish anything on the Internet, and I can "prove" anything on the Internet, but that doesn't make it correct or true. Everything can be twisted to mean whatever the author wants. Forget everything you read... or you'd better trust your source.