I hate that now, each time I unplug my MacBook Pro from the adapter, I have to close every single program, log out, log back in, and reopen everything, then do the same thing in reverse when I plug it back in.

With the discrete graphics I barely get 2 hours of battery life, but with the integrated graphics I can't even scroll an image in Photoshop without it looking like it's being done on a 10-year-old PC.

Logging out is ridiculous! Don't the current chips support switching from one to the other without logging out? I've heard they do, and that the only problem is that OS X doesn't support it. So I don't think we actually have to wait for new hardware; we just need Apple to update the OS. But of course that will never happen: they want us to buy a brand new computer just to get new software. Like always.
 
Sad if true!

Not so much because hot GPU switching is bad, but rather because it indicates two other really bad things. One is that Apple doesn't have enough influence with Intel to block Intel's terrible market-control efforts. Two is that Apple is still using Nvidia parts in the new laptops. Those two things combined should worry anybody who wants to see innovation move forward in the industry.

As has been pointed out, Intel's integrated GPUs are terrible even for the common user. What is even worse is that the GPU isn't much good for anything other than wasting power when used in conjunction with a discrete GPU.

This is really sad because, one, I would love to see ATI GPUs in the MBPs, and two, someone with influence needs to show Intel the ignorance of its ways.

Why do I say Intel is ignorant here? Simple, really: they don't seem to realize that one size doesn't fit all, and that a Yugo-grade GPU in an Apple product disgusts a lot of people. They really need to offer an Arrandale-type chip with a suitable interface for an external GPU and delete all those wasted transistors in their integrated GPU. Frankly, I'd be shocked if Apple implemented these chips in their higher-end products at all. Even in a MacBook, Intel's integrated GPU would be a step backwards for the user.


Dave
 
Now I am all for discrete GPUs, but for your average user they are probably overkill. Plus Apple seems to only boast about great battery life when running on the IGP.
My other pet peeve is Apple offering Pro notebooks without offering Quadro or FirePro graphics.

The battery stats for Apple either won't change or may get better. They base battery life on web browsing, WiFi, listening to music, and document editing. None of these things need the discrete GPU to be active.
 
Now I am all for discrete GPUs, but for your average user they are probably overkill. Plus Apple seems to only boast about great battery life when running on the IGP.
My other pet peeve is Apple offering Pro notebooks without offering Quadro or FirePro graphics.

The current low-end MBP uses the 9400M which is a pretty good chip for GPGPU stuff (think OpenCL and CoreImage). The Arrandale "GPU" can't do GPGPU. That's a big problem for Apple because they've been pushing OpenCL with developers so hard. That means that the new low-end MBP, without a discrete GPU, will perform a lot worse with OpenCL than the current MBP.

This isn't going to be something that just game players and 3D modelers will notice. There are all sorts of surprising things that could run a lot slower without a discrete graphics chip.
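To make the fallback concrete, here is a toy sketch (in Python, with made-up device names and a made-up `supports_opencl` flag, not any real OpenCL API) of the kind of device-selection logic an OpenCL app performs: if no GPU advertises compute support, the work falls back to the CPU, which is exactly what would happen on an Arrandale-only machine.

```python
# Hypothetical device-selection sketch. Device entries and the
# "supports_opencl" flag are invented for illustration; a real app
# would query the OpenCL platform layer instead.

def pick_compute_device(devices):
    """Prefer a GPU that supports GPGPU; otherwise fall back to the CPU."""
    gpus = [d for d in devices if d["type"] == "gpu" and d["supports_opencl"]]
    if gpus:
        return gpus[0]["name"]
    # Arrandale-style IGP: present, but no GPGPU support -> CPU fallback
    return next(d["name"] for d in devices if d["type"] == "cpu")

# Current low-end MBP: the 9400M advertises OpenCL support
current_mbp = [
    {"name": "Core 2 Duo", "type": "cpu", "supports_opencl": True},
    {"name": "GeForce 9400M", "type": "gpu", "supports_opencl": True},
]
# Hypothetical Arrandale-only machine: IGP present but no GPGPU
arrandale_only = [
    {"name": "Core i5", "type": "cpu", "supports_opencl": True},
    {"name": "Intel HD (Arrandale)", "type": "gpu", "supports_opencl": False},
]

print(pick_compute_device(current_mbp))     # GeForce 9400M
print(pick_compute_device(arrandale_only))  # Core i5 (CPU fallback)
```

The point of the sketch is that the slowdown is silent: the app still runs, it just quietly lands on the CPU path.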
 
Yeah, it is "horrible"; that's why for games and GPU-intensive things you switch to the dedicated GPU. Chances are, if you were doing anything remotely GPU-intensive, you'd use the dedicated card, making the IGP's weakness a moot point. Why pay NVIDIA twice for a GPU when one GPU is already included in Intel's processor price?

What are you talking about? As the article says, this is about 2 GPUs - one is the Intel garbage, and one is an NVIDIA dedicated GPU. The nVidia technology switches between the two on-the-fly.
 
This is horrible news. Nvidia doesn't have a single halfway decent low-wattage laptop GPU out there.

The ATI 5830 and 5850 blow the hell out of any of Nvidia's offerings, and do it while using only 24 watts of power!! That's why the MBP knockoff, the HP Envy, which features nearly identical design, features, and power usage, opted for the ATI 5830 while offering a 1920x1080 screen, an i5 processor, Blu-ray, and a freaking 9-hour battery!! All for half the price of the MBP.

If ATI is out, then we can kiss any hope of the next MBP matching the performance of the HP Envy goodbye.
 
The "GPU" in the Arrandale CPU also includes the memory controller for the CPU. There is no easy way to not include it. Not to mention it's been benchmarked to be about equal to the 9400M, and supports full screen video acceleration and other good features such as an audio output!

And that's Intel's design mistake. The memory controller should be on the main die.
 
A lot to disagree with here.

The "GPU" in the Arrandale CPU also includes the memory controller for the CPU.
If this is true, it doesn't mean much, because memory controller technology has been nailed down for years.
There is no easy way to not include it. Not to mention it's been benchmarked to be about equal to the 9400M,
Well, I'm sure in tightly selected benchmarks it can be made to look good against three-year-old Nvidia technology. But that is not what Apple would build today.
and supports full screen video acceleration and other good features such as an audio output!
The thought of Intel audio has my ears cringing in pain. My ears could be wrong, of course. As for video, you still have the question of Intel quality here, besides the fact that other GPUs have also been accelerating video for some time.

The bigger issue is what the Intel IGP doesn't support well: 3D for one, and OpenCL for another. The reality is that the drop in performance can be dramatic on Apple technology, because Apple does try to leverage the GPUs in its systems. Thus what might look like a good Windows GPU ends up performing terribly on a Mac. In a nutshell, I don't believe the performance numbers for these GPUs until I see them in Apple hardware running benchmarks suited to the workloads there. Even then I'd be skeptical.



Dave
 
The "GPU" in the Arrandale CPU also includes the memory controller for the CPU. There is no easy way to not include it. Not to mention it's been benchmarked to be about equal to the 9400M, and supports full screen video acceleration and other good features such as an audio output!

There you go bringing things like "facts" into the weekly "Intel IGP Two Minutes Hate"... :)
 
The thought of Intel audio has my ears cringing in pain. My ears could be wrong of course. As to video you still have the question of Intel quality here, besides the fact that other GPUs have also been accelerating video for some time.
I'm sure Realtek audio is premium as well. :rolleyes:

Intel passes this time around for video playback and light gaming. I'd still rather have a dedicated GPU though.
 
What are you talking about? As the article says, this is about 2 GPUs - one is the Intel garbage, and one is an NVIDIA dedicated GPU. The nVidia technology switches between the two on-the-fly.

Ah, misunderstanding on my part. There I was thinking NVIDIA just wanted to put a shiny 9400M + 9600M combo again. But then if that's the case, don't Radeon cards already do this?
 
When referring to the issue of separate GPUs:

discreet |disˈkrēt|
adjective ( -creeter, -creetest )
careful and circumspect in one's speech or actions, esp. in order to avoid causing offense or to gain an advantage : we made some discreet inquiries.
:(


discrete |disˈkrēt|
adjective
individually separate and distinct : speech sounds are produced as a continuous sound signal rather than discrete units.
:)

Mac OS 10.6.2 Dictionary Widget
 
If iGPU performance takes a hit

I remember the fiasco (or complaints, whatever you want to call them) the first time around when Apple used the Intel integrated GPUs in their laptops... all the lines suffered for it, and that's why they jumped at having the 9400 integrated GPU in there.

If they go the integrated Intel route again and it's "only as good as the 9400" or worse (including lacking support for newer encoding and graphics-language standards), that could seriously impact their laptop sales on everything from the Air up to the Pros. Then their only choice will be to lower prices due to the lower perceived value of their offerings (yes, a lower price is always better from a buyer's perspective, but I'd prefer that not be due to a performance hit in a slower next-gen Apple product!)

An additional thought: even if the new Intel IGPs are as good as current offerings, software companies aren't sitting still. Every time a software company puts out a new version of its software or a new product, it inevitably raises the bar (at least most of the time) on CPU, GPU, and memory usage. So even holding steady on IGP performance could be a step backwards for any of the Apple laptop models that have integrated graphics only.

Let's hope that Apple looks at this carefully and arrives at a good solution... I wonder if AMD has any half-decent integrated chipsets for the new Intel processors? Their website isn't that specific from what I can find, and all the motherboards for the i-series processors seem to have Intel chipsets only.
 
Others have already mentioned this but many don't understand the technical issues.

The battery stats for Apple either won't change or may get better. They base battery life on web browsing, WiFi, listening to music, and document editing. None of these things need the discrete GPU to be active.

This is a blanket statement that simply isn't valid. That is due to the way Apple has implemented GPU technology to accelerate the Mac. If you have a GPU that implements OpenCL poorly or not at all, you will see a dramatic drop in performance in some apps and in the system overall. Plus, Apple continues to leverage the tech in newer software releases; I would not be surprised to see a GPU-accelerated Safari in the future (beyond what is already done with system libraries).

In a nutshell, a crappy GPU screws the owner with today's Mac OS X, and it gets even worse as more software is updated or created to leverage the GPU. To put it bluntly, right now I can't recommend that anybody purchase a Mac with an integrated Intel GPU. That goes for general-purpose usage.

Look at it another way: the iPhone gets rave reviews for its graphics performance. The rumor is that Apple leveraged the GPU in the iPhone to a greater extent than it has yet in Mac OS X. In other words, the Mac GPUs are underused relative to other Apple hardware. As Apple slowly implements and refines the techniques used in Snow Leopard, the GPU becomes more important to the user experience.

I just see your statements as really bad advice that effectively misinforms people about what they would be getting. The reality is that Apple would likely be better off with AMD tech than an Arrandale with a crappy Intel integrated GPU. One could hope that I'm wrong and that the Intel stuff works better in an Apple environment than implied, but that is not a given. Hence the suggestion not to buy without considering what is offered. There is a real possibility of a step backwards, farther than you might imagine.


Dave
 
This is a blanket statement that simply isn't valid. That is due to the way Apple has implemented GPU technology to accelerate the Mac.
OpenCL is moving along much too slowly to begin with.

I'm sure the 500 ms it takes to swap over from the IGP to the discrete solution will be noticeable. :rolleyes:
 
I don't quite get it. Why so much commotion about the integrated GPU in the latest Intel processors in the Apple world? I don't see any of it in the PC world. Is it because Apple produces only one laptop model for each screen size? PC vendors produce IGP-based laptops for those who don't need powerful graphics, and laptops with both an IGP and discrete graphics (NVIDIA or ATI) for those who do. What is the problem? Is it because Apple only works with one GPU supplier at a time?
 
I really hope Apple goes with ATI graphics for its next GPU. ATI's HD5000 series is more forward-looking, with DX11 support and supposedly OpenCL 1.1 support, while nVidia's 300M series is just a rebranded 200M series. ATI's current GPUs also offer better performance per watt, with the HD5730, HD5750, and HD5830 having TDPs close to the nVidia GT330M's 23W TDP (the same TDP as the current 9600M GT) while offering much better performance.

While nVidia Optimus is convenient, it will probably actually increase max power and heat, since in discrete-GPU mode the IGP also has to stay active: the discrete GPU copies its frame buffer to the IGP's frame buffer and uses the IGP's display pipeline. This will also limit max CPU performance, since requiring the IGP to be active all the time eats into the CPU's TDP, preventing the higher Turbo Boost modes. If Apple can implement no-logout IGP/GPU switching with ATI's HD5000 series, I would prefer that over nVidia Optimus and the current 300M series, at least until nVidia releases its DX11 mobile GPUs.
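The TDP-sharing argument can be sketched with some back-of-the-envelope arithmetic. All wattages below are illustrative assumptions, not Intel specifications: the point is only that whatever the IGP draws comes out of the same package budget the cores would otherwise use for Turbo Boost.

```python
# Toy model of CPU/IGP TDP sharing on an Arrandale-style package.
# All numbers are invented for illustration, not real Intel figures.

PACKAGE_TDP = 35.0   # assumed total package TDP (CPU cores + IGP)
IGP_ACTIVE = 5.0     # assumed IGP draw when Optimus keeps it powered on
IGP_GATED = 0.5      # assumed residual draw when the IGP is power-gated

def cpu_turbo_headroom(igp_draw):
    """Watts left over for the CPU cores to spend on Turbo Boost."""
    return PACKAGE_TDP - igp_draw

print(cpu_turbo_headroom(IGP_GATED))   # 34.5 W for the cores
print(cpu_turbo_headroom(IGP_ACTIVE))  # 30.0 W -> lower sustained Turbo
```

Under these assumed numbers, keeping the IGP alive for the Optimus copy path costs the cores a few watts of Turbo headroom all the time, which is the poster's objection in a nutshell.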
 
Why not just put ONE powerful GPU in the MacBook instead of two so-sos? :confused:

It's about battery life. If you're on the go and don't need POWER graphics, then you can go without them and have great battery life. If you're at your desk with your MBP and battery life doesn't matter, then you can have POWER graphics too.
 
I always thought hot-switching would come to the current gen. So this is potentially disappointing news. I certainly won't be upgrading my original Unibody 15" MBP until the hot switching is confirmed. I wish I got more than 2 hours battery life though...
 
I always hoped some driver update would make this possible... I guess I'll have to wait till my next MBP for this functionality...

Windows can do it on the fly, even without using the new nVidia technology. It just blanks the screen for a second while it switches video cards. My point is, technically the MBP should be able to do it too, but Apple just doesn't want to/can't implement it in OS X as a software feature... otherwise I'd be willing to bet it would have been in SL.
 
Powerful equals battery-draining.

Furthermore, stupid Intel insists on packaging a weak-ass GPU in its chip packages, whether you want them or not.
In theory, with well-partitioned multiple clock domains, low-leakage transistors, smart downclocking, and the shutting down of unused sections, even a big chip can have decent minimum power levels. In theory, of course. Although ATI's HD5000 series seems to have very decent idle power, with the HD5870 idling something like 25% lower than the HD4870 despite having double the transistors.
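A toy model makes the clock-domain point concrete: a big chip's idle power is set by what you can gate off, not by its transistor count. The domain names and wattages below are invented purely for illustration.

```python
# Toy power-gating model. Domains and wattages are made up; the point
# is that idle power depends on which blocks can be shut off.

domains = {
    "shader_array": {"active_w": 120.0, "gated_w": 1.0},
    "memory_ctrl":  {"active_w": 15.0,  "gated_w": 3.0},  # can't fully gate
    "display":      {"active_w": 5.0,   "gated_w": 5.0},  # must stay on
}

def package_power(gated):
    """Total chip draw given the set of power-gated domain names."""
    return sum(d["gated_w"] if name in gated else d["active_w"]
               for name, d in domains.items())

print(package_power(set()))                            # full load: 140.0 W
print(package_power({"shader_array", "memory_ctrl"}))  # idle: 9.0 W
```

So a chip that burns 140 W under load can idle near 9 W if its biggest blocks gate well, which is why doubling the transistor count doesn't have to double idle power.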
 
Now I am all for discrete GPUs, but for your average user they are probably overkill. Plus Apple seems to only boast about great battery life when running on the IGP.
My other pet peeve is Apple offering Pro notebooks without offering Quadro or FirePro graphics.

You're right, it wouldn't be useful for the average user, but then they should buy a non-"Pro" model.
 
Powerful equals battery-draining.

Furthermore, stupid Intel insists on packaging a weak-ass GPU in its chip packages, whether you want them or not.

Not necessarily true.

You don't have to have the GPU run at its full capacity all the time, and it can be calibrated to conserve battery juice. I'd rather have one more powerful GPU stepped down than two obsolete cards for the computer to dynamically transition between.

Replaceable batteries with a greater cell count are another solution... except for a unit with a non-removable battery... :rolleyes:
 