hook up a 24" display and try doing any editing....
SLOW SLOW SLOW on the 9400m
I'm not sure what leads you to believe that driving an external display is even the slightest bit taxing to a modern GPU. It uses some more memory and that's about it.
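For perspective, the framebuffer cost of an external panel is easy to ballpark. A minimal sketch, assuming a 24" display at 1920x1200 and 32-bit color (both figures are my assumptions, not specs from this thread):

```python
# Rough video-memory cost of driving an external display.
# Assumed: 1920x1200 panel, 32-bit color (4 bytes per pixel).
width, height, bytes_per_pixel = 1920, 1200, 4
framebuffer_mb = width * height * bytes_per_pixel / (1024 * 1024)
print(f"one framebuffer: {framebuffer_mb:.1f} MB")

# Even double-buffered, this is a small slice of 256 MB of shared memory.
double_buffered_mb = framebuffer_mb * 2
print(f"double-buffered: {double_buffered_mb:.1f} MB")
```

Under those assumptions, one buffer is under 9 MB, which is why simply driving the display costs "some more memory and that's about it."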
well yeah, and 9400m has shared memory...
whatever.
do you people do anything but browse the web?
edit:
i mean, here you are, a bunch of "professionals" who claim the 9600GT and the 9400m with shared memory have little to no performance difference, with no benchmarks or anything to back you up.
okay then, there's no difference, 256MB of shared memory is as good as 512MB of discrete memory! alright! because 4 gigabytes is more than enough anyway
This is simply not true. Aperture is GPU accelerated. Motion relies heavily on the GPU. FXPlug plugins in Final Cut often use the GPU. Color is largely GPU-based. There are a fair number of GPU-accelerated operations in Photoshop CS4 now. And OpenCL means a lot more software will probably be leveraging the GPU in the future.
Don't any of you like to relax and game a little on your laptop? Maybe on a trip when you have free time.
I do like to relax, but I don't play computer games and haven't for years. I do have a console to play games, which is probably why I don't own any computer games.
What entitles me to have a better notebook with higher build quality, performance, priority service, and exclusive design and label?
Well the answer to that is pretty simple: because I paid more. And people who pay more deserve more, it's pretty straightforward.
I have to add something small and maybe stupid to these posts.
Don't any of you like to relax and game a little on your laptop? Maybe on a trip when you have free time.
Maybe I am in the minority, but I would like my laptop to at least be capable of having some fun sometimes (playing games).
I'm glad you posted; you're evidence that this entire GPU confusion is just generating Apple more sales on higher-end laptops - even when the user does not need a discrete GPU.
Says who? Apple has made a claim that Snow Leopard is paving the way for such technology - but so far, their roadmap makes no concrete promises about what this technology will deliver. By the time OpenCL is widely used, you'll be buying a different Mac.
And you're right, allowing Aperture to rotate your picture 10-20% faster or allowing the pixels to populate faster must really have warranted a $300 add-on.
whatever.
do you people do anything but browse the web?
I'm a computer engineer when I'm not pressing "refresh" on the macrumors site.
i mean, here you are, a bunch of "professionals" who claim the 9600GT and the 9400m with shared memory have little to no performance difference, with no benchmarks or anything to back you up.
Nobody has ever claimed that. There is clearly a performance discrepancy between the two for certain (i.e. 3D) workloads. What is being claimed is that the majority of applications people use for "video editing" or "graphics work" are in reality CPU-bound and rely on the GPU very little. Yes, there are certainly applications which take better advantage of the GPU than others. Someone provided a list of GPU-accelerated programs a few posts ago. If you use those applications on a regular basis, you're well aware of what they are and why you need a powerful GPU.
okay then, there's no difference, 256MB of shared memory is as good as 512MB of discrete memory! alright! because 4 gigabytes is more than enough anyway
If you're not actually using that physical memory anyway, then the actual performance difference is very, very small. Adding more RAM to your machine does nothing if you're not using it. I'd be very curious to know what you think you're doing on your laptop that regularly consumes 4GB of physical memory, though.
I think the main point here is that for MOST purposes, people (including many pros) will be fine using either GPU on a PORTABLE computer. If you want genuine speed, go with a desktop. Many apps don't even use the GPU, which means there will be no difference since nothing will be offloaded to it. Aperture runs very well with the 9400m connected to a 22" or 24" display -- I would be hard pressed to notice a difference. Sure, I could benchmark, but the key for me is whether the speed gets in my way, and it doesn't.
For what it's worth, operations running on the GPU in CUDA (under Windows) or, presumably, OpenCL can be something on the order of twice as fast on a 9600M GT vs a 9400M. The 9400M has 16 stream processors; the 9600M GT has 32. It's probably clocked a lot faster than the 9400M as well, but I can't be bothered to look up what they're set at in the newest MBPs.
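As a rough sanity check on that "order of twice as fast" figure: peak throughput scales with stream-processor count times shader clock. The processor counts are from the post above; the clock values below are assumed placeholders, not measured from the actual MBP parts:

```python
# Back-of-envelope peak-throughput ratio between the two GPUs.
# Stream-processor counts per the discussion above (16 vs 32);
# shader clocks in GHz are ASSUMED illustrative values.
sp_9400m, sp_9600m_gt = 16, 32
clock_9400m_ghz, clock_9600m_gt_ghz = 1.1, 1.25

peak_ratio = (sp_9600m_gt * clock_9600m_gt_ghz) / (sp_9400m * clock_9400m_ghz)
print(f"theoretical peak ratio: {peak_ratio:.2f}x")
```

Under those assumptions the 9600M GT lands a bit above 2x, consistent with "something on the order of twice as fast" for compute-bound kernels; real workloads will vary with memory bandwidth and how well the problem parallelizes.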
A lot of operations do not depend on the GPU; the OP is right in this regard.
Power users, however, I believe are justified in wanting the discrete GPU even if they don't play games. Anything that does a lot of stupid, parallel math will benefit greatly from OpenCL. Engineers doing finite element analysis, 3D scanning, people doing video encoding and other operations on images and video - they will all see a huge benefit from offloading the "dumb" math to the GPU.
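To make "stupid, parallel math" concrete, here is a toy per-pixel operation in plain Python. The point is only the shape of the work: each output element depends solely on its own input element, which is exactly what OpenCL can fan out across GPU stream processors. (A pure-Python stand-in, not a real kernel.)

```python
# Embarrassingly parallel per-pixel math: no element depends on any other,
# so on a GPU each pixel could be handled by its own work-item.
def brighten(pixels, factor):
    # Scale each pixel value and clamp to the 8-bit range.
    return [min(255, int(p * factor)) for p in pixels]

result = brighten([10, 128, 250], 1.5)
print(result)  # -> [15, 192, 255]
```

Video encoding, FEA, and image filters are built out of millions of independent operations like this one, which is why they gain so much from GPU offload while branch-heavy application logic does not.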
I don't believe OpenCL will be fully realized (or even fully baked) within the life cycle of this MacBook Pro model, but there will almost certainly be updates to major programs that take advantage of its capabilities, and that certainly isn't hurt by having over twice the GPU power available.
What I'd really like to see is running the 9400M to drive your display and do some light OpenCL/CUDA work, then using the 9600M GT entirely in display-less computation mode. That'd be awesome.
Says me. I've tested Snow Leopard and there is a difference between the 9400 and 9600.
And Aperture's difference can be a lot more than 20% depending on what you are doing.
Man are you arrogant. Suddenly wake up this morning and decide you know more than the rest of the universe? Give it a rest.
Right, and I developed Snow Leopard.
I will not believe rumors from some random person on the forums. Besides, Snow Leopard has evolved drastically over the past few months, according to other rumors.
I suggest everyone read up on a piece of technology called Quartz Extreme, something that Apple has included in the OS for the past few years now. After you read, install the Xcode Tools and run a program called "Quartz Debug".
With Quartz Debug running:
- Select Tools > Show Frame Meter
- Select Tools > Disable Quartz Extreme
After this is complete, go about your normal day-to-day work duties and keep track of the frame rate and CPU utilization. You will get strange artifacts on the screen when using some menus and such, because OS X relies heavily on the GPU for normal screen rendering.
After you have a good test sample, go back to the Quartz Debug tool and re-enable Quartz Extreme. You should notice a reduction in CPU use and an increase in average frame rate, regardless of what task you are doing.
In my test I was watching a 720p movie trailer that was downloaded to my hard disk (Surveilance.mov) with no other programs running. Each test was done after a normal reboot with Wi-Fi enabled, and with Quartz Debug and QuickTime as my only open applications.
Below are my results.
The big gauge indicates frame rate, while the small gauge shows CPU utilization. The test was done on a MacBook Pro 2.4GHz with dual graphics.
9400 Quartz Extreme Disabled
View attachment 175503
9400 Quartz Extreme Enabled
View attachment 175504
9600 Quartz Extreme Enabled
View attachment 175505
Please note the differences in frame rate and CPU utilization between the 9400 and 9600, and compare them to the 9400 with Quartz Extreme disabled. This was a simple, steady office scene; in scenes with more action, the gap will widen considerably (and be harder to capture in a controlled experiment). When you have more GPU-aware OS X tasks going on, or applications that can utilize the GPU, the gap widens and becomes more noticeable to the user.
This is just an example to prove that the current version of OS X already utilizes the GPU for some tasks, and that there is already an appreciable difference in performance between the discrete and integrated GPUs for something as simple as playing a movie. When multitasking (watching a movie while doing light photo edits in iPhoto), and especially while multitasking on multiple monitors, the gap widens noticeably without the use of software meters, depending on the user and the tasks they are working on.
In all cases, feelings of computer performance are always going to be highly subjective, and equally opinionated.
Regardless, buy what you need and can afford, not what someone else's benchmarks can prove. If you don't need the more expensive system, and don't think you need or want dedicated graphics, don't get them. I simply posted this to show that there is a day-to-day difference between the 9400 and 9600 in regular, simple uses today.
bcaslis: great post...
Interesting results! Thank you for the evidence.
That said, 95% of MBP users still will not and do not need discrete GPUs.