Apple doesn't release systems with discrete GPUs to appease the gaming market.

They find a sweet spot that provides the best power consumption and price at volume, and that offers the deepest and broadest capabilities they can leverage for their OS. That means GCD/OpenCL/OpenGL, which more and more games are leveraging, but which far more scientific, graphics, and multimedia applications now expect to be optimized for.
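For a concrete sense of what leveraging the OS means here, below is a minimal sketch (my own illustration, not Apple code) that enumerates the OpenCL devices OS X exposes; this is the same entry point GPGPU-aware apps use to find a GPU to offload to:

[code]
// Minimal OpenCL device enumeration on OS X (illustrative sketch).
// Build (assumed): clang++ devices.cpp -framework OpenCL -o devices
#include <OpenCL/opencl.h>
#include <cstdio>

int main() {
    cl_platform_id platform;
    cl_uint num_platforms = 0;
    if (clGetPlatformIDs(1, &platform, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
        std::printf("No OpenCL platform found\n");
        return 1;
    }

    cl_device_id devices[8];
    cl_uint num_devices = 0;
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 8, devices, &num_devices);

    // Print each device (CPU and any integrated/discrete GPUs) by name.
    for (cl_uint i = 0; i < num_devices; ++i) {
        char name[256] = {0};
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        std::printf("Device %u: %s\n", i, name);
    }
    return 0;
}
[/code]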

Nvidia continues to design its GPGPUs around a pipeline optimized for CUDA. OpenCL has won this world, and they are finally accepting it with their contribution of the NVIDIA PTX code stack to the LLVM/Clang project a few weeks back.

I'm building against it daily. The PTX stack had come a long way without Nvidia's help, even before they dropped this source stack into trunk. AMD has been using LLVM/Clang for several releases to optimize for their designs.

LLVM/Clang 3.1 is stamped.

LLVM/Clang 3.2 will finish off the C11 and C++11 compliance work, along with more goodies added by Apple and other third parties. That will pay off not only with FreeBSD moving to LLVM/Clang for version 11, but with more and more Linux platforms offering the option of building their entire distributions with either LLVM/Clang or GCC.

All this is made possible thanks to Apple seeing and filling needs in a world where hardware is not being optimally leveraged, and where GPUs have more or less wasted over a decade pandering to gaming as if it were a real measure of what a GPGPU can do.

Games are benefiting, but let's face it, games aren't the be-all and end-all standard driving graphics capabilities the way they once were. More and more, parallel programming is impacting everyone's lives, whether they know it or not. In five years people won't be waxing on about fps and 1080p as the benchmark standards, but about photorealism in games without tearing and without taxing the system. Developers will have to do less trickery, and consumers will only notice how pretty the textures are, so long as their systems have discrete GPUs in them.
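To make the parallel-programming point concrete, here's a toy sketch (my own, plain C API, nothing Apple-specific beyond GCD itself) of the dispatch pattern that lets ordinary apps fan work out across every core without hand-rolled threads:

[code]
// Toy GCD example: fan a loop out across the cores with dispatch_apply_f.
// Build (assumed): clang++ gcd_demo.cpp -o gcd_demo
#include <dispatch/dispatch.h>
#include <cstdio>
#include <cmath>

// Per-iteration worker; "results" arrives via the context pointer.
static void do_chunk(void *context, size_t i) {
    double *results = static_cast<double *>(context);
    results[i] = std::sqrt(static_cast<double>(i)); // stand-in for real work
}

int main() {
    const size_t chunks = 16;
    double results[chunks];

    // GCD schedules the iterations concurrently on a global queue.
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_apply_f(chunks, queue, results, do_chunk);

    for (size_t i = 0; i < chunks; ++i)
        std::printf("chunk %zu -> %.3f\n", i, results[i]);
    return 0;
}
[/code]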

Intel will have to copy AMD's APU approach sooner rather than later.
 
NVIDIA - no thank you! My iMac didn't make it past 4 years before the NVIDIA graphics card went up in smoke.
 
Bear in mind that Optimus does not work with 3D panels. The discrete video is stuck running even at idle.

I believe that there is no Optimus on the Asus with the non-3D panel either. I saw the Best Buy version of the Asus and it is a beast. Over 9 lbs, cooling vents the size of jet engine air intakes, about 2" thick at the hinge side. The battery life must be atrocious. But it is one powerful laptop.
 
Here are the full specs of the GPU with benchmarks:

http://www.notebookcheck.net/NVIDIA-GeForce-GT-650M.71887.0.html

Text:
The NVIDIA GeForce GT 650M is a mid-range, DirectX 11.1 compatible graphics card that was announced in the first quarter of 2012 for laptops. It is a Kepler-based GPU built on the GK107 architecture and is manufactured in 28nm at TSMC. The graphics card uses a 128-bit wide memory interface with either the more common but slower DDR3 for VRAM or the more expensive and faster GDDR5. Due to a higher core clock of up to 850 MHz, the GT 650M is noticeably faster than the 640M.

Architecture

The Kepler architecture is the successor to the Fermi architecture that first appeared in laptops with the GeForce 400M series. The GK107 Kepler core offers two shader blocks, called SMX, each with 192 shaders for a total of 384 shader cores that are clocked at the same speed as the central core. Although more shader cores are available in the Kepler architecture as compared to the Fermi design, the Kepler shaders are still expected to be up to twice as power efficient. However, due to the missing hot clock of the shader domain, two shaders of a Kepler chip are about as fast as one shader of a Fermi chip (as the latter is clocked twice as fast). PCIe 3.0 is now supported by the mobile Kepler series and an optional Turbo mode can automatically overclock the Nvidia card by a theoretical 15 percent if the laptop cooling system allows it. The implementation of this boost mode is done in the BIOS, but it is ultimately dependent upon the manufacturer of the laptop.

Performance

The gaming performance of the GeForce GT 650M equipped with DDR3 graphics memory lies somewhere in the former 2011 high-end category between the GeForce GTX 460M and GTX 560M. The performance is exceptionally good in shader-heavy DirectX 11 games and benchmarks. However, the 128-Bit memory interface can be a bottleneck if DDR3 graphics memory is employed. Despite the slower core clock of only 735 MHz, the GDDR5-version of the card should be much faster. Demanding games of 2011 like Battlefield 3 will be playable in 1366x768 and medium or high settings. Less demanding games, such as Modern Warfare 3, are easily playable with maxed out settings and 1080p resolution.

Features

The improved feature set now includes support for up to 4 active displays. Furthermore, high-resolution monitors of up to 3840x2160 pixels can now be connected using DisplayPort 1.2 or HDMI 1.4a if available. HD audio codecs, such as Dolby TrueHD and DTS-HD, can be transmitted via bitstream mode through the HDMI port. However, as most laptops will feature Optimus, the integrated GPU will likely have direct control over the display ports and may limit the feature set offered by the Nvidia Kepler cards.

The 5th generation PureVideo HD video processor (VP5) is also integrated in the GK107 core and offers hardware decoding of HD videos. Common codecs such as MPEG-1/2, MPEG-4 ASP, H.264 and VC1/WMV9 are supported in hardware; H.264 is supported up to 4K resolutions, while VC1 and MPEG-4 are supported up to 1080p. Two streams can be decoded in parallel for features such as Picture-in-Picture. Another novelty is the inclusion of a dedicated video encoding engine similar to Intel QuickSync that can be accessed through the NVENC API.

The power consumption of the GeForce GT 650M should be best suited for medium-sized notebooks 15-inches or greater.
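As a rough back-of-the-envelope check on the architecture numbers quoted above (my own arithmetic, not NotebookCheck's, and it assumes one fused multiply-add, i.e. two FLOPs, per shader per clock):

[code]
// Back-of-the-envelope peak throughput for the GT 650M's GK107 core.
// Assumes 2 FLOPs (one FMA) per shader per clock; real-world figures are lower.
#include <cstdio>

int main() {
    const double shaders       = 384;    // 2 SMX blocks x 192 shaders
    const double core_ghz      = 0.850;  // up to 850 MHz on the DDR3 variant
    const double flops_per_clk = 2.0;    // one FMA counted as two FLOPs

    const double peak_gflops = shaders * core_ghz * flops_per_clk;
    std::printf("Theoretical peak: ~%.0f GFLOPS single precision\n", peak_gflops);
    return 0;
}
[/code]

That works out to roughly 650 GFLOPS of theoretical single-precision throughput for the 850 MHz DDR3 variant; the 735 MHz GDDR5 version trades a little of that peak for far more memory bandwidth.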
 
Apple writes the driver so don't throw out the Catalyst excuse.
I'm pro-ATi, and have been for some time (also pro-AMD), but you have to admit that the latest nVidia chips are quite a bit better than ATi's chips performance/consumption-wise. Though ATi currently has better yields than nVidia, which is odd considering they're made in the same fab with the same process.

Also...

Apple doesn't write graphics drivers. What have you been smoking lately?
 
I believe that there is no Optimus on the Asus with the non-3D panel either. I saw the Best Buy version of the Asus and it is a beast. Over 9 lbs, cooling vents the size of jet engine air intakes, about 2" thick at the hinge side. The battery life must be atrocious. But it is one powerful laptop.
The 500M series introduced Optimus for the higher end GTX variants. Anything based on Kepler or just rebranded Fermi is going to support Optimus on a 2D panel.

ASUS G7x series are desktop replacement notebooks that are not going to be heard at load.
 
Well, the 650M is currently what the Alienware M14x appears to be using:

http://www.alienware.com/Landings/laptops.aspx

Higher-end models use the GTX 675M.

I wonder if Apple will be able to compete there with their higher-end models?

I must say I'm impressed. If Apple can even approach the bottom line of a respectable Alienware laptop...

Apple seems only half-interested at best in gamers, which is unfortunate IMHO. I think a healthy imagination is good for all people, not just kids... Also, games IMHO are the best way to test a computer's muscle.
 
NVIDIA is the reason my iMac keeps freezing. Thanks for the faulty GPU :mad:

Nvidia is the reason my Core i7 minitower screams with 1.5 GiB of VRAM - after the factory 512 MiB Radeon HD smoked itself and died.

As they say in the auto world - YMMV (your mileage may vary).

But shouldn't you be blaming Apple for not extending the warranty after shipping a system with a known faulty part? Apple is the reason that your defective iMac freezes.

My dead Radeon was a random failure, not like the Apple GPUs that are failing left and right due to faulty design. Apple sold a ton of systems with defective GPUs, but isn't standing behind them.
 
The 500M series introduced Optimus for the higher end GTX variants. Anything based on Kepler or just rebranded Fermi is going to support Optimus on a 2D panel.

ASUS G7x series are desktop replacement notebooks that are not going to be heard at load.

So they just chose not to use Optimus? I only read a little bit about the new G75 and the Samsung 700. I was curious how loud the Asus would be. It looks like it has an awesome cooling setup.
 
So they just chose not to use Optimus? I only read a little bit about the new G75 and the Samsung 700. I was curious how loud the Asus would be. It looks like it has an awesome cooling setup.
The GTX 560M in the older models supports Optimus according to nVidia's documentation.

NotebookCheck.net has a complete review of the new G75V. (V for 3D panel)
 
Apple doesn't release systems with discrete GPUs to appease the gaming market.

They find a sweet spot that provides the best power consumption and price at volume, and that offers the deepest and broadest capabilities they can leverage for their OS. That means GCD/OpenCL/OpenGL, which more and more games are leveraging, but which far more scientific, graphics, and multimedia applications now expect to be optimized for.

Nvidia continues to design its GPGPUs around a pipeline optimized for CUDA. OpenCL has won this world, and they are finally accepting it with their contribution of the NVIDIA PTX code stack to the LLVM/Clang project a few weeks back.

I'm building against it daily. The PTX stack had come a long way without Nvidia's help, even before they dropped this source stack into trunk. AMD has been using LLVM/Clang for several releases to optimize for their designs.

LLVM/Clang 3.1 is stamped.

LLVM/Clang 3.2 will finish off the C11 and C++11 compliance work, along with more goodies added by Apple and other third parties. That will pay off not only with FreeBSD moving to LLVM/Clang for version 11, but with more and more Linux platforms offering the option of building their entire distributions with either LLVM/Clang or GCC.

All this is made possible thanks to Apple seeing and filling needs in a world where hardware is not being optimally leveraged, and where GPUs have more or less wasted over a decade pandering to gaming as if it were a real measure of what a GPGPU can do.

Games are benefiting, but let's face it, games aren't the be-all and end-all standard driving graphics capabilities the way they once were. More and more, parallel programming is impacting everyone's lives, whether they know it or not. In five years people won't be waxing on about fps and 1080p as the benchmark standards, but about photorealism in games without tearing and without taxing the system. Developers will have to do less trickery, and consumers will only notice how pretty the textures are, so long as their systems have discrete GPUs in them.

Intel will have to copy AMD's APU approach sooner rather than later.

100% agree - I've been following LLVM since Apple became involved, and although many don't see it today, the improvements are coming on stream, with Mountain Lion being the first release to show that deep integration. From what I understand, in Mountain Lion the core technologies have moved from OpenGL 2.1 to OpenGL 3.2 (which is why some hardware support has been dropped), which will hopefully translate to a better experience overall when it comes to snappiness and GPU-accelerated web technologies such as canvas, JavaScript, etc.
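For anyone curious what that OpenGL 3.2 move looks like from the developer side, here's a rough sketch (my own, not anything Apple-official) of asking CGL for a 3.2 core-profile context and printing what the driver actually hands back:

[code]
// Request an OpenGL 3.2 core-profile context via CGL and report the version.
// Build (assumed): clang++ glcheck.cpp -framework OpenGL -o glcheck
#include <OpenGL/OpenGL.h>
#include <OpenGL/gl3.h>
#include <cstdio>

int main() {
    CGLPixelFormatAttribute attrs[] = {
        kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
        kCGLPFAAccelerated,
        (CGLPixelFormatAttribute)0
    };

    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    if (CGLChoosePixelFormat(attrs, &pix, &npix) != kCGLNoError || pix == NULL) {
        std::printf("No 3.2 core-profile pixel format available\n");
        return 1;
    }

    CGLContextObj ctx = NULL;
    CGLCreateContext(pix, NULL, &ctx);
    CGLSetCurrentContext(ctx);

    std::printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    std::printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

    CGLSetCurrentContext(NULL);
    CGLDestroyContext(ctx);
    CGLDestroyPixelFormat(pix);
    return 0;
}
[/code]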

Btw, have you been following their development of libc++ ( link )? Is there a status on when it might be merged and made available in Mac OS X, or is it still a while before one starts to see it appear? Having had a look at many of the presentations given on the matter, it really does seem to be a major game changer for C++ developers wanting to take advantage of the high-level features that make life easier when working on larger projects. When it comes to Clang/LLVM, it will be interesting to see how things work out once the whole operating system is compiled from top to bottom using Clang/LLVM, and what benefits will be yielded in terms of debugging and fixing bugs with fewer regressions, given the technologies they've merged in over the last few years.
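As a tiny taste of the sort of thing libc++ plus C++11 makes painless (a trivial example of my own, built with the -std=c++11 -stdlib=libc++ flags Clang already understands):

[code]
// Small C++11 sample of the conveniences libc++ ships: auto, lambdas,
// shared_ptr, and the <future> machinery.
// Build (assumed): clang++ -std=c++11 -stdlib=libc++ sample.cpp -o sample
#include <memory>
#include <vector>
#include <future>
#include <numeric>
#include <cstdio>

int main() {
    auto data = std::make_shared<std::vector<int>>(std::vector<int>{1, 2, 3, 4, 5});

    // Kick the summation off asynchronously; std::async/std::future come from libc++.
    auto total = std::async(std::launch::async, [data]() {
        return std::accumulate(data->begin(), data->end(), 0);
    });

    std::printf("sum = %d\n", total.get());
    return 0;
}
[/code]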
 
Nvidia is the reason my Core i7 minitower screams with 1.5 GiB of VRAM - after the factory 512 MiB Radeon HD smoked itself and died.

As they say in the auto world - YMMV (your mileage may vary).

But shouldn't you be blaming Apple for not extending the warranty after shipping a system with a known faulty part? Apple is the reason that your defective iMac freezes.

My dead Radeon was a random failure, not like the Apple GPUs that are failing left and right due to faulty design. Apple sold a ton of systems with defective GPUs, but isn't standing behind them.

It was discovered that the part was faulty after they shipped it. NVIDIA fixed the problem after some of the first ones were shipped. My dad bought the iMac I am using around release date, so it has the problem. Other people with the same iMac are problem-free, though. There are a few people online who complain about the same problem on the same model.

That makes it hard for me to sue Apple if I wanted to. It's easy for anyone to say that I just broke it myself or something because it's 2012, and I got the Mac in 2006, and others with the same model are not having the problem.
 
Your comment upsets me because you're probably right. It sucks that they can't get a GPU in their computers that's able to keep up with its PC counterpart. I would like to be able to own a Mac that could play current-generation games. I would like Mac gaming to be possible, considering they have such nice screens.

I guarantee you will not see 2 GB. It's Apple we are talking about. They'll give you 512 MB, custom version.
 
When was the last time machines held 500GB of RAM?

If you meant to say 500MB of RAM, I routinely see > 1GB of RAM on tasks.

Yes, my post was inaccurate; I was attempting to discuss (video) VRAM on the graphics card, not RAM.

500GB was supposed to be 512MB on the graphics card. :(

----------

I'm assuming you mean 500MB, and BF3 easily caps out at 2GB RAM and 2GB VRAM at 1080p.
Here are two pictures of BF3 at only 720p using almost 800MB of VRAM, and Adobe plus BF using almost 8 gigs of RAM.
http://i113.photobucket.com/albums/n223/stilev1/VRAM720p.png
http://i113.photobucket.com/albums/n223/stilev1/FrapsCPU.png

Yes I was trying to reference VRAM on the graphics board.

I think BF is a hardcore reference point, and most people who need to run these high-end 3D games would be better off with a gaming computer with a dedicated GPU. Would you agree?
 
NVIDIA - no thank you! My iMac didn't make it past 4 years before the NVIDIA graphics card went up in smoke.

Too right mate. My 2006 iMac with a 7600 GT lived to the ripe old age of 3 and 1/2 years when the bloody graphics card popped. A new one would have cost well over 200 euros and no luck on eBay.
 
Apple seems only half-interested at best in gamers, which is unfortunate IMHO. I think a healthy imagination is good for all people, not just kids... Also, games IMHO are the best way to test a computer's muscle.

Apple's primary market is still professionals. While gamers have been an expanding portion of the market, the fact is they aren't quite a big enough portion to seriously influence Apple's product decisions, although Apple has been listening, as evidenced by the fact that they upgraded the laptop power adapter to 85W in order to stick a 6790M with 1 GB of GDDR5 memory in a MacBook Pro.

Sorry if this has been posted...

From the looks of it, if they build it with GDDR5 then it's going to be very snappy.

If the GDDR5 in the 6770/6790M is any indication, then yes it will be.

Nvidia's Kepler architecture's actually pretty awesome, now that I think back on the fact that an ultrabook can run BF3 at native resolution w/ settings maxed out.

I think BF is a hardcore reference point, and most people who need to run these high-end 3D games would be better off with a gaming computer with a dedicated GPU. Would you agree?

If gaming's all you're interested in, then yes. But for me, portability, non-gaming performance, and non-gaming battery life are pretty big factors as well, so I think that the MBP is still something to seriously consider.
 
As long ago as last November, SemiAccurate claimed that Apple would be switching back to NVIDIA from AMD for the graphics chips in the next-generation MacBook Pro. With MacBook Pro rumors flooding out today, that claim is gathering renewed momentum from several sources.

In its roundup of the latest MacBook Pro rumors, ABC News specifically claims that Apple will be using NVIDIA graphics chips in the new MacBook Pro reportedly set for introduction at next month's Worldwide Developers Conference. The Verge offers similar claims, with the growing reports suggesting that the switch from AMD to NVIDIA may indeed be taking place. Apple has moved back and forth between NVIDIA and AMD several times over the years, taking advantage of whichever graphics chip firm is offering the better product with the right pricing and timing. Consequently, a shift to NVIDIA should not be taken as an indication that it is a long-term decision, although Apple has been rumored to be making a similar move for the Mac Pro.

Article Link: More Reports of NVIDIA Graphics Chips in Next-Generation MacBook Pro

They do like to switch these up from time to time. Frankly, I think it's silly to judge either NVIDIA or AMD as brands making video cards for these MacBook Pros. Yes, the GeForce 8600M GT was a disaster of epic proportions that affected not only the MacBook Pros at that time but the entire industry. That doesn't mean that the GeForce GT 330M was a bad card or that NVIDIA is a bad company with its GPUs. Similarly, most of the heating problems with early MacBook Pros could've been attributed to the ATI Radeon Mobility X1600; that doesn't mean that, many years later, the AMD Radeon HD 6770M is necessarily a bad card in terms of performance or reliability, because last I checked, for a laptop such as that one, it isn't. Analyzing these GPUs on a case-by-case basis seems to make more logical sense. In most cases, whether NVIDIA or AMD is the manufacturer, they will be faster than the GPU in the preceding rev of MacBook Pro.
 
Does anyone think there's a chance we will see proper graphics in the 13"?
I love the size of the machine but the graphics kill it for me!
I guess I will find out in a matter of weeks though!
 