
MacRumors

macrumors bot
Original poster


Architosh points us to a Guardian.co.uk article from last week that details the upcoming trend of using GPUs (graphics processors) for day-to-day computing. As the article points out, if you have a computer with either an ATI or NVIDIA graphics card, chances are you have more than 100 processor cores waiting to be used. While these cores have been optimized to deliver high-performance graphics for games and video, there is an effort underway to harness them for general-purpose computing.
Those GPU cores are the piranhas of processing. Because there are so many of them, they can chomp through tens of gigabytes of data in a second. But it has to be the right kind of data - something that can be parcelled up and delivered in bite-sized chunks to each core. In many cases, almost as soon as they have started working, the GPU piranhas will be waiting for the next chunk of meat. Managing that is hard and often it is just easier for a developer to have all the software run on a regular CPU.
Due to their specialized design, some tasks are better suited than others to the GPU. So far, research has focused on scientific workloads such as weather prediction, but there are efforts to standardize this style of programming.
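The "bite-sized chunks" description above is just data parallelism. Here's a minimal sketch in Python (the helper names are mine, and it runs sequentially; on a GPU each chunk would be handed to a different core):

```python
def chunked(data, size):
    # Parcel the data into bite-sized chunks, one per "core".
    return [data[i:i + size] for i in range(0, len(data), size)]

def process_chunk(chunk):
    # The kernel: the same operation applied to every element, with no
    # dependency between chunks -- the property that lets hundreds of
    # GPU cores chew through the data at once.
    return [x * x for x in chunk]

# Sequential stand-in for parallel dispatch across cores.
chunks = chunked(list(range(1000)), 100)
results = [process_chunk(c) for c in chunks]
flat = [y for c in results for y in c]
```

The hard part the article alludes to is keeping all the cores fed: if chunks are too small, the cores finish before the next "chunk of meat" arrives.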

Most industry support has coalesced around Apple's OpenCL specification, which Apple announced will arrive in the next major version of Mac OS X ("Snow Leopard"). Of course, not everyone is behind the initiative. As usual, Microsoft seems to have its own plans and has been involved in its own research on GPU computing.
Michael Dimelow, director of marketing for media processing at ARM, said: "I don't think Microsoft will be sitting and watching. I would never underestimate Microsoft's ability to come up with alternative positions."
Also relevant to Apple's recent mobile-phone push is the fact that GPUs may give handheld devices extra computing power at lower power consumption. According to the president of Khronos, GPUs can be 10 times more power-efficient than a CPU, which can improve both video and audio performance on mobile devices.

Since the iPhone shares the underlying OS X codebase, these upcoming improvements in Mac OS X should trickle down to benefit the iPhone.

Article Link
 
OpenCL looks simply amazing, if Apple and the rest of the industry can get the issues worked out. For example, I could see iTunes 8 for Snow Leopard utilizing this technology to drastically improve encoding speed. In some cases, improvements of 10x could be seen here alone.

Also, the GPU is a great place to look for hardware acceleration in video decoding. QuickTime X would be the perfect place for this, and such improvements would benefit iChat.
 
Kind of like NVIDIA's Tegra, GPU-powered mobile phones sound like a great idea to me. The day-to-day computing on a phone is simple enough for a GPU, and then it has all that graphics power, all in one chip, saving power. Maybe this will trickle up to notebooks and desktops and we'll no longer have a two-processor CPU-GPU combo, but one GPGPU or one CPU that's great at graphics!
 
If the way the Folding app has been optimized for GPUs is any indication of the speed increases possible, this is pretty exciting stuff, particularly for the mobile market.
 
could this mean the end of integrated graphics?

I doubt it. Or at least not for a while. Integrated graphics are cheap and work decently. There may be a time when this new technology is filtered down into lower end computers, but of course, time will tell how soon that is.
 
I'm hoping it will be the beginning of replaceable integrated graphics.

That... doesn't make sense. One of the defining points of integrated graphics is that it's built into the motherboard. If it's replaceable, it's just really wimpy dedicated graphics.

Computers are going down two routes: either CPUs will become good at GPU functions (Intel's Larrabee) or GPUs will become more general-purpose (GPGPU). GPGPUs have been more available recently, but Larrabee looks promising.
 
It could also mean the beginning of newer, better integrated graphics.

This is exactly what I was thinking. Once you have an OpenCL spec then the people who make GPUs can build them to run OpenCL well. I'm sure Intel is looking at this.
 
This technology is going to simply rock the industry. If Microsoft goes off and cooks up its own knockoff instead, then this technology could become Mac-exclusive, increasing the already great allure of the Mac to all kinds of people! At any rate, I want to see my Cinebench score when this comes out!
 
dedicated graphics for the macbook

I just see this as another good reason to put dedicated graphics in the macbook. Or give us a 13 inch macbook pro. It's really all the same to me.

:apple: make it happen!!!
 
The notion that a piece of software could improve the performance specs of my current hardware makes me salivate.

Only :apple:
 
I just see this as another good reason to put dedicated graphics in the macbook. Or give us a 13 inch macbook pro. It's really all the same to me.

:apple: make it happen!!!
As I see it, Apple could just as easily use Intel's upcoming GMA X4?00 (I don't remember what the ? is, but I know it's a number) in future MacBooks, which will probably work much better with OpenCL than the GMA X3100 does.
 
I just see this as another good reason to put dedicated graphics in the macbook. Or give us a 13 inch macbook pro. It's really all the same to me.

:apple: make it happen!!!

I second the 13" MacBook Pro... but we have been beating that issue since the demise of the 12" PowerBook
 
I hope the folks who make HandBrake make it work with the GPU. It's one of the few programs I use that brings my Mac to its knees and gets it "hotter than July."😉

If I could afford an 8-core Mac Pro, it wouldn't be such a big deal, since those monsters can run through a DVD faster than Mexican water through a first-time tourist.😀😛
 
Sounds great. But considering that all of NVIDIA's mobile GPUs are defective and will fail under their own massive thermal output, I hope Apple starts to go back to ATi, because once we start pushing those NVIDIA GPUs, they will start bursting into flames (figuratively). And I do have some reservations about this line:
According to the president of Khronos, GPUs can be 10 times more power-efficient than using a CPU. This can improve both video and audio performance on mobile devices.
I think an entire order of magnitude of energy efficiency is wishful thinking. Perhaps highly integrated vector logic, like the itty-bitty PowerVR2 found in the iPhone/PDAs can be efficient, but most full-featured mobile GPUs--the ones that have hundreds of little vector units--use way more electricity than Intel's latest round of 45 nm Dual-Core CPUs and throw out about 80 deg of heat, necessitating active cooling. Once you need a couple of big badass fans just to keep an idling device from burning itself out, energy efficiency goes right out the window. That's why super high performance graphics arrays (the 256 parallel-VPU-core, 10th+ generation ones) will still be desktop-bound for quite a long time. Look at the latest generation of game-capable GPUs--they all need their own independent 300W power supplies, solid copper heat pipes and at least a couple of PCI slots for all the fans.
 
OpenCL looks simply amazing, if Apple and the rest of the industry can get the issues worked out. For example, I could see iTunes 8 for Snow Leopard utilizing this technology to drastically improve encoding speed. In some cases, improvements of 10x could be seen here alone.
You mean Media Me iTunes X? 😉😀

  • A Core 2 Duo at 2.5 GHz is 20.00 GigaFLOPS using SSE.
  • 2 quad-core Xeons at 3.2 GHz is 102.4 GigaFLOPS using SSE.
  • A GeForce 8600M GT is 91.20 GigaFLOPS.
  • A GeForce 8800 GT is 504.0 GigaFLOPS.
  • The Larrabee (Intel GPU using x86 cores) is expected to reach at least 960 GigaFLOPS.

Larrabee is 20x more efficient in terms of performance per watt than a Core 2 Duo, despite having half the single thread performance.
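Those figures are consistent with simple clock × cores × width arithmetic. A quick sanity check in Python (the helper names are mine, and the per-cycle FLOP counts, 4 for a single-precision SSE core and 3 for an NVIDIA G8x shader, are assumptions that reproduce the quoted numbers):

```python
def cpu_sse_gflops(clock_ghz, cores, flops_per_cycle=4.0):
    # Assumes 4 single-precision SSE FLOPs per core per cycle
    # (one 4-wide packed operation per cycle).
    return clock_ghz * cores * flops_per_cycle

def gpu_gflops(shaders, shader_clock_ghz, flops_per_cycle=3.0):
    # NVIDIA's G8x-era counting: 3 FLOPs per shader per cycle
    # (a multiply-add plus a multiply).
    return shaders * shader_clock_ghz * flops_per_cycle

core2duo  = cpu_sse_gflops(2.5, 2)   # 20.0  GFLOPS
dual_xeon = cpu_sse_gflops(3.2, 8)   # 102.4 GFLOPS
g8600m    = gpu_gflops(32, 0.95)     # 91.2  GFLOPS (8600M GT)
g8800gt   = gpu_gflops(112, 1.5)     # 504.0 GFLOPS (8800 GT)
```

These are peak theoretical rates; real workloads only approach them when the data is as parallel as the piranha analogy demands.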

As I see it, Apple could just as easily use Intel's upcoming GMA X4?00 (I don't remember what the ? is, but I know it's a number) in future MacBooks, which will probably work much better with OpenCL than the GMA X3100 does.
X4500
 
Heavy GPU utilization

Cheers to all MacBook Pro owners, whose degenerate NVIDIA GPUs are going to fail out of warranty.
 
As I see it, Apple could just as easily use Intel's upcoming GMA X4?00 (I don't remember what the ? is, but I know it's a number) in future MacBooks, which will probably work much better with OpenCL than the GMA X3100 does.

The X4500 is the newer integrated GPU from the Montevina platform, which also brings Penryn processors up to 3.06 GHz, a 1066 MHz FSB, and WiMAX support.

I hope WiMAX finds its way into Macs.
 
You mean Media Me iTunes X? 😉😀
  • A Core 2 Duo at 2.5 GHz is 20.00 GigaFLOPS using SSE.
  • 2 quad-core Xeons at 3.2 GHz is 102.4 GigaFLOPS using SSE.
  • A GeForce 8600M GT is 91.20 GigaFLOPS.
  • A GeForce 8800 GT is 504.0 GigaFLOPS.
  • The Larrabee (Intel GPU using x86 cores) is expected to reach at least 960 GigaFLOPS.
Larrabee is 20x more efficient in terms of performance per watt than a Core 2 Duo, despite having half the single thread performance.
Wow, that's impressive. GPU performance is a resource we couldn't have found a better time to tap, in my humble opinion. 😀
Thanks - just out of curiosity, is that info on Intel's site?
 
It could also mean the beginning of newer, better integrated graphics.

It should do, but don't hold your breath for Intel to implement it.

NVIDIA and AMD both have integrated chipsets with more powerful and up-to-date GPUs on them. I don't know how suitable they are for OpenCL / CUDA at the moment.

Intel's Larrabee will come out eventually, but it might be 2010 or 2011 before its technology gets integrated into a chipset.

I'm hoping that this means that Apple will move to discrete graphics across the line again later this year in order to boost performance.
 