
matthutch

macrumors regular
Original poster
Jul 26, 2004
149
13
Perth, Western Australia
I just read this on CNet (might be old news to most by now but hey why not :) ) and thought it was pretty cool.

http://news.cnet.com/8301-13579_3-9962117-37.html?tag=nefd.top

This would be awesome if they could leverage GPU power for QuickTime encodes/transcodes. I wonder if Tyler from VisualHub would be able to use it to speed up VisualHub's encoding even more ;)

EDIT: Sorry if this is in the wrong section as well, Mods please move if I fubar'd it :)
 

Cromulent

macrumors 604
Oct 2, 2006
6,802
1,096
The Land of Hope and Glory
I just read this on CNet (might be old news to most by now but hey why not :) ) and thought it was pretty cool.

http://news.cnet.com/8301-13579_3-9962117-37.html?tag=nefd.top

This would be awesome if they could leverage GPU power for QuickTime encodes/transcodes. I wonder if Tyler from VisualHub would be able to use it to speed up VisualHub's encoding even more ;)

EDIT: Sorry if this is in the wrong section as well, Mods please move if I fubar'd it :)

Yes, it is available in Mac OS X. Not sure what the point of the post is though?

CUDA is a pretty impressive technology if you are doing lots of maths-heavy programming, but because it is pretty limited (only Nvidia hardware for a start, with more than 128MB of graphics RAM) its use is limited to a small section of the Mac community.
 

matthutch

macrumors regular
Original poster
Jul 26, 2004
149
13
Perth, Western Australia
Yes, it is available in Mac OS X. Not sure what the point of the post is though?

Was more to see what other people thought about it. From what you have said it seems like it is already implemented? Do you know any applications that are actively using it?

CUDA is a pretty impressive technology if you are doing lots of maths-heavy programming, but because it is pretty limited (only Nvidia hardware for a start, with more than 128MB of graphics RAM) its use is limited to a small section of the Mac community.

This is more what I was wanting to get into: given the increasing performance of hardware, more people would be opened up to these sorts of areas, so it would be interesting to see how current developers envisage their ability to leverage the technology today. Seeing that even the X3100 in the current MacBooks can utilize 144MB of RAM, quite a few machines would be able to use it, if only a little bit (I'm not too clued in on the details of the memory requirements beyond what you've mentioned, I admit), for a little bump in encoding speed.

Apologies if this doesn't make much sense either, working early on a Saturday morning has its drawbacks :)
 

MacRumors

macrumors bot
Apr 12, 2001
63,537
30,845
Apple to Support Nvidia's CUDA at WWDC?



CNet interviewed Nvidia CEO Jen-Hsun Huang and found that Apple may have an interest in Nvidia's CUDA technology:
CUDA is a programming technology that allows software developers to take advantage of the unique parallel processing characteristics of graphics processors such as Nvidia's GeForce 8600M, found in the MacBook Pro.
According to Huang, "Apple knows a lot about CUDA" and may announce support for the technology at next week's Worldwide Developers Conference (WWDC).

During a demo for CNet, Nvidia engineers demonstrated how a CUDA-enabled version of a program could dramatically speed up converting video from one format to another. Transcoding video can be useful to convert existing video to be played on another device (such as the iPhone).


Article Link
 

mahonmeister

macrumors 6502
Jun 9, 2006
297
0
Redlands, CA
So then will Apple start offering more powerful video cards, at least as a BTO option? They're a little better now, but there's tons of room for improvement.

Perhaps they could even end the (mostly true) stereotype that Macs suck at gaming.:p
 

iceman1234

macrumors member
May 27, 2008
50
0
So then will Apple start offering more powerful video cards, at least as a BTO option? They're a little better now, but there's tons of room for improvement.

Perhaps they could even end the (mostly true) stereotype that Macs suck at gaming.:p

I know, really. Why can't Apple use any decent graphics cards anymore :(
 

mrgreen4242

macrumors 601
Feb 10, 2004
4,377
9
I don't see Apple integrating a vendor-specific hardware function into the OS. Look at Quartz Extreme, for example... it works on any manufacturer's hardware, as long as it meets certain requirements (a certain level of support for OpenGL, I believe).

I could see them adding this as a feature to Final Cut or something, but to roll it into the OS would only benefit a very small number of their users and just wouldn't seem a very Apple thing to do.
 

DavidCar

macrumors 6502a
Jan 19, 2004
525
0
Does anyone know how CUDA compares with the EyeTV Turbo.264? Could a CUDA graphics processor do the same thing as the Turbo?
 

RTee

macrumors regular
May 26, 2008
117
0
Australia
I don't see Apple integrating a vendor-specific hardware function into the OS. Look at Quartz Extreme, for example... it works on any manufacturer's hardware, as long as it meets certain requirements (a certain level of support for OpenGL, I believe).

I could see them adding this as a feature to Final Cut or something, but to roll it into the OS would only benefit a very small number of their users and just wouldn't seem a very Apple thing to do.

That's true, if it's available for those that want to use it then that's great.
 

geerlingguy

macrumors 6502a
Feb 11, 2003
562
6
St. Louis, USA
I'd be quite happy if this happened, but I don't expect it. Half of my computer's waking hours are spent converting video (transcoding my DVD collection, and also some of my HD-DVDs... although they take much longer), so this would make my life a lot easier.
 

exabytes18

macrumors 6502
Jun 14, 2006
287
0
Suburb of Chicago
I believe CUDA is now bundled with all 8- and 9-series drivers for Windows, though it seems like most implementations of CUDA are done through Linux.

CUDA is capable of some incredible feats, but programming for it looks brutal. I believe there is also talk of nVidia adding a feature to the CUDA compiler to allow for compiling the code for multi-core CPUs in addition to the GPU. Currently, CPUs can emulate the GPU code, but it's inefficient. The idea is that the programmer writes the code once and then the compiler takes care of the rest. If there's a CUDA-enabled GPU, the program will utilize the GPU, else the program will execute entirely within the CPU (utilizing any cores available). I read this a while back so it could either be... a) scrapped, b) still in the works, c) implemented in CUDA 2.0 or something, or d) I misunderstood and it never happened.

Anyway, CUDA-like technologies are amazing. It would be awesome to see it included in OS X... or at least for the 8-series cards.
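Just to illustrate what I mean by the write-once/dispatch idea, here's a toy sketch (in Python purely for readability; real CUDA kernels are written in C, and every function name here is made up):

```python
def saxpy_kernel(i, a, x, y):
    """Body of the kernel for one 'thread' index i: y[i] = a*x[i] + y[i]."""
    return a * x[i] + y[i]

def run_on_cpu(kernel, n, *args):
    """Fallback path: loop over all thread indices serially on the CPU."""
    return [kernel(i, *args) for i in range(n)]

def gpu_available():
    """Stand-in for a real device query; pretend no CUDA GPU is present."""
    return False

def launch(kernel, n, *args):
    """Dispatch to the GPU if one exists, otherwise emulate on the CPU."""
    if gpu_available():
        raise NotImplementedError("the GPU path would go here")
    return run_on_cpu(kernel, n, *args)

x = [1.0, 2.0, 3.0]
y = [10.0, 10.0, 10.0]
result = launch(saxpy_kernel, len(x), 2.0, x, y)
print(result)  # [12.0, 14.0, 16.0]
```

The kernel is written once; `launch` decides where it runs, which is roughly what the rumored compiler feature would do for you automatically.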
 

matthutch

macrumors regular
Original poster
Jul 26, 2004
149
13
Perth, Western Australia
I believe there is also talk of nVidia adding a feature to the CUDA compiler to allow for compiling the code for multi-core CPUs in addition to the GPU. Currently, CPUs can emulate the GPU code, but it's inefficient. The idea is that the programmer writes the code once and then the compiler takes care of the rest. If there's a CUDA-enabled GPU, the program will utilize the GPU, else the program will execute entirely within the CPU (utilizing any cores available). I read this a while back so it could either be... a) scrapped, b) still in the works, c) implemented in CUDA 2.0 or something, or d) I misunderstood and it never happened.

That sounds pretty cool, if it is being implemented :)

On a side note: First page news, awesome :) I sure didn't think that would happen. I suppose there are benefits about working early on a Saturday :)
 

winterspan

macrumors 65816
Jun 12, 2007
1,008
0
Yes, it is available in Mac OS X. Not sure what the point of the post is though?

CUDA is a pretty impressive technology if you are doing lots of maths heavy programming but because it is pretty limited (only Nvidia hardware for a start with more than 128MBs of graphics RAM) it's use is limited to a small section of the Mac community.

So what! It's good that ANY existing Macs have the technology for it. All mid- and higher-end Nvidia 8-series and later cards can support CUDA. It's really going to be an incredible technology.


Seeing that even the X3100 in the current MacBooks can utilize 144MB of RAM quite a few machines would be able to use it, if only a little bit (I'm not too clued in on the details of the memory requirements beyond what you've mentioned I admit), for a little bump in encoding speed.
Apologies if this doesn't make much sense either, working early on a Saturday morning has its drawbacks :)

Intel's integrated graphics are a joke compared to even low-end Nvidia graphics cards. Only the mid-range/upper-range Nvidia GeForce 8-series and later cards can run CUDA.

I know really, why can't apple use any decent graphic cards anymore:(
This would be GREAT if implementing CUDA spurs Apple to take graphics cards seriously!

I don't see Apple integrating a vendor specific hardware function into the OS. Look at Quartz Extreme, for example... it works on any manufacturers hardware, as long as it meets certain requirements (certain level of support for OpenGL, I believe). I could see them adding this as a feature to Final Cut or something, but to roll it into to OS would only benefit a very small number of their users and just would seem to a very Apple thing to do.

CUDA is capable of some incredible feats, but programming for it looks brutal. I believe there is also talk of nVidia adding a feature to the CUDA compiler to allow for compiling the code for multi-core CPUs in addition to the GPU. Currently, CPUs can emulate the GPU code, but it's inefficient. The idea is that the programmer writes the code once and then the compiler takes care of the rest. If there's a CUDA enabled GPU, the program will utilize the GPU, else the program will execute entirely within the CPU (utilizing any cores available.) ...

@mrgreen4242 & exabytes18

I bet Apple is going to either buy or work with one of those startup companies that have been developing code and compiler technology to speed up applications on multi-core CPUs and GPUs. This way they could implement a common abstraction layer for both CUDA and ATI's Close to Metal, not to mention Intel's Larrabee in the future, and potentially even just multi-core processors. Then, just like some of the frameworks in OS X, it could actively run the application's code on whatever particular hardware the machine has available. That would be the best way to keep their suppliers diversified, while being able to take advantage of all these new technologies...
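In toy form, the kind of layer I'm imagining just registers backends and picks the best one the machine actually has at run time (Python sketch, every class and backend name hypothetical):

```python
class Backend:
    name = "generic"
    def available(self):
        return False
    def run(self, data):
        raise NotImplementedError

class CudaBackend(Backend):
    """Would drive an Nvidia GPU; here it pretends no GPU is present."""
    name = "cuda"
    def available(self):
        return False
    def run(self, data):
        return [x * 2 for x in data]

class MulticoreCpuBackend(Backend):
    """Lowest common denominator: a CPU is always there."""
    name = "cpu"
    def available(self):
        return True
    def run(self, data):
        return [x * 2 for x in data]

def pick_backend(backends):
    """Return the first backend, in priority order, that is available."""
    for b in backends:
        if b.available():
            return b
    raise RuntimeError("no usable backend")

chosen = pick_backend([CudaBackend(), MulticoreCpuBackend()])
print(chosen.name)            # cpu
print(chosen.run([1, 2, 3]))  # [2, 4, 6]
```

The application codes against the layer once, and Apple could slot in CUDA, Close to Metal, or a plain multi-core path underneath without the app caring.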
 

Analog Kid

macrumors G3
Mar 4, 2003
8,868
11,409
You know, the last time a CEO made a public statement about Apple adopting their technology was Sun saying ZFS would be "the" file system for Leopard.

We see how that turned out...

I agree this doesn't belong in the core OS. Apple tends to bring generic technologies into the system when they can abstract them across hardware vendors. Core Image and the Accelerate framework do a lot of what people here are asking for.
 

exabytes18

macrumors 6502
Jun 14, 2006
287
0
Suburb of Chicago
This is not the sort of thing that's included in the core of the OS (kernel?). Not everyone is going to use it, but its offerings are too good to be ignored.

The observed FLOPS for a card like the 8800GT totally trounce the theoretical peak FLOPS of any Intel processor. If you talk theoretical, the 8800GT can do 336 GFLOPS while a pair of 3.2GHz Harpertowns can do 204.8 GFLOPS. That's the kind of power Apple's interested in. :)
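For anyone wanting to check the arithmetic, those peaks fall out of the commonly quoted per-unit figures (112 stream processors at 1.5GHz doing a 2-flop multiply-add on the 8800GT; two quad-core 3.2GHz Harpertowns doing 8 single-precision flops per cycle via SSE):

```python
# GeForce 8800GT: 112 stream processors x 1.5 GHz x 2 flops (multiply-add)
gpu_gflops = 112 * 1.5 * 2

# Dual Harpertown: 2 chips x 4 cores x 3.2 GHz x 8 flops/cycle (SSE, single precision)
cpu_gflops = 2 * 4 * 3.2 * 8

print(gpu_gflops)  # 336.0
print(cpu_gflops)  # 204.8
```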
 

Cromulent

macrumors 604
Oct 2, 2006
6,802
1,096
The Land of Hope and Glory
Seeing that even the X3100 in the current MacBooks can utilize 144MB of RAM quite a few machines would be able to use it

Unfortunately the X3100 is not an Nvidia graphics card and thus would not work with CUDA.

I've been thinking about looking into CUDA for a financial program that I need to write at some point which will need to do some analysis on a large amount of data very quickly. As others have mentioned video would also benefit a lot from the technology. But CUDA is a very specialised technology which can only benefit mathematical and scientific applications at the end of the day.
 

fuziwuzi

macrumors regular
Nov 29, 2007
242
0
Bris, Australia
Unfortunately the X3100 is not an Nvidia graphics card and thus would not work with CUDA.

I've been thinking about looking into CUDA for a financial program that I need to write at some point which will need to do some analysis on a large amount of data very quickly. As others have mentioned video would also benefit a lot from the technology. But CUDA is a very specialised technology which can only benefit mathematical and scientific applications at the end of the day.

Here is a medical application of CUDA. Seriously, I am very excited about this technology. I am so shocked that it's taken this long for someone (Nvidia) to figure out that people would use video cards for more than games.

http://www.youtube.com/watch?v=DnIvodB2RzU
 