
This is really cool. Does anyone know how much it would improve conversion times? Like 10% or more?
 
Unfortunately the X3100 is not an Nvidia graphics card and thus would not work with CUDA.

Yeah, I didn't realise at the time of my post that CUDA required nVidia cards, unfortunately :(. I've been reading up on it throughout the day, and it seems like it could be really cool if developers can leverage its potential in the right way.
 
This is not the sort of thing that's included in the core of the OS (the kernel?). Not everyone is going to use it, but what it offers is too good to be ignored.

The observed FLOPS for a card like the 8800GT totally trounce the theoretical peak FLOPS of any Intel processor. In theoretical terms, the 8800GT can do 336 GFLOPS, while a pair of 3.2GHz Harpertowns can do 204.8 GFLOPS. That's the kind of power Apple's interested in. :)
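
For reference, here's roughly how those theoretical peaks work out, assuming the commonly cited specs (the 8800GT figure counts 2 FLOPS per multiply-add across its 112 stream processors at the 1.5 GHz shader clock, and the Harpertown figure assumes 8 single-precision FLOPS per core per cycle):

8800GT: 112 stream processors x 2 FLOPS (multiply-add) x 1.5 GHz = 336 GFLOPS
Harpertown pair: 2 chips x 4 cores x 8 FLOPS/cycle x 3.2 GHz = 204.8 GFLOPS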

CUDA is basically just a marketing brand for general-purpose computation on GPUs (GPGPU).

This is something all DirectX 10-class hardware can do easily, given its programmable nature.
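
For the curious, here's a minimal sketch of what that kind of data-parallel code looks like in CUDA's C dialect. The kernel and launch syntax are real CUDA; the SAXPY example itself is just an illustration, not anything Apple or nVidia has announced:

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// SAXPY: y[i] = a*x[i] + y[i], computed with one GPU thread per element.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Fill host buffers with test data.
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Copy to the GPU, launch one 256-thread block per 256 elements, copy back.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f (expect 4.0)\n", hy[0]);

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}

Every one of the million elements gets its own thread, and the hardware schedules those threads across the stream processors; that's the whole trick behind those big FLOPS numbers.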

If you drool over the GeForce 8800 GT's theoretical GigaFLOPS, you should really be interested in the GT200 chip, which delivers 933 GigaFLOPS, and the Radeon HD 4800 series, which delivers 1 TeraFLOPS of computational power.
 
As others have mentioned, video would also benefit a lot from the technology. But CUDA is a very specialised technology which, at the end of the day, can only benefit heavily mathematical and scientific applications.

Surely the applications that could/would benefit from the technology range widely in their real-world uses; it's merely that the technology's full potential has not yet been realised. Video was one application mentioned directly in the article (see below), and I remember reading [on MacRumors, if I'm not mistaken] that Adobe was planning to utilise GPU-based processing in an upcoming release of their CS packages [CS4 or CS5]. Found the article.

From the article in my original post:
For example, during my visit on Wednesday, Nvidia engineers demonstrated how a CUDA-enabled version of a program similar to QuickTime, running on a desktop or laptop, could dramatically speed up the process of transcoding a movie or television show into a format suitable for the iPhone.

If you drool over the GeForce 8800 GT's theoretical GigaFLOPS, you should really be interested in the GT200 chip, which delivers 933 GigaFLOPS, and the Radeon HD 4800 series, which delivers 1 TeraFLOPS of computational power.

I saw some people referring to those GT200 chips during my hunting around on this today; what they can do looks pretty insane.

Mind you, it would be interesting to see GPGPU develop if this IBM technology is implemented on GPUs. Multi-core GPUs in collaboration with multi-core CPUs could pump out some pretty impressive results. It would be a fair way off, though, I would imagine, as they are just working on the technology now, and from what I understand multi-core GPUs would be too big and too hot if they tried it with current tech (please correct me if I am wrong on this; the more I look into this, the more intrigued I am :) )
 
Mind Boggle

You've gotta watch this video. CUDA is the next step in transforming desktop computing. Intel is working furiously to catch up to nVidia, and Microsoft is also working very hard and appears ready with a groundbreaking HPC Server 2008.

Really, folks, watch this video to understand how HPC has the capacity to change lives in ways we've never understood. Every wild-ass student or scientist will likely be able to afford his/her own supercomputer in a few years. That's what it's going to take for real breakthroughs in some 'in the box' constrained research areas.

BTW: This video is (intentionally, I think) one of the most borderline-hilarious, nerdy, geeky things ever. But for those whose lives are saved by a tomography process finding a cancer, etc., it is also over-the-line miraculous.

http://www.youtube.com/watch?v=DnIvodB2RzU
 
You've gotta watch this video. CUDA is the next step in transforming desktop computing. Intel is working furiously to catch up to nVidia, and Microsoft is also working very hard and appears ready with a groundbreaking HPC Server 2008.

Really, folks, watch this video to understand how HPC has the capacity to change lives in ways we've never understood. Every wild-ass student or scientist will likely be able to afford his/her own supercomputer in a few years. That's what it's going to take for real breakthroughs in some 'in the box' constrained research areas.

BTW: This video is (intentionally, I think) one of the most borderline-hilarious, nerdy, geeky things ever. But for those whose lives are saved by a tomography process finding a cancer, etc., it is also over-the-line miraculous.

http://www.youtube.com/watch?v=DnIvodB2RzU
I already posted it; last post, first page.
 
Correct me if I am wrong... but doesn't CUDA already have a Mac version of the toolkit/SDK?

What's the point of "to support CUDA at WWDC", then?

http://www.nvidia.com/object/cuda_get.html#macos

Yeah, but I think you'll find that this rumour suggests an implementation of CUDA within OS X: either a release, or a suggestion that Apple will have products that take advantage of CUDA.

The fact that you can develop with CUDA on OS X is unimportant to most.

That is what is meant on the front page by "support".
 
I hope CUDA is a cute name like Google or Yahoo, and not an acronym like MSN. Acronyms are too widespread; they mean totally different things to different people, and they create barriers in conversation. "TLPD", a tension line propelling device, is better named "rope thrower" or "roper". It's not just acronyms, it's the fact that we overcomplicate our titles and names to the point that they need abbreviation. And initial letters are not the answer, because they mean different things in different fields. E.g. SC = StarCraft, safety commissioner, social consensus, spinal collapse. We need to wean ourselves off of acronyms and get back to inventing new words when they are warranted. E.g. cell phone, much better than MCD (mobile communication device), or is that one for mini compact disc?
Some may say that an acronym like SPD can be used frequently and quickly in a conversation that is about SPD. To this I say: use the word "it"; that's what it's for.
 
Here is a medical application of CUDA. Seriously, I am very excited about this technology. I am shocked that it's taken this long for someone (nVidia) to figure out that people would use video cards for more than games.

Folding@Home has been doing this on ATI cores since late 2006. They got a 30x speed increase out of it.
 
Surely the applications that could/would benefit from the technology range widely in their real-world uses; it's merely that the technology's full potential has not yet been realised. Video was one application mentioned directly in the article (see below), and I remember reading [on MacRumors, if I'm not mistaken] that Adobe was planning to utilise GPU-based processing in an upcoming release of their CS packages [CS4 or CS5]. Found the article.

Well, Photoshop is a graphics program and is thus very maths-based. CUDA will not help one jot with speeding up, say, a word processor or a web browser. Graphics cards are good for one thing and one thing only: executing mathematical functions. Therefore anything that leans heavily on maths will benefit to an extent (see the sketch below).
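
As a purely hypothetical illustration of the kind of per-pixel maths an image filter does (this is a sketch of the pattern, not Adobe's actual code), a brightness adjustment in CUDA might look like:

#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical Photoshop-style filter: scale every pixel's brightness,
// one GPU thread per 8-bit pixel value.
__global__ void brighten(unsigned char *pixels, int n, float gain)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float v = pixels[i] * gain;
        pixels[i] = (unsigned char)(v > 255.0f ? 255.0f : v); // clamp to 8-bit range
    }
}

int main()
{
    const int n = 1024 * 768;        // one 8-bit grayscale frame
    unsigned char *img;
    cudaMallocManaged(&img, n);      // unified memory keeps the demo short
    for (int i = 0; i < n; ++i) img[i] = 100;

    brighten<<<(n + 255) / 256, 256>>>(img, n, 1.5f);
    cudaDeviceSynchronize();

    printf("pixel 0: %d (expect 150)\n", img[0]);
    cudaFree(img);
    return 0;
}

Millions of pixels, the same independent arithmetic on each: exactly what a GPU eats for breakfast, and exactly the kind of loop a word processor doesn't have.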
 
Hopefully, Apple will get in good with nVidia and get early, low-yield parts like they do from Intel. That would be sweet. Apple makes Intel look good, and they could probably make nVidia look good with a reasonable partnership.
 