....But yeah, I think their release date of October is crap. CS3 JUST came out!
For God's sake, read the thread. Not only has it been mentioned three times in this thread, it is now even noted on the article itself that most of the info from TGdaily.com is B.S., including the October release date. The actual Adobe presentation said absolutely NOTHING about a release date for any product, nor did they actually say that the GPGPU technology would be included in a future version of Photoshop. Apparently the journalist who wrote the TGdaily article is either a complete moron or was inadvertently misled, or both.
This move toward making GPUs more general-purpose (GPGPU) has been coming for a long time: Nvidia saying the CPU is dead, ATI moving to integrate GPUs with AMD CPUs, and Intel now playing for the discrete GPU market.
And yes, all of this GPGPU stuff is based on CUDA. In fact, IIRC Folding@Home will be available for CUDA-enabled GPUs soon.
All this is coming on the heels of Nvidia's next-gen GPU, the GT200. Supposedly its performance will be to the G80 what the G80 was to the G71.
However, knowing Apple, this won't be available until its next hardware refresh, so until then PC users will have a big advantage in CUDA-enabled programs.

(the rumored specs include 240 shaders, a fixed MUL unit for true dual-issue MADD+MUL per shader per clock, and roughly 933 GFLOPS of total shader throughput -- which checks out if the rumored ~1.3 GHz shader clock is right: 240 SPs x 3 FLOPs per clock x ~1.3 GHz ≈ 933 GFLOPS)
Yep! They are saying that the stream processors/shaders on GT200 are 50% more efficient than on G80, so that 240-shader beast should run like it has ~360 of the older G80 shaders. That is INSANE! I'm not a big gamer, but I like to dabble in 3D animation and simulations. And being a developer, I can't wait to learn how to use the CUDA API for offloading parallel computations.
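For anyone curious what the CUDA programming model actually looks like, here is a toy sketch I threw together (a simple SAXPY, y = a*x + y) just to show the shape of the API -- this is my own illustration, not anything shown in the Adobe or NVIDIA presentations. A kernel function runs once per thread across thousands of threads, and the host code allocates memory on the card, copies data over, launches the kernel, and copies the result back:

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Kernel: each GPU thread computes one element of y = a*x + y.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                  // one million floats
    const size_t bytes = n * sizeof(float);

    // Fill the host arrays
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Allocate on the card and copy the inputs over
    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 3.0f, dx, dy);

    // Copy the result back and spot-check it (3*1 + 2 = 5)
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f (expected 5.0)\n", hy[0]);

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}

The same pattern scales to whatever data-parallel work you have -- the card just hands you thousands of threads instead of the 2-4 cores a CPU gives you.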
For people who are wondering which applications will benefit from GPGPU/CUDA tech, think of the kinds of floating-point-heavy applications that already benefit from multi-core processing:
- HPC (high-performance computing), supercomputer applications, grid computing
- Ray tracing, global illumination, and other professional 3D rendering
- Digital image processing, biometric recognition, computer vision
- Video encoding, decoding, editing, compositing, effects rendering, iDCT/IQ
- Audio encoding, editing, compositing, effects rendering
- Digital and analog signal processing, speech processing
- Scientific computation and simulations
- Neural networks and artificial intelligence
- Weather forecasting and simulations
- Molecular dynamics and biological mechanism simulations
- Oil and gas industry geology simulations
- Computational finance / financial forecasting
- Cryptography
For the average user, that means vastly faster MP3 encoding, DVD ripping, MPEG-4/H.264 video encoding, home video editing and effects, digital image editing, etc.
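To make the "data-parallel, floating-point heavy" point concrete, here is a hypothetical per-pixel kernel of the kind an image editor could offload. The function name, the gain operation, and the pixel layout are made up purely for illustration -- nothing here comes from Adobe's presentation:

// Hypothetical: adjust the brightness of a float-valued image, one thread per pixel.
__global__ void brighten(float *pixels, int numPixels, float gain)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < numPixels) {
        float v = pixels[i] * gain;
        pixels[i] = (v > 1.0f) ? 1.0f : v;   // clamp to the [0, 1] range
    }
}

// Host side: one launch covers every pixel of, say, a 12-megapixel photo.
// brighten<<<(numPixels + 255) / 256, 256>>>(devPixels, numPixels, 1.2f);

Every pixel is independent, so the GPU can chew through all of them at once -- the same embarrassingly parallel shape most of the workloads in the list above have.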
That is absolutely not true. Video cards have included 2D GUI acceleration since the mid-1990s, and the Windows interface used the cards for acceleration. Ditto for the Mac when you used the high-end NuBus graphics cards back in the day.
What I have noticed in Vista is that the Aero interface keeps the GPU in 3D mode and keeps PowerMizer from kicking in, killing battery life on a laptop. It also still consumes 8-12% CPU just moving a window around. It is a resource hog compared to Compiz on Linux or Core Image on a Mac (which do exactly the same thing).
Yes, "Aero" is a totally pathetic attempt at a polished and 3D accelerated GUI. You can run Quartz in OSX and compiz on Ubuntu with even 3-4 yr old machines with no problem -- especially Compiz. Comparatively, Aero on Vista is incredibly resource hungry and bloated to all hell. I couldn't believe how it was taxing my Quadro 1500M in my laptop. I can get MUCH better battery life with compiz on in Unbuntu than Aero on Vista.