ATI 3870 support???
Better late than never, OpenGL 3.0 was released July 11, 2008...
3.2 on August 3, 2009...
http://en.wikipedia.org/wiki/Opengl#OpenGL_3.0
Anyone bashing DirectX is a fanboy, plain and simple.
I'm a former OpenGL developer, and the Khronos Group ROYALLY screwed up the release of GL3.
They basically decided to please CAD developers instead of game developers when they released GL3.
Originally they promised to rewrite the state-based model of OpenGL into an object-based model. They didn't.
DirectX (Direct3D) is object-based, and there are many benefits (performance and others) that arise from an object-based system.
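The state-based vs. object-based distinction can be sketched in plain Python (illustrative only, these are not real GL or D3D bindings): classic OpenGL routes edits through a hidden "currently bound" selector, while an object-based API edits the object directly.

```python
# Illustrative sketch, not real API bindings: why a bind-to-edit state
# machine (classic OpenGL) is easier to misuse than an object-based API
# (Direct3D-style), where you mutate the object itself.

# --- State-based style: a hidden global "currently bound" selector ---
class StateMachineAPI:
    def __init__(self):
        self.textures = {}
        self.bound = None              # global "currently bound texture" state

    def bind_texture(self, tex_id):
        self.bound = tex_id
        self.textures.setdefault(tex_id, {})

    def tex_parameter(self, key, value):
        # Edits whatever happens to be bound -- easy to clobber by accident.
        self.textures[self.bound][key] = value

# --- Object-based style: you edit the object directly ---
class Texture:
    def __init__(self):
        self.params = {}

    def set_parameter(self, key, value):
        self.params[key] = value       # no hidden global state involved

# State-based usage: the order of bind calls silently matters.
gl = StateMachineAPI()
gl.bind_texture(1)
gl.bind_texture(2)                     # rebinding changes what gets edited
gl.tex_parameter("min_filter", "LINEAR")   # lands on texture 2, not 1

# Object-based usage: no way to edit the wrong object by accident.
tex = Texture()
tex.set_parameter("min_filter", "LINEAR")
```

The hidden-selector pattern is also what makes drivers do extra validation on every call, which is one of the performance arguments for the object model.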
Hell, geometry shaders in GL were STILL an extension until a few months ago. Geometry shaders have been in hardware for YEARS now.
Anyone saying that you should just use the extensions has never been bitten in the ass by different GPU vendors supporting extensions to varying degrees. I recall one experience where I needed functionality (fp16 blending of framebuffers) and I couldn't even query the hardware via GL to see if the functionality was supported in hardware. I sent an email to ATi's developer relations, and their response was to maintain a list of cards that support it(!)...
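For context, a GL extension check typically means parsing the space-separated string the driver returns from glGetString(GL_EXTENSIONS); a minimal sketch (the example string below is made up):

```python
# Hedged sketch: how GL capability checks usually work. The catch described
# above is that an extension can be *advertised* in this string yet still
# fall back to software on some cards, and GL gives no way to query that.

def has_extension(extensions_string: str, name: str) -> bool:
    # Exact-token match; substring checks can false-positive
    # (e.g. "GL_EXT_texture" matching inside "GL_EXT_texture_float").
    return name in extensions_string.split()

# Example string as a driver might report it (illustrative, not exhaustive):
reported = "GL_ARB_multitexture GL_EXT_framebuffer_object GL_EXT_texture_float"

print(has_extension(reported, "GL_EXT_framebuffer_object"))  # True
print(has_extension(reported, "GL_EXT_texture"))             # False
```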
GL is so messed up, it's really sad now. It could have been AWESOME. Khronos told us it would be!
Imagination Technologies announces POWERVR SGX545 graphics IP core with OpenGL 3.2 and OpenCL 1.0
New core takes mobile & embedded graphics family to a new level; delivers unrivalled capabilities
Las Vegas, USA, 8th January 2010: Imagination Technologies, a leading multimedia chip technologies company, announces POWERVR SGX545, the first and only DirectX 10.1-capable embedded graphics IP core available for immediate licensing. SGX545 will also support OpenGL ES 2.x and OpenGL 3.2, delivering class-leading 3D graphics performance, and will support OpenCL 1.0 full profile capability, which will enable mobile and embedded applications to take maximum advantage of the capabilities offered by these GPU APIs for both 3D graphics and general purpose applications.
POWERVR SGX545 is available for licensing now. The IP is already proven in silicon in a test chip from Imagination and licensed by a lead partner.
Says Tony King-Smith, VP Marketing, Imagination: "Combining our many years of experience in the embedded, mobile and PC-based DirectX graphics worlds, POWERVR SGX 545 takes the possibilities of hand-held graphics to a new level by delivering a full DirectX 10.1 and OpenGL 3.x feature set as well as delivering GPU powered OpenCL heterogeneous parallel processing capabilities for the mobile and embedded markets. This makes POWERVR SGX545 a compelling solution for application processor SoC designers targeting the next generation of netbook and MID mobile products demanding exceptional graphics capabilities."
The debut of POWERVR SGX545 reinforces the SGX family’s outstanding scalability which ranges from ultra-small OpenGL ES 2.0 mobile cores through solutions for feature-rich mobile and HDTV platforms, to high-performance gaming and computing solutions. The SGX family supports a wide range of APIs including DirectX9 & 10, OpenGL ES 2.x, OpenGL 3.x, OpenVG 1.x and OpenCL 1.x.
POWERVR SGX545 delivers real-world performance of 40 million polygons/sec and 1 Gpixels/sec fillrate at 200MHz,* and is capable of driving HD screens with ultra smooth high frame rate 3D graphics content.
New features in POWERVR SGX545 include:
- DirectX10.1 API support
- Enhanced support for DirectX10 Geometry Shaders
- DirectX10 Data assembler support (Vertex, primitive and instance ID generation)
- Render target resource array support
- Full arbitrary non-power-of-two texture support
- Full filtering support for F16 texture types
- Support for all DirectX10 mandated texture formats
- Sampling from unresolved MSAA surfaces
- Support for Gamma on output pixels
- Order dependent coverage based AA (anti-aliased lines)
- Enhanced line rasterisation
SGX545 was also designed to deliver full profile OpenCL 1.0 capabilities, with advanced features including:
- Support of round-to-nearest for floating-point math
- Full 32-bit integer support (includes add, multiply and divide)
- 64-bit integer emulation
- 3D texture support
- Support for the maximum 2D and 3D image sizes specified in the full profile.
Inside POWERVR SGX545
USSE (Universal Scalable Shader Engine), the main programmable processing unit within each POWERVR SGX545 pipeline, is a scalable multi-threaded GPU shader processing engine that efficiently processes graphics as well as many other mathematically-intensive tasks. USSE can be programmed using the GLSL language that forms part of the OpenGL ES 2.0 specification, or in the C-based parallel processing language used in the OpenCL specification – both APIs from the Khronos Group.
POWERVR SGX545 delivers the broadest range of graphics API feature sets in the industry, while also enabling developers to gain greater access to the full capabilities of the USSE-powered GP-GPU in a broad range of applications including digital imaging, video processing, game physics, cryptography, and other general computing tasks that can benefit from parallel processing.
Editor's Notes
* All fill rate figures stated assuming a scene depth complexity of x2.5.
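Taking the stated figures at face value, the arithmetic works out as follows (this interpretation is ours; the press release doesn't spell it out):

```python
# Quick check of the quoted numbers: 1 Gpixel/s at 200 MHz implies 5 pixels
# of raw fill per clock; with the stated x2.5 scene depth complexity
# (overdraw), the *visible* pixel rate is 1e9 / 2.5 = 400 Mpixels/s.

clock_hz = 200e6
fill_pixels_per_sec = 1e9
depth_complexity = 2.5

pixels_per_clock = fill_pixels_per_sec / clock_hz
visible_pixels_per_sec = fill_pixels_per_sec / depth_complexity

print(pixels_per_clock)        # 5.0
print(visible_pixels_per_sec)  # 400000000.0
```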
About Imagination Technologies
Imagination Technologies Group plc (LSE:IMG) – a global leader in multimedia and communication silicon technologies – creates and licenses market-leading processor cores for graphics, video, multi-threaded embedded processing/DSP and multi-standard communications applications. These silicon intellectual property (IP) solutions for systems-on-chip (SoC) are complemented by a strong array of software tools and drivers as well as extensive developer and middleware ecosystems. Target markets include mobile phone, handheld multimedia, home consumer entertainment, mobile and low-power computing, and in-car electronics. Its licensees include many of the leading semiconductor and consumer electronics companies. Imagination has corporate headquarters in the United Kingdom, with sales and R&D offices worldwide. See: www.imgtec.com.
http://www.khronos.org/news/press/r...ces-powervr-sgx545-graphics-ip-core-with-ope/
I already posted this last week on AppleInsider.
There is a reason Apple is getting its software stacks sewn up to leverage OpenGL 3.2/OpenCL 1.0 together, across all markets they play in.
Nvidia is currently the only vendor whose OpenCL 1.0 support isn't ready in their GPGPUs for general consumption. It's still in beta.
OpenGL 3.1 rolled out the object model and it's been extended in OpenGL 3.2.
http://en.wikipedia.org/wiki/OpenGL#OpenGL_3.2
Instead of sending you to the OpenGL specs I figured Wiki can summarize their changes more rapidly.
Apple, in one 10.6.x release, will have their drivers up to the current solutions for both Nvidia and AMD.
"Will OpenGL 3.0 offer increased performance for OpenGL 2.0 supporting software?"
I'm pretty sure you need to actually write your software in OpenGL 3.0 to take advantage of the benefits.
"Is there a list of supported cards/machines floating about anywhere?"
Any DX10 GPU should support OpenGL 3.0, except Intel GMAs. Basically ATI HD2xxx and nVidia 8xxx and up.
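In practice an app tests this at runtime rather than against a card list, by parsing the string a driver reports via glGetString(GL_VERSION) — roughly "major.minor[.release] [vendor info]". A rough sketch (the example strings are illustrative):

```python
# Sketch: deciding whether a machine meets an OpenGL version requirement
# from the version string the driver reports.

def meets_gl_version(version_string: str, required=(3, 0)) -> bool:
    first = version_string.split()[0]      # e.g. "2.1" from "2.1 NVIDIA-1.6.6"
    parts = first.split(".")
    major, minor = int(parts[0]), int(parts[1])
    return (major, minor) >= required

print(meets_gl_version("3.2.0 NVIDIA 195.36"))   # True
print(meets_gl_version("2.1 NVIDIA-1.6.6"))      # False
```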
"When was support for OpenGL 3 added to Windows and Linux?"
http://techreport.com/discussions.x/16080
"Hell, geometry shaders in GL were STILL an extension until a few months ago."
I think geometry shaders were promoted to core in OpenGL 3.2.
OpenGL 3.2 support didn't enter until this past December on Nvidia GPUs.
http://techreport.com/discussions.x/17701
"Apple, in one 10.6.x release will have their drivers up to the current solutions for both Nvidia and AMD."
So has it actually been confirmed that 10.6.3 will bring full support for OpenGL 3.2 rather than just OpenGL 3.0? Because while this leaked screenshot confirms OpenGL 3.0 is on the way and some progress has been made on OpenGL 3.2, it doesn't show any progress since Leopard on OpenGL 3.1. In fact, on my older MacBook Pro with the X1600 and 10.5.8, I'm still getting 1/8 OpenGL 3.1 extensions supported (ARB_texture_rectangle), which is the same as the leaked screenshot.
That will be the culmination of work done over the past 3 years.
"OpenGL is still better."
A lot of the comments written so far in this thread seem to be by people who don't realize that DirectX is so much more than OpenGL, and the two aren't equivalent. DirectX also includes full support for sound, input, networking and a bunch of other useful stuff. That's why it took off and became the de facto standard. OpenGL is simply concerned with graphics, and needs other libraries to do that other stuff.
I hear a HD5870 MacPro calling.
Somehow I doubt that's going to happen.
"Somehow I doubt that's going to happen."
I think given the length of the HD5870 and the cost, the HD5850 is more likely. I do hope that Apple adopts the HD 5000 series though, seeing as it is the latest tech. The GPUs will likely be refreshed when the Gulftown Mac Pros roll out, which seems likely to occur before Fermi arrives in any numbers.
I hear a HD5870 MacPro calling.
Somehow I doubt that's going to happen.
I expect it will be available when the 7870 comes out, for twice as much as the PC card. There will then be a 2000-post thread about flashing the PC version to run in the Mac Pro sometime when the Mayan calendar blows up or Al Gore's personal carbon footprint exceeds that of Rwanda, whichever comes first.
edit: I see heistax beat me to it. Nice to know it's not merely a personal notion of mine.
Why does Apple hate modern, high performance video cards?
It's too bad that Microsoft won't license DirectX for any other platform. Imagine if there was a Mac version of DirectX, so that games could be ported easily and require very little rewriting.
If developers stopped emulating their games on OS X and started actually porting them then maybe we'd see a performance increase from this, but that'll never happen - it's too easy and cost effective to wrap games in an emulator and push it out than to actually take the time to build a native binary.
FYI, most Mac ports are native binaries. Every game from Feral has been a native Macintosh binary. I think the only people who try to emulate are Transgaming, with their Cider-powered games.
Edwin
Yes, companies like Aspyr and Feral Interactive are actually porting the games over natively, but the port is still not going to run as well as the original game. From what I understand, they have to convert the game from DirectX to OpenGL, which are two very different animals. It's probably not an easy task, and then it's probably not an easy task to optimize it so that it runs as well as its Windows counterpart.
...Why does Apple hate modern, high performance video cards?
That's close enough to the truth, although other things, like the performance levels and supported features of OpenGL on Mac (compared to OpenGL on Windows), no direct access to the graphics hardware (unlike Windows), and the range of OS versions and cards we need to support, also cause extra work and effort when porting.
Things like the new 3.0 support are great, but they assume the user has 10.6.3 installed and a compatible card; you cannot port a game assuming all the users will have the latest version of the latest operating system with a high-end card. Because of this, new features can take a while before they can be used in a new game, as they need to become standard across enough machines first.
Although GL (graphics) is an important part of a game's code, other items like audio, physics, threading etc. are all done differently on a Mac, so even if everyone suddenly used GL, ports would not suddenly become easy and instantly run quickly.
Edwin
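The gating described above is why shipped games pick a render path at startup from what the machine actually reports, rather than assuming the newest API. A minimal sketch (the path names and thresholds are illustrative, not from any real engine):

```python
# Sketch: selecting a render path at startup from the reported GL version
# and extension set, so older OS versions and cards still get a working path.

def choose_render_path(gl_version: tuple, extensions: set) -> str:
    if gl_version >= (3, 0):
        return "gl3_path"                  # newest features, fewest users
    if "GL_EXT_framebuffer_object" in extensions:
        return "gl2_fbo_path"              # common middle ground
    return "fixed_function_fallback"       # oldest cards / OS versions

print(choose_render_path((3, 2), set()))                          # gl3_path
print(choose_render_path((2, 1), {"GL_EXT_framebuffer_object"}))  # gl2_fbo_path
print(choose_render_path((1, 5), set()))                          # fixed_function_fallback
```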
Anyone bashing DirectX is a fanboy, plain and simple.
I'm a former OpenGL developer, and the Khronos Group ROYALLY screwed up the release of GL3.
They basically decided to please CAD developers instead of game developers when they released GL3.
Originally they promised to rewrite the state-based model of OpenGL into an object-based model. They didn't.
DirectX (Direct3D) is object-based, and there are many benefits (performance and others) that arise from an object-based system.
Hell, geometry shaders in GL were STILL an extension until a few months ago. Geometry shaders have been in hardware for YEARS now.
Anyone saying that you should just use the extensions has never been bitten in the ass by different GPU vendors supporting extensions to varying degrees. I recall one experience where I needed functionality (fp16 blending of framebuffers) and I couldn't even query the hardware via GL to see if the functionality was supported in hardware. I sent an email to ATi's developer relations, and their response was to maintain a list of cards that support it(!)...
GL is so messed up, it's really sad now. It could have been AWESOME. Khronos told us it would be!
Pretty much what I've heard too. DirectX has Microsoft fully behind it with support, documentation, consistently released drivers, an actual development staff, huge support from virtually every card maker out there, and a library of games that will make your head spin. OpenGL, from what I've read, has shoddy documentation on its website, random help articles online, random forum/blog posts, and is overall more difficult to develop for.
And it's not like Apple has exactly been amazing at pushing gaming on OS X, either.