
Krevnik

macrumors 601
Sep 8, 2003
4,100
1,310
This isn't quite on target. The new Leopard OpenGL includes a technology known as LLVM, which is a dynamic code generator, or JIT. It is invoked (as far as I can tell) when an application gives GL a shader that cannot run directly on the GPU's hardware; an example would be vertex shaders on a system with Intel GMA graphics.

It is invoked on the OGL stack itself as well as on shaders. OGL and DX both have some CPU code that needs to run before a fragment can be passed on to the GPU: light settings, texture settings, and so on. All of this translates from the OGL state machine into instructions to feed to the GPU, and it tends to be branchy, wasting CPU time checking whether LIGHT_0 is enabled, yadda, yadda, yadda.
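To make that concrete, here is a minimal sketch in C of the kind of branchy per-draw work being described. This is a hypothetical illustration, not Apple's actual driver code:

/* Hypothetical sketch of the branchy per-draw CPU work in a
 * fixed-function GL stack: every draw re-checks state bits before
 * emitting GPU commands. */
#include <stdbool.h>
#include <stdio.h>

typedef struct {
    bool light0_enabled;
    bool texture2d_enabled;
    bool fog_enabled;
} GLStateMachine;

/* Runs once per draw call: each branch costs CPU time even when the
 * answer never changes from frame to frame. A code generator like
 * LLVM can instead emit a straight-line routine specialized to the
 * current state, with the dead branches optimized away. */
static void emit_commands(const GLStateMachine *s)
{
    if (s->light0_enabled)    printf("emit: lighting setup for LIGHT_0\n");
    if (s->texture2d_enabled) printf("emit: bind 2D texture unit\n");
    if (s->fog_enabled)       printf("emit: fog table upload\n");
}

int main(void)
{
    GLStateMachine s = { true, true, false };
    emit_commands(&s);  /* imagine this running thousands of times per frame */
    return 0;
}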

Shaders get a boost by using LLVM to optimize them as an intermediate step between the language they are written in (GLSL or whatever) and the instructions the GPU receives.
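For reference, this is the point in the standard GL 2.0 API where that compile happens from the app's point of view; the LLVM step sits behind glCompileShader() inside Apple's stack. A minimal sketch, assuming a current GL context:

/* Standard GL 2.0 shader compile; on Leopard, the driver-side
 * optimization described above happens inside glCompileShader(). */
#include <OpenGL/gl.h>
#include <stdio.h>

GLuint compile_fragment_shader(const char *glsl_source)
{
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &glsl_source, NULL);
    glCompileShader(shader);   /* driver compiles (and optimizes) here */

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof log, NULL, log);
        fprintf(stderr, "compile failed: %s\n", log);
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}

Nothing in the API changes; where the result runs (GPU or CPU fallback) is entirely the driver's business.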

But yeah, you are right about LLVM. In fact, if you want a better understanding of what Apple said about LLVM at WWDC 2006, there is a thread on the LLVM mailing lists that discusses it: http://lists.cs.uiuc.edu/pipermail/llvmdev/2006-August/006492.html

Pre-Leopard OpenGL already did a form of dynamic code generation for those cases, but LLVM does it better.

It isn't accurate to say that the new GL is based on bytecode or runs in some kind of virtual machine. It's a large C library.

A lot better, in certain cases. I use WoW as an example not because it received a huge speedup (it only got something like a 7-8% speedup /at most/), but because the same WoW build showed a large drop in CPU usage on 10.5 versus 10.4.10. That CPU usage drop is important in games that are more CPU-bound.

Using LLVM or a JIT /is/ effectively running in a VM. It might be an extremely lightweight VM, but hell, that is the whole /point/ of LLVM: to be a Low-Level Virtual Machine. :D

As for bytecode, the shell APIs aren't bytecode, but the stack itself is. If you poke around Leopard's OGL framework, you will even see the arch-targeted bytecode files.

A goal for a developer writing a high-performing app is to stay off as many of the paths that might invoke the dynamic code generator as possible, because all of those paths lead to cycles being spent on the CPU instead of the GPU. In the case of the GMA 950 it is unavoidable (there is no vertex shader hardware; it can only do the pixel shaders), but LLVM will do a better job there than the old code in Tiger.
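If you want to know whether you have landed on a software path, the ARB program extensions do let you ask whether a loaded program fits the hardware's native limits; a zero answer means the driver will run it on the CPU. A small sketch (this covers ARB assembly programs only, and assumes a current context with the extension present; GLSL in this era has no equivalent query):

/* Ask whether the currently bound ARB vertex program fits the
 * hardware's native limits (part of ARB_vertex_program). */
#include <OpenGL/gl.h>
#include <OpenGL/glext.h>

int vertex_program_is_native(void)
{
    GLint native = 0;
    glGetProgramivARB(GL_VERTEX_PROGRAM_ARB,
                      GL_PROGRAM_UNDER_NATIVE_LIMITS_ARB,
                      &native);
    return native;  /* 0 => CPU fallback, e.g. on a GMA 950 */
}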

What you want to do is avoid invoking the code generator on the same code block repeatedly. You do this by making sure you don't keep mucking with the OGL state unless you have to.

A good app running on a discrete GPU such as ATI or NVIDIA parts should never have to invoke any LLVM-generated code if it's set up right.

Actually, you will get LLVM-generated code, but if you do it right, you shouldn't have LLVM constantly regenerating that code. The whole point is to more reliably and accurately optimize away branches and the like. You lose the benefit if you keep turning LIGHT_2 on and off, for example.
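A minimal sketch of that advice in C: shadow the enables on the app side and only touch GL when the value actually changes, so the stack isn't revalidating (and potentially regenerating code for) the same state over and over. The cache here is hypothetical app code, not part of GL:

/* App-side state shadowing: skip GL calls that wouldn't change
 * anything, so the stack isn't forced to revalidate the same state. */
#include <OpenGL/gl.h>
#include <stdbool.h>

static bool light2_on = false;  /* must match the real GL state at init */

static void set_light2(bool enable)
{
    if (enable == light2_on)
        return;                  /* redundant change: don't poke the driver */
    if (enable) glEnable(GL_LIGHT2);
    else        glDisable(GL_LIGHT2);
    light2_on = enable;
}

Sorting your draw calls so that geometry sharing the same state is drawn together gets you the same effect at a larger scale.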

BTW, adding LLVM to GL in Leopard really had no connection with the WoW speedups that started in the Intel Mac GL stack in 10.4.6 and continued through 10.4.9. Those came from other factors (new GL extensions and the multi-threaded driver).
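For anyone curious, that multi-threaded engine is opt-in per context via CGL (Apple documents it in Technical Note TN2085). A sketch, assuming a current context on Leopard:

/* Opt into Apple's multi-threaded GL engine for the current context. */
#include <OpenGL/OpenGL.h>
#include <stdio.h>

void enable_mt_gl(void)
{
    CGLContextObj ctx = CGLGetCurrentContext();
    CGLError err = CGLEnable(ctx, kCGLCEMPEngine);
    if (err != kCGLNoError)
        fprintf(stderr, "multi-threaded engine unavailable (err %d)\n", err);
}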

Never really argued that, and as I said before, the CPU usage change of the LLVM-based OGL stack versus the old one is the huge win. You get fewer random CPU bottlenecks that cause drops in FPS or prevent you from running a given level of AI, etc.
 

rbarris

macrumors 6502
Oct 28, 2003
358
0
Irvine CA
Interesting. So, you would then expect this dynamically generated code to show up in, say, a Shark profile?

Even dynamically generated code has to be executed, so it should be visible, right? Even if it was only, say, 0.5% or something.

I'll go looking for that, thanks for the added info.
 

Krevnik

macrumors 601
Sep 8, 2003
4,100
1,310
Interesting. So, you would then expect this dynamically generated code to show up in, say, a Shark profile?

Even dynamically generated code has to be executed, so it should be visible, right? Even if it was only, say, 0.5% or something.

I'll go looking for that, thanks for the added info.

Yes. The catch is that how you use OGL, and exactly what features your GPU has, determine how much of it you'll see (modern GPUs take on a big chunk of the vertex-processing task, so LLVM's usefulness may be limited there).

In some ways you are right: on a modern GPU, LLVM will mostly benefit shaders, by using LLVM bytecode as an intermediate optimization language that lets OGL go quickly to GPU instructions, or to CPU instructions if the shader won't run or fit on the GPU.

But really, anything that provides me more CPU cycles for other things beyond processing polygons is a boon in my book.
 

voyagerd

macrumors 65816
Jun 30, 2002
1,498
251
Rancho Cordova, CA
And is it going to be moot, because OpenGL 3.0 is supposed to be 3 months beyond that?

Mac OS 10.5 does have all of the OpenGL 2.1 extensions, which include GLSL 1.20.

OpenGL 3.0's release is imminent. A lot of people expected it by late September or early October. I've been looking forward to it. Khronos just has to approve it and make the specification public. After that, Apple, ATI, and NVIDIA can start upgrading their OpenGL software and drivers with the new extensions.
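In the meantime, the way to see exactly what version and GLSL level your Mac's driver exposes is to ask at runtime (this is essentially what utilities like the glview tool linked later in the thread do). A minimal sketch, assuming a current GL context:

/* Print the driver's reported GL version, GLSL version, and renderer. */
#include <OpenGL/gl.h>
#include <stdio.h>

void print_gl_caps(void)
{
    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    printf("GLSL:        %s\n", (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
}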
 

AppleWoW

macrumors regular
Sep 22, 2007
125
0
SLC, Utah
So could we see an OpenGL 3.0 update some time after Leopard has been out (10.5.3? etc.) <--- meaning in a future update of Leopard.
 

MacsRgr8

macrumors G3
Sep 8, 2002
8,290
1,783
The Netherlands
Let me get this straight:

Tiger supports OpenGL 1.1 - 1.5
Tiger 10.4.8 (Intel) can support OpenGL 2.0 on some grfx cards
Leopard supports OpenGL 2.1 (highend grfx?)
Leopard 10.5.x will hopefully support OpenGL 3.0

Right...

Now, as this is the "Apple Games" section, does anybody have any idea which games actually use any of the OpenGL 2.x features?

When a new DirectX API comes our way, EVERY gamer knows what the advantages are, which OS is required, which hardware is necessary, and... which games use the new technologies; the developers show off those new technologies in their games as headline features and put them into online movies.

Are there any demos or movies on the 'net showing off OpenGL 2.x (or 3.0) features and technologies which could find their way to the Mac in the near future?
 

Sijmen

macrumors 6502a
Sep 7, 2005
709
1
time of day lighting
real-time ambient lightmaps
dynamic soft shadows
lightbeams
long-range viewdistance
parallax occlusion mapping
motion blur
depth of field

I've seen some of these features implemented in DX 9.
 

voyagerd

macrumors 65816
Jun 30, 2002
1,498
251
Rancho Cordova, CA
Let me get this straight:

Any demos or movies showing-off OpenGL 2.x (or 3.0) features and technologies which could find its way in a Mac in the near future on the 'net?

This has benchmarks for all the versions of OpenGL up to 2.1: http://www.realtech-vr.com/glview/download.html

ATI demos:
SmartShader 1.0: http://ati.amd.com/developer/demos/macss1/index.html
SmartShader 2.0 and HD: http://ati.amd.com/developer/demos/macss2/index.html

I assume the HD ones at least use extensions from newer versions of OpenGL, such as the Subsurface Scattering one.
 