
fredrum

macrumors newbie
Original poster
Dec 14, 2008
11
0
Hello,

Maybe someone here will be able to help me spot what I am doing wrong.

I am trying to write a 3D LUT Cg shader, as in
http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter24.html

that gets its data from my small OpenGL test program.

The problem is that I can't pass a texture colour to my tex3D() call as the xyz coordinates. This is what I am trying to do; it compiles, but the tex3D() call returns black:

float3 colorXYZ = tex2D( testTexture, inUV).rgb;
output.pixel = tex3D( lut, colorXYZ ).rgb;

I did some testing to make sure my 'colorXYZ' RGB values were within 0.0-1.0.
I also ran some tests, and both of the following work and look like I would expect.

-----------------------------------------------
float3 colorXYZ = tex2D( testTexture, inUV).rgb;
output.pixel = colorXYZ;
-----------------------------------------------
and
-----------------------------------------------
float3 testUV = float3(inUV[0], 0.0, inUV[1]);
output.pixel = tex3D( lut, testUV ).rgb;
-----------------------------------------------



So I am wondering what could be the difference that breaks my 'colorXYZ' lookup coordinates. What is the difference between

float3 colorXYZ = tex2D( testTexture, inUV).rgb;
and
float3 testUV = float3(inUV[0], 0.0, inUV[1]);

Both should just be float3s, right? Is anything else passed along by tex2D() that makes the data unsuitable as texture coordinates?
I am fairly sure this is how it is done in the OpenEXR example application source.
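(For what it's worth, nothing extra rides along with a tex2D() result; its .rgb swizzle is a plain float3. One real difference worth ruling out is the half-texel offset that the GPU Gems 2 chapter discusses: with an N^3 LUT, coordinates 0.0 and 1.0 land on the outer edges of the border texels rather than on their centers, so the colour is usually rescaled before the lookup. A small C sketch of that remap, with N = 16 as an assumed LUT size:)

```c
#include <assert.h>
#include <math.h>

#define N 16  /* assumed LUT resolution per axis */

/* Rescale a [0,1] channel so 0.0 hits the center of the first texel
   and 1.0 the center of the last, before using it as a tex3D coord. */
static float remap_to_texel_centers(float x) {
    return x * (float)(N - 1) / (float)N + 0.5f / (float)N;
}
```

(In the shader this would be the equivalent of something like `colorXYZ * ((N - 1.0) / N) + 0.5 / N` before the tex3D() call. It changes the result subtly rather than producing black, but it is cheap to check.)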

I am using the Cg runtime (I think that is what it is called) to compile the shader from within the application, and I am targeting the arbfp1 profile.
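(Since the texture is set up from the application, the host side is worth double-checking too: a sampler3D parameter only returns data if the 3D texture is actually bound to it, e.g. via cgGLSetTextureParameter() and cgGLEnableTextureParameter() before drawing, and the array handed to glTexImage3D() must have the red index varying fastest. A C sketch of filling an identity LUT in that flat layout; the 16^3 size and the upload shown in the comment are illustrative assumptions, not your code:)

```c
#include <assert.h>
#include <math.h>

#define N 16  /* assumed LUT resolution per axis */

/* Flat RGB array in the layout glTexImage3D() expects:
   red index varies fastest, then green, then blue. */
static float lut_data[N * N * N * 3];

static void fill_identity_lut(void) {
    int i = 0;
    for (int b = 0; b < N; b++)
        for (int g = 0; g < N; g++)
            for (int r = 0; r < N; r++) {
                lut_data[i++] = (float)r / (N - 1);
                lut_data[i++] = (float)g / (N - 1);
                lut_data[i++] = (float)b / (N - 1);
            }
}

/* The upload and Cg binding would then look roughly like:
   glTexImage3D(GL_TEXTURE_3D, 0, GL_RGB, N, N, N, 0,
                GL_RGB, GL_FLOAT, lut_data);
   cgGLSetTextureParameter(lutParam, texId);
   cgGLEnableTextureParameter(lutParam);
   where lutParam comes from cgGetNamedParameter(program, "lut"). */
```

(If cgGLEnableTextureParameter() is missed, the sampler reads an unbound unit and black is exactly what you get, which would match your symptom.)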



Grateful for any suggestions.
cheers
fred
 