
Bunkum · macrumors newbie · Original poster · Jun 4, 2010
I'm trying to create a CIImage from a texture rendered into an FBO, so I can apply some filters to it and then draw it to my OpenGL view.

What I'm doing so far is creating a CIContext from the CGLContextObj that the FBO and texture belong to, then setting up the OpenGL state (viewport, projection matrix, etc.) so that when the CIContext draws the result, it comes out the way I want. Then I create the CIImage using imageWithTexture:size:flipped:colorSpace:, where the texture is the one attached to the FBO. Skipping the filters for the moment, I then draw the image with the CIContext via drawImage:inRect:fromRect:.
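For reference, a minimal sketch of that flow; cglContext, pixelFormat, fboTexture, texWidth and texHeight are placeholders for whatever your own setup provides:

Code:
#import <QuartzCore/QuartzCore.h>
#import <OpenGL/OpenGL.h>

// Core Image context tied to the same CGL context the FBO lives in.
CIContext *ciContext = [CIContext contextWithCGLContext:cglContext
                                            pixelFormat:pixelFormat
                                             colorSpace:nil
                                                options:nil];

// Wrap the FBO's texture in a CIImage.
CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
CIImage *image = [CIImage imageWithTexture:fboTexture
                                      size:CGSizeMake(texWidth, texHeight)
                                   flipped:NO
                                colorSpace:rgb];
CGColorSpaceRelease(rgb);

// Filters skipped for now; draw the texture-backed image directly.
[ciContext drawImage:image
              inRect:CGRectMake(0, 0, texWidth, texHeight)
            fromRect:CGRectMake(0, 0, texWidth, texHeight)];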

Now I believe the problem is with the CIImage (though it returns a CIImage object, not nil), or possibly I'm not setting up some state that I should be, since if I switch to a different kind of CIImage, such as one from imageWithColor:, it works correctly. And when I try to draw the texture-backed CIImage as an NSImage (creating an NSCIImageRep for it and adding that as the representation for the NSImage), I get the same result as when I draw it directly: a completely black image.
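The NSImage check looks roughly like this, where image is the texture-backed CIImage from above (the code that actually draws nsImage is omitted):

Code:
// Same CIImage, wrapped for AppKit drawing via NSCIImageRep.
NSCIImageRep *rep = [NSCIImageRep imageRepWithCIImage:image];
NSImage *nsImage = [[NSImage alloc] initWithSize:[rep size]];
[nsImage addRepresentation:rep];
// Drawing nsImage gives the same completely black result.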


Another question I have (unrelated to the problem): how do we use Core Image in a similar way (GL texture -> apply some Core Image filter -> GL draw buffer) with OpenGL 3.x when using the Core profile only (e.g. gl3.h)? There we have to handle the projection matrix and similar state ourselves, so how do you make Core Image aware of that?


EDIT: As for the problem, it turns out Core Image expects a GL_TEXTURE_RECTANGLE texture rather than a GL_TEXTURE_2D one.
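In other words, the FBO needs to be backed by a rectangle texture, something along these lines (width and height are placeholders, and this uses the EXT framebuffer object entry points that were current at the time):

Code:
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
// Rectangle textures have no mipmaps; NEAREST/LINEAR filtering only.
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA8,
             width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
// Attach it as the FBO's color buffer instead of a GL_TEXTURE_2D.
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_RECTANGLE_ARB, tex, 0);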
 
Last edited: