Question

I'm writing an app for Mac OS >= 10.6 that creates OpenGL textures from images loaded from disk.

First, I load the image into an NSImage. Then I get the NSBitmapImageRep from the image and load the pixel data into a texture using glTexImage2D.
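In outline, the loading path looks like this (a minimal sketch; it assumes the image's first representation is an NSBitmapImageRep with tightly packed pixels, and omits error handling):

    #import <Cocoa/Cocoa.h>
    #import <OpenGL/gl.h>

    // Minimal sketch of the loading path described above.
    GLuint textureFromFile(NSString *path)
    {
        NSImage *image = [[NSImage alloc] initWithContentsOfFile:path];
        NSBitmapImageRep *rep =
            (NSBitmapImageRep *)[[image representations] objectAtIndex:0];

        // 3 samples/pixel -> RGB, 4 samples/pixel -> RGBA.
        GLenum format = ([rep samplesPerPixel] == 4) ? GL_RGBA : GL_RGB;

        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                     (GLsizei)[rep pixelsWide], (GLsizei)[rep pixelsHigh],
                     0, format, GL_UNSIGNED_BYTE, [rep bitmapData]);
        [image release];
        return tex;
    }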

For RGB or RGBA images, it works perfectly. I can pass in either 3 bytes/pixel of RGB or 4 bytes/pixel of RGBA, and create a 4-byte/pixel RGBA texture.

However, I just had a tester send me a JPEG image (shot on a Canon EOS 50D, not sure how it was imported) that seems to have ARGB byte ordering.

I found a post in this thread (http://www.cocoabuilder.com/archive/cocoa/12782-coregraphics-over-opengl.html) that suggests passing a format parameter of GL_BGRA to glTexImage2D, with a type of GL_UNSIGNED_INT_8_8_8_8_REV.

That seems logical, and like it should work, but it doesn't: I get different, but still wrong, color values.

I wrote "swizzling" (manual byte-swapping) code that shuffles the ARGB image data into a new RGBA buffer, but this byte-by-byte swizzling is going to be slow for large images.

I would also like to understand how to make this work "the right way".

What is the trick to loading ARGB data into an RGBA OpenGL texture?

My current call to glTexImage2D looks like this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, newWidth, newHeight, 0, format, GL_UNSIGNED_BYTE, pixelBuffer);

where format is either GL_RGB or GL_RGBA.

I tried using:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, newWidth, newHeight, 0, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixelBuffer);

when my image rep reports that it is in "alpha first" order.
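The selection logic is roughly this (NSAlphaFirstBitmapFormat is the AppKit bitmapFormat flag for alpha-first data; rep, newWidth, and newHeight are as above):

    GLenum format = GL_RGBA;
    GLenum type   = GL_UNSIGNED_BYTE;
    if ([rep bitmapFormat] & NSAlphaFirstBitmapFormat) {
        // Alpha-first (ARGB) data: use the suggested format/type pair.
        format = GL_BGRA;
        type   = GL_UNSIGNED_INT_8_8_8_8_REV;
    }
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, newWidth, newHeight, 0,
                 format, type, [rep bitmapData]);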

As a second question, I've also read that most graphics cards' "native" format is GL_BGRA, so creating a texture in that format results in faster texture drawing. The speed of texture drawing matters more to me than the speed of loading the texture, so "swizzling" the data to BGRA format up front would be worth it.

I tried asking OpenGL to create a BGRA texture by specifying an "internalformat" of GL_BGRA, but that results in a completely black image. My reading of the docs led me to expect that glTexImage2D would byte-swap the data as it reads it when the source and internal formats differ, but instead I get an OpenGL error 0x0500 (GL_INVALID_ENUM) when I specify an "internalformat" of GL_BGRA. What am I missing?
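For completeness, here is a minimal repro of that failing call (assumes a bound texture and a valid pixelBuffer, as in the calls above):

    // GL_BGRA is accepted as a *pixel* format, but on my system it is
    // rejected as an *internal* format, so the texture stays black.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA, newWidth, newHeight, 0,
                 GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixelBuffer);
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        NSLog(@"glTexImage2D failed: 0x%04X", err); // logs 0x0500 (GL_INVALID_ENUM)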

