Question

I want to dump raw texture data to disk (so I can read it back later), but I'm not sure whether glReadPixels will read from the currently bound texture.

How can I read the buffer from my texture?


Solution

glReadPixels reads from framebuffers, not textures. To read a texture object directly you would use glGetTexImage, but it isn't available in OpenGL ES :(
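(For contrast, on desktop OpenGL the direct read would be something like the following; this is only a sketch, it won't compile on ES, and it assumes the same my_texture/width/height variables used below:)

//Desktop OpenGL only (not available in ES): read level 0 of the bound texture into client memory
glBindTexture(GL_TEXTURE_2D, my_texture);
GLubyte* texture_data = (GLubyte*) malloc(width * height * 4);
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, texture_data);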

If you want to read your texture's contents, you can attach it to an FBO (Framebuffer Object) and use glReadPixels:

//Generate a new FBO. It will contain your texture.
glGenFramebuffersOES(1, &offscreen_framebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, offscreen_framebuffer);

//Create the texture 
glGenTextures(1, &my_texture);
glBindTexture(GL_TEXTURE_2D, my_texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,  width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);

//Bind the texture to your FBO
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_TEXTURE_2D, my_texture, 0);

//Check that the framebuffer is complete
GLenum status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES);
if(status != GL_FRAMEBUFFER_COMPLETE_OES) {
    printf("failed to make complete framebuffer object %x\n", status);
}

Then, whenever you want to read from your texture, just bind its FBO and call glReadPixels:

//Bind the FBO
glBindFramebufferOES(GL_FRAMEBUFFER_OES, offscreen_framebuffer);
// set the viewport, as the FBO may not have the same dimensions as the screen
glViewport(0, 0, width, height);

GLubyte* pixels = (GLubyte*) malloc(width * height * sizeof(GLubyte) * 4);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

//Bind your main FBO again
glBindFramebufferOES(GL_FRAMEBUFFER_OES, screen_framebuffer);
// restore the viewport to the screen dimensions
glViewport(0, 0, screen_width, screen_height);
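Since you mentioned dumping the raw data to disk, a minimal sketch of the write-out could look like this (the file name is just an example, and you'll need <stdio.h>/<stdlib.h> for fopen/free):

//Write the raw RGBA8888 buffer to disk so it can be loaded back later
FILE* f = fopen("texture_dump.raw", "wb");
if (f) {
    fwrite(pixels, 1, width * height * 4, f);
    fclose(f);
}
free(pixels);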

OTHER TIPS

Thanks for the answer, Gergonzale. I spent some time this morning figuring out how to get this to work with 16-bit textures; this code snippet may be useful to anyone else converting GL_UNSIGNED_SHORT_5_6_5 to GL_UNSIGNED_BYTE:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB,  tSizeW, tSizeH, 0, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, NULL);
GLubyte* pixels = (GLubyte*) malloc(tSizeW * tSizeH * sizeof(GLubyte) * 2);
glReadPixels(0, 0, tSizeW, tSizeH, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels);

// Index of the pixel at (x, y): glReadPixels returns rows of tSizeW pixels, 2 bytes each
int index = (y*tSizeW + x)*2;
unsigned int rgb = pixels[index + 1]*256 + pixels[index + 0]; // little-endian RGB565
unsigned int r = rgb;
r &= 0xF800;    // 1111 1000 0000 0000
r >>= 11;       // 0001 1111
r *= (255/31.); // Convert from 31 max to 255 max

unsigned int g = rgb;
g &= 0x7E0;     // 0000 0111 1110 0000
g >>= 5;        // 0011 1111
g *= (255/63.); // Convert from 63 max to 255 max

unsigned int b = rgb;
b &= 0x1F;      // 0000 0000 0001 1111
//b >>= 0;      // 0001 1111
b *= (255/31.); // Convert from 31 max to 255 max
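Putting it together, here is a rough sketch of a helper that converts the whole 5-6-5 buffer to a packed 8-8-8 buffer (the name rgb565_to_rgb888 is just my own; the caller must free() the result, and malloc needs <stdlib.h>):

// Convert a w x h buffer of little-endian RGB565 pixels to tightly packed RGB888.
unsigned char* rgb565_to_rgb888(const unsigned char* src, int w, int h)
{
    unsigned char* dst = (unsigned char*) malloc(w * h * 3);
    for (int i = 0; i < w * h; ++i) {
        unsigned int rgb = src[i*2 + 1]*256 + src[i*2 + 0];
        dst[i*3 + 0] = (unsigned char)(((rgb >> 11) & 0x1F) * 255 / 31); // R: 5 bits
        dst[i*3 + 1] = (unsigned char)(((rgb >> 5)  & 0x3F) * 255 / 63); // G: 6 bits
        dst[i*3 + 2] = (unsigned char)(( rgb        & 0x1F) * 255 / 31); // B: 5 bits
    }
    return dst;
}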