Question

I have a three-dimensional array:

unsigned int window_RGBData[3][640][480];
void display(){
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glDrawPixels(window_Width, window_Height, GL_RGB, GL_UNSIGNED_INT, window_RGBData);
    glutSwapBuffers();
}

It only shows me a black screen, no matter what values are in the array. The array is laid out so that dimension 1 is red, dimension 2 is green, and dimension 3 is blue. When I use GL_UNSIGNED_BYTE instead of GL_UNSIGNED_INT I get lines of black and white (if the whole array is 255), so it is reading from the array, but I think I am not specifying the format of the array correctly to OpenGL. I could use

glBegin(GL_POINTS);
// code here
glEnd();

but for a specific reason that approach is not an option.

Any ideas on how to specify the format in the glDrawPixels call, or another way to do this?

Initialization, etc.:

int Image::OpenGLShow(){
    // Display onscreen
    int argc = 1;
    char* argv[1];
    argv[0] = strdup("Helloguys");
    glutInit(&argc, argv);
    window_Height = Height;
    window_Width = Width;
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
    glutInitWindowSize(Width, Height);
    glutCreateWindow("OpenGL glDrawPixels demo");

    glutDisplayFunc(display);
    glEnable(GL_DEPTH_TEST);

    glutMainLoop();
    return 0;
}

EDIT: I believe the problem is with the values I am putting in my array. Now this is how the array is being populated:

for (int i = 0; i < 200; i++) {
    for (int j = 0; j < 200; j++) {
        // rgb values at point (i, j) on screen
        window_RGBData[0][i][j] = red_value;   // 0-255
        window_RGBData[1][i][j] = green_value;
        window_RGBData[2][i][j] = blue_value;
    }
}

So basically I was assuming that glDrawPixels would combine the three channels together and display the pixel, depending on the format I provided, which is definitely not what glDrawPixels is doing. So should I change the way the array is being populated, or change the format value?


Solution

That array is too big to put on the stack.

Increase your stack size, allocate it on the heap (a heap-based sketch follows the example below), or try something smaller:

#include <GL/glut.h>
#include <cstdlib>

const unsigned int W = 200;
const unsigned int H = 200;

void display()
{
    glClearColor( 0, 0, 0, 1 );
    glClear( GL_COLOR_BUFFER_BIT );

    unsigned int data[H][W][3];
    for( size_t y = 0; y < H; ++y )
    {
        for( size_t x = 0; x < W; ++x )
        {
            // GL_UNSIGNED_INT values are normalized over the full 32-bit
            // range, so shift the 0-255 value into the top byte. The
            // unsigned literals keep the multiplication from overflowing int.
            data[y][x][0] = ( rand() % 256 ) * 256u * 256u * 256u;
            data[y][x][1] = ( rand() % 256 ) * 256u * 256u * 256u;
            data[y][x][2] = ( rand() % 256 ) * 256u * 256u * 256u;
        }
    }

    glDrawPixels( W, H, GL_RGB, GL_UNSIGNED_INT, data );

    glutSwapBuffers();
}

int main( int argc, char **argv )
{
    glutInit( &argc, argv );
    glutInitDisplayMode( GLUT_RGBA | GLUT_DOUBLE );
    glutInitWindowSize( W, H );
    glutCreateWindow( "GLUT" );
    glutDisplayFunc( display );
    glutMainLoop();
    return 0;
}
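
If you need the full-size buffer, the "allocate it on the heap" option could look roughly like the sketch below. This is only a minimal sketch, assuming the same GL_RGB / GL_UNSIGNED_INT format as the question; a std::vector keeps the multi-megabyte buffer off the stack, and the fill simply paints the window solid red:

#include <GL/glut.h>
#include <vector>

const unsigned int W = 640;
const unsigned int H = 480;

// Interleaved RGB buffer on the heap: three unsigned ints per pixel.
std::vector<unsigned int> pixels( W * H * 3 );

void display()
{
    glClear( GL_COLOR_BUFFER_BIT );

    for( size_t y = 0; y < H; ++y )
    {
        for( size_t x = 0; x < W; ++x )
        {
            const size_t i = 3 * ( y * W + x );
            // GL_UNSIGNED_INT is normalized over the full 32-bit range,
            // so shift the 0-255 value into the top byte.
            pixels[i + 0] = 255u << 24; // red
            pixels[i + 1] = 0u;         // green
            pixels[i + 2] = 0u;         // blue
        }
    }

    glDrawPixels( W, H, GL_RGB, GL_UNSIGNED_INT, pixels.data() );
    glutSwapBuffers();
}

int main( int argc, char **argv )
{
    glutInit( &argc, argv );
    glutInitDisplayMode( GLUT_RGBA | GLUT_DOUBLE );
    glutInitWindowSize( W, H );
    glutCreateWindow( "GLUT heap buffer" );
    glutDisplayFunc( display );
    glutMainLoop();
    return 0;
}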

OTHER TIPS

Instead of

unsigned int window_RGBData[3][640][480];

try

unsigned int window_RGBData[480][640][3];

and of course fill the buffer accordingly.

Multidimensional arrays in C(++) are merely syntactic sugar. window_RGBData is passed to glDrawPixels as a pointer; it points to an area in memory, and glDrawPixels interprets the values there according to the format and type parameters. In this case the values should be unsigned integers, laid out in the following order (a small fill-loop sketch follows the listing):

row1_column1.r row1_column1.g row1_column1.b
row1_column2.r row1_column2.g row1_column2.b
...
row1_column640.r row1_column640.g row1_column640.b
row2_column1.r row2_column1.g row2_column1.b
row2_column2.r row2_column2.g row2_column2.b
...
row2_column640.r row2_column640.g row2_column640.b
...
row480_column1.r row480_column1.g row480_column1.b
row480_column2.r row480_column2.g row480_column2.b
...
row480_column640.r row480_column640.g row480_column640.b
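
A fill loop for the re-ordered window_RGBData[480][640][3] array might look like this sketch. Here red_value, green_value and blue_value stand in for whatever 0-255 values you compute per pixel (as in the question), and they are shifted into the top byte because GL_UNSIGNED_INT is normalized over the full 32-bit range:

for (int y = 0; y < 480; y++) {
    for (int x = 0; x < 640; x++) {
        // One pixel = three adjacent unsigned ints in R, G, B order.
        window_RGBData[y][x][0] = (unsigned int)red_value   << 24;
        window_RGBData[y][x][1] = (unsigned int)green_value << 24;
        window_RGBData[y][x][2] = (unsigned int)blue_value  << 24;
    }
}

glDrawPixels(640, 480, GL_RGB, GL_UNSIGNED_INT, window_RGBData);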

See also http://mycodelog.com/tag/gldrawpixels/

Be aware of the alignment as well:

width × height pixels are read from memory, starting at location data. By default, these pixels are taken from adjacent memory locations, except that after all width pixels are read, the read pointer is advanced to the next four-byte boundary. The four-byte row alignment is specified by glPixelStorei with argument GL_UNPACK_ALIGNMENT, and it can be set to one, two, four, or eight bytes.

Source: http://www.opengl.org/sdk/docs/man2/xhtml/glDrawPixels.xml
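
For example, if you switch to GL_UNSIGNED_BYTE data and the row size in bytes is not a multiple of four, you can relax the unpack alignment before drawing (with GL_UNSIGNED_INT components each row is already a multiple of four bytes, so this mostly matters for byte data). In this sketch, width, height and data are placeholders for your own image:

glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // rows are tightly packed, no 4-byte row padding
glDrawPixels(width, height, GL_RGB, GL_UNSIGNED_BYTE, data);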

An alternative way of declaring and filling the buffer using a single dimensional array:

unsigned window_RGBData[640 * 480 * 3]; // 640 mod 4 == 0
...
window_RGBData[3 * (y * 640 + x) + 0] = ColorYouWantAtXY(x, y).red;
window_RGBData[3 * (y * 640 + x) + 1] = ColorYouWantAtXY(x, y).green;
window_RGBData[3 * (y * 640 + x) + 2] = ColorYouWantAtXY(x, y).blue;

Note that here we exploited the fact that 640 modulo 4 == 0, so there were no issues with the alignment.
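
Since the source values are 0-255 anyway, a GL_UNSIGNED_BYTE buffer is also a natural fit and needs no scaling. Below is a sketch of the same flat-array idea with bytes; the x % 256 / y % 256 values are just a placeholder gradient, and 640 × 3 bytes per row is a multiple of four, so the default alignment is fine:

// Kept static so the ~900 KB buffer stays off the stack (see the Solution above).
static unsigned char bytes[480 * 640 * 3];

for (int y = 0; y < 480; y++) {
    for (int x = 0; x < 640; x++) {
        bytes[3 * (y * 640 + x) + 0] = (unsigned char)(x % 256); // red   (placeholder gradient)
        bytes[3 * (y * 640 + x) + 1] = (unsigned char)(y % 256); // green (placeholder gradient)
        bytes[3 * (y * 640 + x) + 2] = 0;                        // blue
    }
}

glDrawPixels(640, 480, GL_RGB, GL_UNSIGNED_BYTE, bytes);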

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow