Question

I'm trying to render an object (say, a cube) with OpenGL 1.1 (I know that doesn't make much sense nowadays, but I have to use it). Everything works fine until I try to add lighting.

Here's the problem:

Screenshot

The global variable I set is:

static GLfloat light_position[] = {1.0, 1.0, 2*cZ.x, 0.0};
// cZ.x is the minimum z of the mesh. I know w = 0.0 puts the
// light at infinity, but it doesn't work with w = 1.0 either.
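
For reference, my understanding of the w component, and the two variants I tried (a sketch; the names light_dir / light_pos are just for illustration, the cZ.x value is the same as above):

    // w = 0.0: directional light, (x, y, z) is the direction towards the light.
    static GLfloat light_dir[] = {1.0, 1.0, 2*cZ.x, 0.0};
    // w = 1.0: positional light, (x, y, z) is an actual point in the scene.
    static GLfloat light_pos[] = {1.0, 1.0, 2*cZ.x, 1.0};

    // In both cases the values are transformed by the modelview matrix that is
    // current at the moment glLightfv(GL_LIGHT0, GL_POSITION, ...) is called.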

In the main function:

...
    glMatrixMode(GL_MODELVIEW);     // Select The Modelview Matrix
    glLoadIdentity();               // Reset The Modelview Matrix
    glEnable(GL_LIGHT0);
    glEnable(GL_LIGHTING);
    glShadeModel(GL_SMOOTH);        // Enable Smooth Shading

....

Drawing mesh k:

// While drawing mesh k

        GLfloat light_ambient[] = {COLOUR[k][0], COLOUR[k][1], COLOUR[k][2], 1.0}; 
        GLfloat light_diffuse[] = {COLOUR[k][0], COLOUR[k][1], COLOUR[k][2], 1.0};  
        glLightfv(GL_LIGHT0, GL_POSITION, light_position);
        glLightfv(GL_LIGHT0, GL_DIFFUSE, light_diffuse);
        glLightfv(GL_LIGHT0, GL_AMBIENT, light_ambient);


....
        //This is a mesh, so will be drawn using triangles
        glBegin(GL_TRIANGLES);
        //Triangles will be defined by vertex indices in faces
        for (unsigned int i = 0; i<mesh->faces.size(); i++){

            int index1 = mesh->faces.at(i).x;
            int index2 = mesh->faces.at(i).y;
            int index3 = mesh->faces.at(i).z;
            glNormal3f(mesh->normals.at(i).x,mesh->normals.at(i).y,mesh->normals.at(i).z);
            glVertex3f(mesh->vertices.at(index1).x, mesh->vertices.at(index1).y, mesh->vertices.at(index1).z);
            glVertex3f(mesh->vertices.at(index2).x, mesh->vertices.at(index2).y, mesh->vertices.at(index2).z);
            glVertex3f(mesh->vertices.at(index3).x, mesh->vertices.at(index3).y, mesh->vertices.at(index3).z);
        }
        glEnd();

....
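
For reference, my understanding is that the glLightfv calls above change the light itself rather than the current mesh, so a per-mesh colour would normally go through the material state instead. A rough sketch of that alternative, using the same COLOUR[k] array (this is not something my current code does):

        // Sketch: give mesh k its own colour via the material state instead of
        // re-configuring GL_LIGHT0 for every mesh.
        GLfloat mat_colour[] = {COLOUR[k][0], COLOUR[k][1], COLOUR[k][2], 1.0};
        glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, mat_colour);

        // Or, equivalently, let glColor* drive the material:
        // glEnable(GL_COLOR_MATERIAL);
        // glColorMaterial(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE);
        // glColor3f(COLOUR[k][0], COLOUR[k][1], COLOUR[k][2]);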

The normals are computed as follows:

    glm::vec3 currFace = m->faces.at(faceIndex);
    glm::vec3 vert1 = m->vertices.at(currFace.x);
    glm::vec3 vert2 = m->vertices.at(currFace.y);
    glm::vec3 vert3 = m->vertices.at(currFace.z);

    glm::vec3 side1 = (vert2 - vert1);
    glm::vec3 side2 = (vert3 - vert1);

    glm::vec3 normal = glm::cross(side1, side2);

    normal = glm::normalize(normal);
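
For completeness, the whole normal pass is essentially this loop over the faces (condensed from the snippet above; I'm assuming here that m->normals is filled in the same order as m->faces):

    // One normal per face, stored in face order (e.g. via push_back).
    for (unsigned int faceIndex = 0; faceIndex < m->faces.size(); ++faceIndex) {
        glm::vec3 currFace = m->faces.at(faceIndex);
        glm::vec3 vert1 = m->vertices.at(currFace.x);
        glm::vec3 vert2 = m->vertices.at(currFace.y);
        glm::vec3 vert3 = m->vertices.at(currFace.z);
        // Counter-clockwise winding assumed, so the cross product points outwards.
        glm::vec3 normal = glm::normalize(glm::cross(vert2 - vert1, vert3 - vert1));
        m->normals.push_back(normal);
    }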

I'm really struggling to understand what's wrong; can you point me in the right direction?

EDIT: The same thing happens with the Stanford bunny (taken from the Stanford repository, so the mesh is well formed):

http://imgur.com/Z6225QG


Solution

Looking at your picture, it looks like the issue is not so much that the object isn't shaded as that some of its faces appear transparent.

I had a similar problem while learning OpenGL; in my case I had forgotten to enable depth testing. You can fix it by adding this line to your GL init function:

 glEnable(GL_DEPTH_TEST);

Give it a try.
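
For reference, a minimal sketch of where that fits (the glutInitDisplayMode line is only relevant if you happen to create your context with GLUT):

    // In the GL init function: enable depth testing so nearer fragments
    // hide farther ones.
    glEnable(GL_DEPTH_TEST);

    // Each frame: clear the depth buffer along with the colour buffer.
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // If the context is created with GLUT, the window also needs a depth buffer:
    // glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);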

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow