Question

I am trying to create some basic lighting shaders in LWJGL. Everything seems to work except that when I rotate the camera, the lighting changes with it. I think this is because the normals are being rotated along with the camera.

Here is my original Vertex Shader:

uniform vec3 lightDir;
varying vec3 normal;

void main()
{       
    normal = gl_NormalMatrix*gl_Normal;
    gl_Position = ftransform();
}

My Fragment Shader:

varying vec3 normal;

void main(){
    vec3 color = vec3(1,1,1);
    vec3 lightDir = vec3(1,1,0);
    float inten = 1;
    color = color*dot(normal, lightDir)*inten;
    gl_FragColor = vec4(color,1);

}

To transform the camera I used:

public static void applyTranslations() {
    glPushAttrib(GL_TRANSFORM_BIT);
    glMatrixMode(GL_MODELVIEW);
    glRotatef(pitch, 1, 0, 0);
    glRotatef(yaw, 0, 1, 0);
    glRotatef(roll, 0, 0, 1);
    glTranslatef(-x, -y, -z);
    glPopAttrib();
}

I realized that this method of transforming the camera might actually be rotating the model while the camera stays static, which would explain the messed-up normals, so I tried passing in a uniform matrix containing the model's rotation and using it to transform the normals. That doesn't seem to work either: now the entire model is black. (It was working before, apart from the rotation problem.) I used this to pass the matrix to the shader:

Vector3f scale, rot, trans;
Matrix3f modelMatrix = new Matrix3f();
modelMatrix.setIdentity();
scale = new Vector3f(1,1,1);
rot = new Vector3f(xRot,yRot,zRot);
trans = new Vector3f(x,y,z);

Matrix4f.scale(scale, modelMatrix, modelMatrix);
Matrix4f.translate(trans, modelMatrix, modelMatrix);
Matrix4f.rotate((float) Math.toRadians(rot.z), new Vector3f(0,0,1), modelMatrix, modelMatrix);
Matrix4f.rotate((float) Math.toRadians(rot.y), new Vector3f(0,1,0), modelMatrix, modelMatrix);
Matrix4f.rotate((float) Math.toRadians(rot.x), new Vector3f(1,0,0), modelMatrix, modelMatrix);

FloatBuffer modelBuff = BufferUtils.createFloatBuffer(9);
modelMatrix.store(modelBuff);

int loc = ARBShaderObjects.glGetUniformLocationARB(e.m.shader.programID, "modelMatrix");
ARBShaderObjects.glUniformMatrix4ARB(loc, false, modelBuff);

And then I changed my vertex shader to:

uniform vec3 lightDir;
uniform modelMatrix;
varying vec3 normal;

void main()
{       
    normal = modelMatrix*gl_Normal;
    gl_Position = ftransform();
}

Is this the right way to create and pass transformation matrices? Also, is there another way to rotate the camera without using glRotatef()?

Solution

vec3 lightDir = vec3(1,1,0) is a constant in the fragment shader, while normal is re-transformed by gl_NormalMatrix every time the camera (and therefore the modelview matrix) changes.

To prevent this, make sure lightDir and normal are in the same space: either compute the dot product before transforming normal, or transform lightDir by gl_NormalMatrix as well and then compute the dot product.

On top of that, there is another issue in the fragment shader: normal is not normalized. Linearly interpolating unit vectors across a triangle does not, in general, produce unit vectors.

What you should do is something like:

color*dot(normalize(normal),  transform_to_same_space(lightDir))*inten;
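
As a concrete sketch of the second option (bringing lightDir into the same space as normal), the shader pair could look roughly like this. eyeLightDir is an illustrative name I am introducing here, and lightDir is assumed to be supplied as a uniform from the Java side rather than hard-coded in the fragment shader:

uniform vec3 lightDir;      // light direction in object space, set from Java
varying vec3 normal;
varying vec3 eyeLightDir;   // illustrative name: light direction in eye space

void main()
{
    // Both vectors end up in eye space, so rotating the camera
    // rotates them together and the dot product stays the same.
    normal      = gl_NormalMatrix * gl_Normal;
    eyeLightDir = gl_NormalMatrix * lightDir;
    gl_Position = ftransform();
}

And the matching fragment shader:

varying vec3 normal;
varying vec3 eyeLightDir;

void main()
{
    vec3 color = vec3(1.0, 1.0, 1.0);
    float inten = 1.0;
    // Re-normalize: interpolated unit vectors are generally not unit length.
    float diffuse = max(dot(normalize(normal), normalize(eyeLightDir)), 0.0);
    gl_FragColor = vec4(color * diffuse * inten, 1.0);
}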

Some minor issues in the second part:

You're declaring modelMatrix without a type (it needs to be mat3 or mat4).

You cannot transform a vec3 with a 4x4 matrix. Either pad gl_Normal to a vec4 with 0 in the w component (it is a direction, not a position), or cast the matrix down to a mat3.

You're scaling by Vector3f(1,1,1) and the translation does not affect direction vectors, so modelMatrix is orthonormal here and can be applied to normals directly. In the general case (for example with non-uniform scaling), however, you should transform gl_Normal with the inverse transpose of modelMatrix.
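
Putting those fixes together, the second vertex shader could look roughly like this (a sketch assuming you upload only the rotation part of the model transform as a 3x3 matrix; on the Java side that would mean building the matrix with Matrix3f operations, or taking the upper 3x3 of your Matrix4f, and uploading it with ARBShaderObjects.glUniformMatrix3ARB rather than glUniformMatrix4ARB):

uniform mat3 modelMatrix;   // rotation part of the model transform
varying vec3 normal;

void main()
{
    // Fine while modelMatrix is a pure rotation (orthonormal);
    // with non-uniform scaling use the inverse transpose instead.
    normal = modelMatrix * gl_Normal;
    gl_Position = ftransform();
}

If you would rather keep a full mat4 uniform, the padding approach from above works too: declare uniform mat4 modelMatrix; and write normal = (modelMatrix * vec4(gl_Normal, 0.0)).xyz; so the translation column has no effect on the direction.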

Licensed under: CC-BY-SA with attribution