2D LWJGL with OpenGL - Enabling depth testing for layering 2D textures?
28-05-2021
Question
What would I need to add to my OpenGL init method to enable depth testing, and how would I actually use it for texture layering?
My guess: extend the far parameter of glOrtho to something more extreme than -1, and of course glEnable depth testing. Then, to use it, I can only assume that I change the third parameter of glVertex to something that isn't 0 to send a texture in front of / behind the other textures.
I tried this, but then the textures don't show at all. I must be missing something.
EDIT: RE: Tim's response
Whenever I made the image's z more extreme than -1, it didn't show; the screen was just black.
void initGL(){
    GL11.glEnable(GL11.GL_TEXTURE_2D);
    GL11.glEnable(GL11.GL_DEPTH_TEST); // depth test enabled
    GL11.glMatrixMode(GL11.GL_PROJECTION);
    GL11.glOrtho(-width/2, width/2, -height/2, height/2, 1, -10); // far changed to -10
    GL11.glMatrixMode(GL11.GL_MODELVIEW);
}
and
void loadBG(int theLoadedOne){
    GL11.glBindTexture(GL11.GL_TEXTURE_2D, theLoadedOne);
    GL11.glBegin(GL11.GL_QUADS);
    GL11.glTexCoord2f(0,0);
    GL11.glVertex3f(-width/2, height/2, -2); // new z value
    GL11.glTexCoord2f(1,0);
    GL11.glVertex3f(width/2, height/2, -2); // new z value
    GL11.glTexCoord2f(1,1);
    GL11.glVertex3f(width/2, -height/2, -2); // new z value
    GL11.glTexCoord2f(0,1);
    GL11.glVertex3f(-width/2, -height/2, -2); // new z value
    GL11.glEnd();
    GL11.glFlush();
}
and
while(!Display.isCloseRequested()){
    GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
    ...
    for(int i=0; i<1; i++){ // don't mind this for loop
        bg.loadThisBG(0); // it's here for reasons
    }
    updateFPS();
    Display.update();
}
Display.destroy();
Solution
Seems like you switched the near and far planes. Have a look at gluOrtho2D: it just calls glOrtho with near = -1 and far = +1, so the z coordinates switch sign (m33 = -2/(far - near) = -1). With the values given above, however, m33 = -2/(-10 - 1) = 2/11 is positive, and the z axis is reversed relative to the standard setup. This results in the quad being viewed from the back.
OpenGL's matrix manipulation functions do not care what you feed them, except when the values would lead to a division by zero.
Assuming there is no modelview transform, so only this one matrix contributes to the projection, here is what I think is happening: the z transform from world to NDC space is z_ndc = 2/11 * z_w - 9/11 (plug near and far into the orthographic matrix and take its third row). Now, z_w = -2, so z_ndc = -13/11. This is outside the NDC range [-1, +1], so the vertex is clipped away.
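The mapping can be checked numerically without a GL context. Here is a minimal sketch (class and method names are mine, not LWJGL's) that evaluates just the third row of the glOrtho matrix, m22 = -2/(far - near) and tz = -(far + near)/(far - near):

```java
public class OrthoZ {
    // Third row of the glOrtho matrix: z_ndc = m22 * z_w + tz, with
    // m22 = -2/(far - near) and tz = -(far + near)/(far - near).
    static double zToNdc(double zw, double near, double far) {
        double m22 = -2.0 / (far - near);
        double tz = -(far + near) / (far - near);
        return m22 * zw + tz;
    }

    public static void main(String[] args) {
        // gluOrtho2D's near = -1, far = +1: z just switches sign.
        System.out.println(zToNdc(0.5, -1, 1));  // ≈ -0.5
        // The question's glOrtho(..., 1, -10): the quad at z = -2 lands
        // outside the NDC range [-1, +1] and is clipped.
        System.out.println(zToNdc(-2, 1, -10));  // ≈ -1.18
        // With near and far swapped, glOrtho(..., -1, 10): inside the range.
        System.out.println(zToNdc(-2, -1, 10));  // ≈ -0.45
    }
}
```

With the planes swapped, the same quad stays inside the view volume, which matches the diagnosis above.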
Note that this clipping against the view volume happens regardless of whether the depth test is enabled. The next suspect would be backface culling...
OTHER TIPS
Provided your context includes a depth buffer (not sure about LWJGL buffer creation...), all you need should be:
- Call glEnable(GL_DEPTH_TEST) during initialization.
- Add the depth buffer bit to glClear: glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
- Define the z coordinate to be between the near and far values of the orthographic matrix.
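The last point can also be checked without a GL context. Here is a small sketch (zVisible is a hypothetical helper of mine, not part of LWJGL) based on the fact that glOrtho(..., near, far) keeps eye-space z between -near and -far:

```java
public class DepthRange {
    // Hypothetical helper: true if eye-space z lies between the planes that
    // glOrtho(..., near, far) keeps. glOrtho maps z = -near to the near plane
    // and z = -far to the far plane, so the visible range is [-far, -near]
    // (in whichever order those two values fall).
    static boolean zVisible(double z, double near, double far) {
        double lo = Math.min(-near, -far);
        double hi = Math.max(-near, -far);
        return lo <= z && z <= hi;
    }

    public static void main(String[] args) {
        // The question's glOrtho(..., 1, -10) keeps z in [-1, +10],
        // so the quad at z = -2 is clipped away.
        System.out.println(zVisible(-2, 1, -10));  // false
        // With near/far swapped, glOrtho(..., -1, 10) keeps z in [-10, +1].
        System.out.println(zVisible(-2, -1, 10));  // true
    }
}
```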