Question

I want to calculate the screen x and y coordinates of a vertex for clipping. The shader already does this, and I just want to do the same thing in my C code. My shader is

uniform mat4 u_MvpMatrix;
attribute vec4 a_Position;
void main(){
    gl_Position = u_MvpMatrix * a_Position;
}

gl_Position.x and gl_Position.y are the screen.x and screen.y coordinates of the vertex, I assume?

Here is my C code, where model_view_projection_matrix is the same matrix that I pass to the shader as u_MvpMatrix.

vec4 screen_coords;
vec4 vertex = {0,0,0,1};
mat4x4_mul_vec4(screen_coords, model_view_projection_matrix, vertex);

static inline void mat4x4_mul_vec4(vec4 r, mat4x4 M, vec4 v)
{
    int i, j;
    for(j=0; j<4; ++j) {
        r[j] = 0.f;
        for(i=0; i<4; ++i)
            r[j] += M[i][j] * v[i];
    }
}

But this gives me a vector screen_coords where screen_coords.x and screen_coords.y are independent of the model's Z coordinate: screen_coords.x depends only on the model's x position, and screen_coords.y behaves analogously. Meanwhile the seemingly useless screen_coords.z and screen_coords.w do depend on the model's Z coordinate.

Am I using the wrong multiplication function, or is something else wrong?


Solution

gl_Position.x and gl_Position.y are the screen.x and screen.y coordinates of the vertex, I assume?

No, they are in clip coordinates. To get from clip space to screen space you have to

  1. perform the perspective division (homogenization) p_NDC = p_clip / p_clip.w, after which the coordinates are in NDC space;
  2. apply the viewport transform p_window = (p_NDC + 1)/2 * viewport.{width, height} + viewport.{x, y}.

Another pitfall people often trip over is that OpenGL's matrix element ordering is column major, which looks like (but is not) transposed standard notation. I see you're using code from my linmath.h library, in which I took care of that: calculations are done column major, so if you fetch a mat4x4 from OpenGL or pass one to it, the elements are already in the right order.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow