Question

I want to set the yaw of a rotation matrix so that an object points at a specific position, using this code:

Vector3 dist = transform().position() - mPlayerTarget;
transform().rotationZ(atan2(dist.x(), dist.y()));

This should produce the right result, but the rotation is inverted: instead of following the target point, the object rotates away from it.

Vector3 dist = transform().position() - mPlayerTarget;
transform().rotationZ(-atan2(dist.x(), dist.y()));

With -atan2, the object follows the target, but it's offset by 90 degrees to the right. The rotationZ implementation looks like this:

float cz = cosf(rotation);
float sz = sinf(rotation);

// In OpenGL's column-major layout this is the standard
// counter-clockwise rotation about +Z:
// column 0 = (cz, sz, 0), column 1 = (-sz, cz, 0), column 2 = (0, 0, 1).
matrix.mMatrix[0] = cz;
matrix.mMatrix[1] = sz;
matrix.mMatrix[2] = 0;

matrix.mMatrix[3] = -sz;
matrix.mMatrix[4] = cz;
matrix.mMatrix[5] = 0;

matrix.mMatrix[6] = 0;
matrix.mMatrix[7] = 0;
matrix.mMatrix[8] = 1;

I'm using OpenGL ES 2.0 on iOS. Something seems fundamentally wrong here; shouldn't the first version be the one producing the right results? All the other transformations seem to work properly. What could be going wrong? I don't know where to look for the error.


Solution

The first thing is atan2: it is defined as atan2(y, x), whereas you have the arguments the other way around. Swapping them mirrors the angle about the 45-degree line, since atan2(x, y) = pi/2 - atan2(y, x) (up to 2*pi wrap-around), which is where both the inversion and the 90-degree offset come from.
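
To see the effect in isolation, here is a minimal standalone check using nothing but the standard library:

#include <cmath>
#include <cstdio>

int main() {
    // Angle of the vector (1, 2), measured counter-clockwise from +x.
    float correct = atan2f(2.0f, 1.0f);  // atan2(y, x), about 1.1071 rad
    float swapped = atan2f(1.0f, 2.0f);  // arguments reversed, about 0.4636 rad
    float halfPi  = 3.14159265f / 2.0f;

    // swapped equals pi/2 - correct: mirrored and offset by 90 degrees.
    printf("correct=%f swapped=%f pi/2-correct=%f\n",
           correct, swapped, halfPi - correct);
    return 0;
}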

Another source of issues is the direction of your dist vector: it goes from the target towards the transform position, i.e. away from where you want the object to look. Try reversing it.
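
Putting both fixes together, a sketch that assumes Vector3 supports subtraction and the x()/y() accessors from your snippets:

// Vector from the object to the target (note the reversed subtraction).
Vector3 toTarget = mPlayerTarget - transform().position();

// atan2 takes (y, x) and returns the counter-clockwise angle from +x.
transform().rotationZ(atan2f(toTarget.y(), toTarget.x()));

This also assumes the model's untransformed forward direction is +x; if it faces along another axis, add a constant offset to the angle.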

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow