Question

I want to calculate the angle between two triangles in 3D space. The two triangles will always share exactly two points, e.g.:

Triangle 1:

Point1 (x1, y1, z1),  
Point2 (x2, y2, z2),   
Point3 (x3, y3, z3).  

Triangle 2:

Point1 (x1, y1, z1),  
Point2 (x2, y2, z2),  
Point4 (x4, y4, z4).

Is there a way to calculate the angle between them efficiently in CUDA?


Solution

For each plane, you need to construct its normal vector (perpendicular to all lines in that plane). The simplest way to do that is to take the cross product of two non-parallel edges of the triangle, e.g. (P3 − P1) × (P2 − P1) and (P4 − P1) × (P2 − P1).

Normalize those.

The dot product of those two direction vectors gives you the cosine of the angle.
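As a concrete illustration, here is a minimal CUDA sketch of those three steps (cross products, normalization, dot product). The float3 helpers (f3_sub, f3_cross, f3_dot) and the packed array layout in the example kernel are illustrative assumptions, not part of any particular API:

```cuda
#include <cuda_runtime.h>

// Hypothetical float3 helpers -- CUDA's float3 has no built-in operators,
// so we define the few we need. The names are illustrative only.
__device__ __forceinline__ float3 f3_sub(float3 a, float3 b) {
    return make_float3(a.x - b.x, a.y - b.y, a.z - b.z);
}
__device__ __forceinline__ float3 f3_cross(float3 a, float3 b) {
    return make_float3(a.y * b.z - a.z * b.y,
                       a.z * b.x - a.x * b.z,
                       a.x * b.y - a.y * b.x);
}
__device__ __forceinline__ float f3_dot(float3 a, float3 b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Cosine of the angle between the planes of triangles (P1,P2,P3) and (P1,P2,P4).
// Note: no degenerate-triangle handling yet -- see below.
__device__ float triangle_angle_cos(float3 p1, float3 p2, float3 p3, float3 p4) {
    float3 n1 = f3_cross(f3_sub(p3, p1), f3_sub(p2, p1)); // normal of triangle 1
    float3 n2 = f3_cross(f3_sub(p4, p1), f3_sub(p2, p1)); // normal of triangle 2
    float len1 = sqrtf(f3_dot(n1, n1));                   // lengths, for normalization
    float len2 = sqrtf(f3_dot(n2, n2));
    return f3_dot(n1, n2) / (len1 * len2);                // acosf() of this is the angle
}

// Example kernel: one thread per triangle pair. The packed layout
// (P1,P2,P3,P4 per pair) is just an assumption for illustration.
__global__ void pair_angles(const float3 *pts, float *cos_out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        cos_out[i] = triangle_angle_cos(pts[4*i], pts[4*i+1],
                                        pts[4*i+2], pts[4*i+3]);
}
```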

The tricky bit is to watch out for degenerate triangles! If the 3 points defining either triangle are collinear (that triangle is just a line segment), then the angle you're asking for is undefined: the cross product comes out as the zero vector, and normalizing it divides by zero. You need to decide what you're going to do in that case.

Since you're trying to do this on a GPU, you'll ideally want to write this function without any branches if you're concerned about efficiency. That means that instead of testing for degenerate triangles with an if statement, you should try to do it with a ternary expression A ? B : C, as in the sketch below.
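For example, reusing the float3 helpers from the sketch above, a branchless version might look like this. The epsilon threshold and the sentinel value returned for degenerate triangles are arbitrary policy choices you would tune for your application:

```cuda
// Branchless variant: clamping the squared normal lengths away from zero
// keeps the division finite, and the ternary picks a sentinel result for
// degenerate (collinear) inputs. The epsilon and the sentinel (here 1.0f,
// i.e. "angle 0") are policy choices, not canonical.
__device__ float triangle_angle_cos_safe(float3 p1, float3 p2,
                                         float3 p3, float3 p4) {
    const float eps = 1e-12f;
    float3 n1 = f3_cross(f3_sub(p3, p1), f3_sub(p2, p1));
    float3 n2 = f3_cross(f3_sub(p4, p1), f3_sub(p2, p1));
    float q1 = f3_dot(n1, n1);          // squared normal lengths
    float q2 = f3_dot(n2, n2);
    float c  = f3_dot(n1, n2) * rsqrtf(fmaxf(q1, eps) * fmaxf(q2, eps));
    // On simple operands like these the ternary typically compiles to a
    // predicated select rather than a divergent branch.
    return (q1 < eps || q2 < eps) ? 1.0f : c;
}
```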

OTHER TIPS

The angle between the triangles is the same as the angle between the planes defined by the three points of each triangle.

Since both Point 1 and Point 2 lie in both planes, figure out the direction cosines from one of those points to Point 3, and then to Point 4. Then the cosine of the angle between these two lines is just the sum of the products of the corresponding direction cosines.
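As a sketch (again reusing the hypothetical f3_sub/f3_dot helpers from the solution above): normalizing P1→P3 and P1→P4 turns their components into direction cosines, and the "sum of products of corresponding direction cosines" is exactly the dot product of the two unit vectors:

```cuda
// Direction-cosine approach from this tip. Dividing each vector by its
// length makes its components the direction cosines; the cosine of the
// angle between the lines is then the dot product of the unit vectors.
__device__ float line_angle_cos(float3 p1, float3 p3, float3 p4) {
    float3 u = f3_sub(p3, p1);
    float3 v = f3_sub(p4, p1);
    float inv_lu = rsqrtf(f3_dot(u, u)); // reciprocal square roots are cheap on GPUs
    float inv_lv = rsqrtf(f3_dot(v, v));
    return f3_dot(u, v) * inv_lu * inv_lv;
}
```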
