Question

When using OpenCV's findHomography function to estimate a homography between two sets of points from different images, you will sometimes get a bad homography due to outliers in your input points, even when you use RANSAC or LMEDS.

// opencv java example:
Mat H = Calib3d.findHomography( src_points, dst_points, Calib3d.RANSAC, 10 );

How can you tell if the resulting 3x3 homography matrix is acceptable or not?

I have looked for an answer here on Stack Overflow and on Google but was unable to find one.

I found this article, but it is a bit cryptic to me:

"The geometric error for homographies"


Solution

The best way to tell whether the homography is acceptable is:

1- Take the points of one image and reproject them using the computed homography.

//for one 2D point (px, py), expressed in homogeneous coordinates:
[x'; y'; w'] = H * [px; py; 1];
//then divide by w' to get back to pixel coordinates
px' = x' / w';
py' = y' / w';
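The projection step above can be sketched in plain Java (no OpenCV dependency), with the 3x3 homography stored row-major as a double[9]; the class and method names here are illustrative:

```java
public class HomographyProject {
    // Maps the point (x, y) through the homography H (row-major 3x3).
    // Returns {x', y'} after the homogeneous divide.
    static double[] project(double[] H, double x, double y) {
        double xp = H[0] * x + H[1] * y + H[2];
        double yp = H[3] * x + H[4] * y + H[5];
        double w  = H[6] * x + H[7] * y + H[8];
        return new double[] { xp / w, yp / w };
    }

    public static void main(String[] args) {
        // A homography that scales by 2 and translates by (1, 2).
        double[] H = { 2, 0, 1,  0, 2, 2,  0, 0, 1 };
        double[] p = project(H, 3.0, 4.0);
        System.out.println(p[0] + " " + p[1]); // prints "7.0 10.0"
    }
}
```

In the real pipeline you would read H out of the Mat returned by findHomography (e.g. with Mat.get) instead of hard-coding it.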

2- Calculate the Euclidean distance between the reprojected points and the corresponding real points in the image.

Reprojection error for one point, where p is the projected point and q is the real point:

error(p, q) = sqrt((px - qx)^2 + (py - qy)^2)

3- Establish a threshold that decides whether the reprojection error is acceptable.

For example, an error greater than one pixel wouldn't be acceptable for many tracking applications.
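Putting the three steps together, here is a hedged sketch in plain Java: project every source point through H, measure the Euclidean distance to its matched destination point, and accept the homography only if the mean error stays below a chosen pixel threshold. The class name, the sample points, and the 1-pixel threshold are illustrative assumptions:

```java
public class HomographyCheck {
    // Mean Euclidean reprojection error of src points mapped by H
    // (row-major 3x3) against their matched dst points.
    static double meanReprojectionError(double[] H,
                                        double[][] src, double[][] dst) {
        double sum = 0.0;
        for (int i = 0; i < src.length; i++) {
            double x = src[i][0], y = src[i][1];
            double w  = H[6] * x + H[7] * y + H[8];
            double xp = (H[0] * x + H[1] * y + H[2]) / w;
            double yp = (H[3] * x + H[4] * y + H[5]) / w;
            double dx = xp - dst[i][0], dy = yp - dst[i][1];
            sum += Math.sqrt(dx * dx + dy * dy); // per-point error
        }
        return sum / src.length;
    }

    public static void main(String[] args) {
        // A pure translation by (5, 5) and matches that agree with it.
        double[] H = { 1, 0, 5,  0, 1, 5,  0, 0, 1 };
        double[][] src = { {0, 0}, {10, 0}, {0, 10}, {10, 10} };
        double[][] dst = { {5, 5}, {15, 5}, {5, 15}, {15, 15} };
        double err = meanReprojectionError(H, src, dst);
        double threshold = 1.0; // e.g. 1 px for tracking applications
        System.out.println(err <= threshold ? "acceptable" : "rejected");
    }
}
```

Depending on the application you may prefer the maximum error over the mean, or restrict the check to the inliers reported in findHomography's output mask.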

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow