Question

I have two photos of a house; the camera was moved only about 1 meter (roughly 3 feet) between the first and second shot. The two photos are therefore very similar, the main difference being a slight change in perspective.

I want to generate a mapping, i.e. a correspondence, between the first photo and the second photo. For each pixel in the first photo, I wish to know where it maps to in the second photo, and the same in the opposite direction.

I assume there is some way to detect similar structures between the photos, and from that I could get a rough estimate of where the pixels moved.

As a second part of this question: how can I solve this problem if some features are hidden or revealed between the shots? For example, if there is a tree between the house and the camera, moving the camera will hide some pixels behind the tree and reveal others.


Solution

You might want to look into the SIFT algorithm.
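As a minimal sketch of that idea, here is how SIFT matching typically looks with OpenCV. The filenames ("house1.jpg", "house2.jpg") and the 0.75 ratio-test threshold are assumptions for illustration, not values from the question; this gives a sparse set of pixel correspondences, not a dense per-pixel mapping.

```python
import cv2

# Load the two nearly identical photos (hypothetical filenames).
img1 = cv2.imread("house1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("house2.jpg", cv2.IMREAD_GRAYSCALE)

# Detect SIFT keypoints and compute their descriptors in both photos.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors and keep only matches that pass Lowe's ratio test,
# which discards ambiguous correspondences.
matcher = cv2.BFMatcher()
raw_matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in raw_matches if m.distance < 0.75 * n.distance]

# Each surviving match is one correspondence: the point
# kp1[m.queryIdx].pt in photo 1 maps to kp2[m.trainIdx].pt in photo 2.
for m in good[:10]:
    print(kp1[m.queryIdx].pt, "->", kp2[m.trainIdx].pt)
```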

OTHER TIPS

I want to note that SIFT and SURF alone are not going to solve this problem: they find an image in another image, i.e. they recognize the location of image A inside image B.

However, when the camera has moved a slight distance, some objects have shifted relative to each other and now overlap. So what is needed is to find which objects have moved relative to each other, and which ones are occluding others; a dense-flow approach with a consistency check, sketched below, is one way to detect such pixels.
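One hedged way to approximate this is dense optical flow with a forward-backward consistency check: pixels whose forward motion is not undone by the backward flow have no reliable match in the other photo and are likely hidden or revealed (e.g. by the tree). The filenames, the Farneback parameters, and the 1.5-pixel threshold below are illustrative assumptions, not prescribed values.

```python
import cv2
import numpy as np

img1 = cv2.imread("house1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("house2.jpg", cv2.IMREAD_GRAYSCALE)

# Dense flow in both directions (Farneback; parameters are typical defaults).
flow_fwd = cv2.calcOpticalFlowFarneback(img1, img2, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
flow_bwd = cv2.calcOpticalFlowFarneback(img2, img1, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

h, w = img1.shape
ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)

# Map each pixel of photo 1 into photo 2 using the forward flow.
map_x = xs + flow_fwd[..., 0]
map_y = ys + flow_fwd[..., 1]

# Sample the backward flow at the mapped positions; for a consistent
# correspondence the round trip should return near the starting pixel.
bwd_at = cv2.remap(flow_bwd, map_x, map_y, cv2.INTER_LINEAR)
round_trip = np.sqrt((flow_fwd[..., 0] + bwd_at[..., 0]) ** 2 +
                     (flow_fwd[..., 1] + bwd_at[..., 1]) ** 2)

# Pixels failing the check (threshold of 1.5 px is an arbitrary choice)
# are likely occluded in one view or revealed in the other.
occluded = round_trip > 1.5
print("fraction of pixels without a reliable correspondence:",
      occluded.mean())
```

Running the check in both directions (swapping the roles of the two photos) distinguishes pixels hidden going from photo 1 to photo 2 from those newly revealed.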
