Problem

I have two photos of a house; the camera was moved only about 1 meter (roughly 3 feet) between the two shots. The two photos are therefore very similar, the main difference being a slight change in perspective.

I want to generate a mapping, a correspondence, between the first photo and the second: for each pixel in the first photo, where does it map to in the second photo, and vice versa?

I suspect there is some way to detect similar structures between the photos, and that from these I could get a rough estimate of where each pixel went.

For the second part of this question: how can I solve this when some features are hidden or revealed? For example, suppose there is a tree between the house and the camera. As the camera moves, the tree will hide some pixels of the house and reveal others.


Solution

You might want to look into the SIFT algorithm.
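As a starting point, here is a minimal sketch of SIFT matching with OpenCV (opencv-python 4.4 or later, where SIFT is included). The image file names are placeholders, and the homography fit at the end is an assumption on my part: a single homography is only exact for a planar scene or a purely rotating camera, so with a 1 meter translation treat it as a rough first guess at the pixel mapping.

```python
import cv2
import numpy as np

# Placeholder file names -- substitute your own photos.
img1 = cv2.imread("photo1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("photo2.jpg", cv2.IMREAD_GRAYSCALE)

# Detect SIFT keypoints and compute descriptors in both photos.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors, keeping only matches that pass Lowe's ratio test.
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]

# Fit a homography with RANSAC to reject outlier matches. This is a
# planar approximation of the true mapping between the two views.
src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Map a pixel (x, y) from the first photo into the second.
pt = cv2.perspectiveTransform(np.float32([[[100.0, 200.0]]]), H)
print(pt)
```

For the mapping in the opposite direction, invert H with np.linalg.inv(H), or refit with the source and destination points swapped.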

Other tips

I want to note that SIFT and SURF alone are not going to solve this problem: they find one image inside another, i.e., they recognize the location of image A within image B.

However, when the camera has moved a short distance, some objects have shifted relative to one another and now overlap. What is needed is to determine which objects have moved relative to each other, and which ones now occlude others; a sketch of one possible approach follows.
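One common way to get a dense, per-pixel correspondence and to flag hidden or revealed pixels (my suggestion, not something the tip above prescribes) is dense optical flow with a forward-backward consistency check: compute the flow in both directions, and mark pixels where the round trip does not return near the start point. A sketch using OpenCV's Farneback flow follows; the file names and the 1-pixel threshold are assumptions.

```python
import cv2
import numpy as np

img1 = cv2.imread("photo1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("photo2.jpg", cv2.IMREAD_GRAYSCALE)

# Dense flow in both directions: flow_fw maps photo 1 -> photo 2,
# flow_bw maps photo 2 -> photo 1.
params = dict(pyr_scale=0.5, levels=3, winsize=15, iterations=3,
              poly_n=5, poly_sigma=1.2, flags=0)
flow_fw = cv2.calcOpticalFlowFarneback(img1, img2, None, **params)
flow_bw = cv2.calcOpticalFlowFarneback(img2, img1, None, **params)

h, w = img1.shape
ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)

# Where each pixel of photo 1 lands in photo 2.
map_x = xs + flow_fw[..., 0]
map_y = ys + flow_fw[..., 1]

# Sample the backward flow at those landing points.
bw_at_fw = cv2.remap(flow_bw, map_x, map_y, cv2.INTER_LINEAR)

# Forward-backward error: going there and back should return to the
# start. A large error suggests the pixel was occluded or revealed.
err = np.linalg.norm(flow_fw + bw_at_fw, axis=2)
occluded = err > 1.0  # threshold in pixels, an arbitrary assumption
```

The occluded mask marks exactly the pixels the question's second part asks about: points visible in one photo but hidden behind the tree (or another object) in the other.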

License: CC-BY-SA with attribution