Estimating relative rotation between two uncalibrated cameras from the fundamental matrix


I have two images of the same object that were taken from different positions with an uncalibrated camera, and I want to treat this image pair as a stereo pair. I can then estimate a fundamental matrix without any knowledge of the scene. What I want to know is the rotation of the second camera relative to the first.

  1. pts1 and pts2 are arrays of matching key points obtained with one of the well-known feature descriptor methods (a sketch of how I get them is below the list). RANSAC is used to exclude outliers:

F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 3, 0.99)

  2. p1fNew and p2fNew are the inlier key points. With the stereoRectifyUncalibrated method I can obtain two homographies that rectify the images into the same plane:

retBool, H1, H2 = cv2.stereoRectifyUncalibrated(p1fNew, p2fNew, F, image_size)
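
For reference, this is roughly how I obtain pts1 and pts2. ORB with a brute-force matcher is only an example here (any feature descriptor would do), and the file names are placeholders:

import cv2
import numpy as np

# Example only: detect and match features with ORB + brute-force Hamming matching
img1 = cv2.imread("view1.png", cv2.IMREAD_GRAYSCALE)  # placeholder file names
img2 = cv2.imread("view2.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Matched pixel coordinates in each image
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])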

By multiplying these matrices I can estimate the transformation from one image plane to the other:

T = np.linalg.inv(np.mat(H1)) * np.mat(H2)
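
Just to show how I check the rectification (the warp below is only for visualization; img1, img2, H1 and H2 come from the steps above, and image_size is (width, height) of the images):

import cv2
import numpy as np

# Assumes img1/img2 and H1/H2 from the steps above
h, w = img1.shape[:2]
rect1 = cv2.warpPerspective(img1, H1, (w, h))  # image 1 warped into the rectified plane
rect2 = cv2.warpPerspective(img2, H2, (w, h))  # image 2 warped into the rectified plane

# The same product as above, written with @ instead of np.mat
T = np.linalg.inv(H1) @ H2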

How do I decompose this T matrix to extract the rotation and the direction of the translation from the first camera center to the second?
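
For context, I understand that with known intrinsics K one could go through the essential matrix and cv2.recoverPose instead (minimal sketch below, with a guessed K that I don't actually have), so what I am really asking is whether an equivalent rotation and direction can be pulled out of T or F without calibration:

import cv2
import numpy as np

# Sketch only: K is guessed (focal length and principal point are made up),
# pts1/pts2 are the inlier matches, F is the fundamental matrix from above.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

E = K.T @ F @ K                                   # essential matrix from F and K
retval, R, t, pose_mask = cv2.recoverPose(E, pts1, pts2, K)

# Rotation angle (degrees) of R and the translation direction (up to scale)
angle = np.degrees(np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)))
print(angle, t.ravel())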
