Different representation orders in a 4x4 transformation matrix for extrinsic calibration of two sensors

I obtained a transformation matrix that represents the rotation and translation from one coordinate system to another (I'm trying to fuse lidar and camera data). My two sensors are mounted upright and point in roughly the same direction, so the three coordinate axes of each sensor should point in roughly the same directions. However, during fusion the lidar points ended up rotated by exactly 45 degrees when projected on top of the camera image. I put my matrix into a visualisation tool and noticed that the two sets of axes (x, y, z) are not pointing in the same direction. Why is this?

My matrix is:

0.1369   -0.9898   0.0400    0.0541
0.0236   -0.0371   -0.9990   0.1163
0.9903   0.1377    0.0183    0.0519
0.0000   0.0000    0.0000    1.0000

When I place it in this matrix converter tool (by koide3), the visualisation shows that the axes of the two sensors are not pointing in the same direction:

https://staff.aist.go.jp/k.koide/workspace/matrix_converter/matrix_converter.html

I then pressed the Camera => Robot button until the two sets of x, y, z axes visually pointed in the same direction.

The following matrix is the result:

0.98980   -0.04000  0.13690   0.05410   
0.03710   0.99900   0.02360   0.11630   
-0.13770  -0.01830  0.99030   0.05190   
0.00000   0.00000   0.00000   1.00000 

Passing this matrix to the fusion application gives very accurate fusion. However, I'm still unsure what caused the problem with my original matrix. Isn't extrinsic calibration supposed to find the correct orientation between the two sensors' coordinate systems? When I inspect the two matrices, the values are all the same; only their order within the 3x3 rotation block has changed. It doesn't look like a calibration issue: the rotation is off by exactly 45 degrees, and once the order in the matrix is changed the result is very precise. I thought it might be because some applications store their matrices in row-major versus column-major order, but that would only be a transpose of the original matrix, which doesn't seem to be the case here. If anyone could provide some insight, I would greatly appreciate it!
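To make the relationship between the two matrices concrete, here is a small numpy check (my own sketch, nothing from either tool). It shows that the second matrix is exactly the first one right-multiplied by the fixed axis remap between the camera optical convention (x right, y down, z forward) and the robot/body convention (x forward, y left, z up). I'm assuming that remap is what the Camera => Robot button applies, but I haven't confirmed it.

import numpy as np

# Extrinsic matrix from calibration (the first matrix above).
T1 = np.array([
    [0.1369, -0.9898,  0.0400, 0.0541],
    [0.0236, -0.0371, -0.9990, 0.1163],
    [0.9903,  0.1377,  0.0183, 0.0519],
    [0.0000,  0.0000,  0.0000, 1.0000],
])

# Matrix obtained after pressing Camera => Robot (the second matrix above).
T2 = np.array([
    [ 0.9898, -0.0400, 0.1369, 0.0541],
    [ 0.0371,  0.9990, 0.0236, 0.1163],
    [-0.1377, -0.0183, 0.9903, 0.0519],
    [ 0.0000,  0.0000, 0.0000, 1.0000],
])

# Axis remap: columns are the camera optical axes (x right, y down, z forward)
# expressed in robot/body axes (x forward, y left, z up).
P = np.array([
    [ 0.0,  0.0, 1.0, 0.0],
    [-1.0,  0.0, 0.0, 0.0],
    [ 0.0, -1.0, 0.0, 0.0],
    [ 0.0,  0.0, 0.0, 1.0],
])

print(np.allclose(T1 @ P, T2, atol=1e-3))  # True: same values, columns reordered/negated
print(np.allclose(T1.T, T2, atol=1e-3))    # False: so it is not a transpose

So the difference between the two matrices is a 90-degree axis permutation applied on one side of the transform, not a row-major versus column-major (transpose) issue.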

Another matrix visualisation tool I used, which may give a better picture (I visualised with the 'bunny' model): https://harry7557558.github.io/tools/matrixv.html

I tried two different calibration tools:

  1. cam_lidar_calibration ROS package by acfr
  2. MATLAB’s Lidar Camera Calibrator toolbox

The fusion package I'm using is the lidar_camera_fusion ROS package by EPVelasco. It expects the extrinsic parameters to use the camera's coordinate system as the reference frame, i.e. to convert lidar points into the camera's coordinate system.
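To be explicit about what I mean by that convention, this is how I understand the extrinsic matrix gets used (a minimal numpy sketch of my understanding, not code from the package; the point coordinates are made up):

import numpy as np

# Extrinsic matrix: maps homogeneous lidar coordinates into the camera frame.
T_cam_lidar = np.array([
    [0.1369, -0.9898,  0.0400, 0.0541],
    [0.0236, -0.0371, -0.9990, 0.1163],
    [0.9903,  0.1377,  0.0183, 0.0519],
    [0.0000,  0.0000,  0.0000, 1.0000],
])

# A made-up lidar point (x, y, z, 1) in metres, for illustration only.
p_lidar = np.array([5.0, 0.2, -0.3, 1.0])

# Same point expressed in the camera's coordinate system; the camera
# intrinsics would then project it onto the image plane.
p_cam = T_cam_lidar @ p_lidar
print(p_cam[:3])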

For (2), MATLAB provided the extrinsic parameters (transformation matrix) already in the camera's coordinate system. I tried it in the fusion package and saw the lidar points projected onto the camera image rotated by 45 degrees.

For (1), the tool provided the extrinsic parameters in the lidar's coordinate system. I inverted the transformation matrix to convert it to the camera's coordinate system, tried it in the fusion package, and encountered the same issue. Just like with (2), visualising the transformation matrix with the tool showed that the two sets of axes are not aligned in the right orientation.
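By "inverted the transformation matrix" I mean the usual closed-form inverse of a rigid transform, roughly like this (a sketch; I only round-trip it against the matrix posted above, since that is the one I can share):

import numpy as np

def invert_rigid_transform(T):
    # [R t; 0 1]^-1 = [R^T  -R^T t; 0 1]
    R, t = T[:3, :3], T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T
    T_inv[:3, 3] = -R.T @ t
    return T_inv

# Round-trip check using the matrix posted above (tolerance because the
# printed values are rounded to four decimals).
T = np.array([
    [0.1369, -0.9898,  0.0400, 0.0541],
    [0.0236, -0.0371, -0.9990, 0.1163],
    [0.9903,  0.1377,  0.0183, 0.0519],
    [0.0000,  0.0000,  0.0000, 1.0000],
])
print(np.allclose(invert_rigid_transform(T) @ T, np.eye(4), atol=1e-3))  # True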

In both cases, with the help of the matrix converter tool by koide3, I used the Camera => Robot function to change the orientation of one of the axis sets until the two sets were visually aligned, and the resulting matrix gave good fusion. All values in the matrices are the same even after changing the orientation of the axes; only the order of the rotation part of the matrix is different.

Both calibration tools (1) and (2) provide a visualisation of the lidar points projected onto the camera image after calibration and extraction of the extrinsic parameters (transformation matrix), and in both cases those projections looked good. I also don't suspect a problem with the fusion package, because visualising the two sensors' orientations (the directions of their x, y, z axes) in a matrix visualisation tool showed the orientation looking roughly 90 degrees off in both cases. My lidar and camera are both mounted upright, with the camera pointing towards the front of the lidar, so it was no surprise that fusion was good once I pointed the x, y, z axes of each in the same direction using Camera => Robot in the matrix converter tool.
