Hand-Eye Calibration Discrepancy: Seeking Guidance on Aligning Results with Expected Values


I am using two circular objects as a calibration target for profile sensors. I have initial estimates of the circle centers, the distance between the circles, and their respective radii from the sensor library. To improve the accuracy of these results, I am using the camodocal hand-eye calibration library for optimization. However, the outcome deviates significantly from the expected values of [70, 58, 61] mm.

To obtain the base-to-tip, i.e. end-effector (EE), poses I use get_actual_tcp_pose() from the UR robot, with the TCP set at the end flange. As ground truth I use a rough estimate of the sensor-to-EE transform derived from the information above: I obtain the sensor-to-base transformation matrix and use the circle offsets in the image frame, together with the distance between the circles, to construct a camera-to-object transformation.

Camera frame (origin at the image center): x positive right, y positive forward, z positive up

Expected result (mm) = (70, 58, 61)

Sample poses

Camera TF with respect to base:
[[ 0.69707078 -0.71647673 -0.02744854  0.45573  ]
 [-0.70122018 -0.68921554  0.18240667 -0.472279 ]
 [-0.14960809 -0.10790289 -0.98283996  0.550253 ]
 [ 0.          0.          0.          1.       ]]

TipTobase:
[[ 0.69707078 -0.71647673 -0.02744854  0.38573  ]
 [-0.70122018 -0.68921554  0.18240667 -0.522279 ]
 [-0.14960809 -0.10790289 -0.98283996  0.489253 ]
 [ 0.          0.          0.          1.       ]]

CamtoObj:
[[-0.03914843  0.01842608  0.9990635  -0.51285579]
 [-0.99525835 -0.08981271 -0.03734288  0.39422273]
 [ 0.08904052 -0.99578821  0.02185474 -0.57194535]
 [ 0.          0.          0.          1.        ]]
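
For a single view, the hand-eye transform implied by the two robot-side matrices above is inv(TipToBase) @ CamToBase. As a quick consistency check (my own sketch, not part of the calibration pipeline), its translation should be close to the expected sensor-to-EE offset if the pose pairs are consistent with that offset:

import numpy as np

cam_to_base = np.array([
    [ 0.69707078, -0.71647673, -0.02744854,  0.45573 ],
    [-0.70122018, -0.68921554,  0.18240667, -0.472279],
    [-0.14960809, -0.10790289, -0.98283996,  0.550253],
    [ 0.0,         0.0,         0.0,         1.0     ],
])
tip_to_base = np.array([
    [ 0.69707078, -0.71647673, -0.02744854,  0.38573 ],
    [-0.70122018, -0.68921554,  0.18240667, -0.522279],
    [-0.14960809, -0.10790289, -0.98283996,  0.489253],
    [ 0.0,         0.0,         0.0,         1.0     ],
])

# Camera pose expressed in the EE (tip) frame for this single view.
cam_to_tip = np.linalg.inv(tip_to_base) @ cam_to_base
print("camera offset in the EE frame (m):", cam_to_tip[:3, 3])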

import numpy as np
from scipy.spatial.transform import Rotation


def estimate_calibration_matrix(camera_to_base_poses, end_effector_to_base_poses, sphere_parameters):
    print("sphere_parameters", sphere_parameters)
    sphere_parameters = sphere_parameters / 1000  # convert mm to m
    sphere1_offsets = sphere_parameters[0:3]      # circle 1 center in the camera frame
    sphere2_offsets = sphere_parameters[3:6]      # circle 2 center in the camera frame
    sphere1_radius = sphere_parameters[6]
    sphere2_radius = sphere_parameters[7]
    sphere_distance = sphere_parameters[8]        # distance between the circle centers

    camera_to_base = camera_to_base_poses
    offset1 = sphere1_offsets
    offset2 = sphere2_offsets

    # Circle centers expressed in the base frame.
    object_position1 = np.dot(camera_to_base, np.append(offset1, 1))[:3]
    object_position2 = np.dot(camera_to_base, np.append(offset2, 1))[:3]

    object_position1 = object_position1.reshape(1, 3)
    object_position2 = object_position2.reshape(1, 3)

    # Use a Rotation object to obtain the axis-angle directly:
    rotation, _ = Rotation.align_vectors(object_position1, object_position2)
    axis_angle = rotation.as_rotvec()

    object_pose = np.concatenate((object_position1.flatten(), axis_angle))
    # camera_to_object = np.linalg.inv(np.array(object_pose).reshape(4, 4))  # Inverse of object_pose

    # Build the homogeneous transform from the aligned rotation and position.
    object_transform = np.eye(4)
    object_transform[:3, :3] = rotation.as_matrix().T  # rotation
    object_transform[:3, 3] = -np.dot(rotation.as_matrix().T, object_position1.flatten())  # translation

    # camera_to_object = np.linalg.inv(object_transform)
    return object_transform
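
For context, the call per view looks roughly like this; the nine sphere parameters below are placeholder values (in mm), not my real sensor output:

# Illustrative call only; sphere parameter values are placeholders.
sphere_params = np.array([10.0,  5.0, 300.0,   # circle 1 center (x, y, z) in the camera frame, mm
                          60.0,  5.0, 300.0,   # circle 2 center (x, y, z) in the camera frame, mm
                          15.0, 15.0,          # circle radii, mm
                          50.0])               # distance between the centers, mm

cam_to_obj = estimate_calibration_matrix(cam_to_base, tip_to_base, sphere_params)
print("CamtoObj:\n", cam_to_obj)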

EEwrtobase = get_actual_tcp_pose() gives me the EE pose with respect to the base in angle-axis representation. Based on the offsets above, the camera-to-base transform is calculated as follows:

def calculate_camera_transform_with_respect_to_base(end_effector_tcp_pose):
    # Rough camera offset relative to the end effector (m).
    camera_offset = np.array([0.070, 0.050, 0.061])

    camera_transform = np.eye(4)
    rotation = Rotation.from_rotvec(end_effector_tcp_pose[3:])  # UR pose rotation is angle-axis
    camera_transform[:3, :3] = rotation.as_matrix()             # camera assumed aligned with the EE
    camera_transform[:3, 3] = end_effector_tcp_pose[:3] + camera_offset  # offset added in the base frame

    print("Camera Transform with Respect to Base:\n", camera_transform)
    return camera_transform
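
The solver also needs the EE pose itself as a homogeneous matrix; a minimal helper for that conversion (my sketch, assuming the same angle-axis convention and metres) could be:

def tcp_pose_to_matrix(tcp_pose):
    # UR pose vector [x, y, z, rx, ry, rz]: position in metres, rotation as angle-axis.
    tcp_pose = np.asarray(tcp_pose)
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(tcp_pose[3:]).as_matrix()
    T[:3, 3] = tcp_pose[:3]
    return T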

I have tried Park-Martin calibration, dual quaternions, etc. from the reference below (http://math.loyola.edu/~mili/Calibration/index.html), but there is still a significant gap between my results and the expected values.
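
For reference, OpenCV's cv2.calibrateHandEye also implements the Park and Daniilidis (dual-quaternion) methods; a minimal sketch of feeding the collected pose pairs into it (the list names tip_to_base_list and cam_to_obj_list are placeholders for my per-view matrices, assuming OpenCV >= 4.1):

import cv2
import numpy as np

# tip_to_base_list: 4x4 EE (gripper) poses in the base frame, one per view.
# cam_to_obj_list:  4x4 target (object) poses in the camera frame, one per view;
#                   invert them first if they are stored the other way round.
R_gripper2base = [T[:3, :3] for T in tip_to_base_list]
t_gripper2base = [T[:3, 3] for T in tip_to_base_list]
R_target2cam = [T[:3, :3] for T in cam_to_obj_list]
t_target2cam = [T[:3, 3] for T in cam_to_obj_list]

R_cam2gripper, t_cam2gripper = cv2.calibrateHandEye(
    R_gripper2base, t_gripper2base, R_target2cam, t_target2cam,
    method=cv2.CALIB_HAND_EYE_PARK)  # or cv2.CALIB_HAND_EYE_DANIILIDIS for dual quaternions
print("camera-to-EE translation (m):", t_cam2gripper.ravel())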
