Description
Hi,
I am working on a simple environment to test camera calibration (eye-to-hand). I used a similar approach to the one in this example:
https://github.com/andyzeng/visual-pushing-grasping/blob/master/calibrate.py
where I simply move the robot to the grid points using the DifferentialIKController.
Basically, I collect chessboard points with the camera (using the RGB and depth images) together with the TCP coordinates, and then run the calibration on them. The camera is set to convention="world".
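
For reference, the collection/estimation step I run is roughly along these lines (a simplified sketch; the intrinsics variables and helper names are placeholders, and the corners come from cv2.findChessboardCorners on the RGB image):

```python
import numpy as np

def backproject(corners_px, depth, fx, fy, cx, cy):
    """Back-project detected chessboard corners to 3D camera-frame points using the depth image."""
    pts = []
    for u, v in corners_px:
        z = depth[int(round(v)), int(round(u))]
        # standard pinhole model: optical frame with x right, y down, z forward
        pts.append([(u - cx) * z / fx, (v - cy) * z / fy, z])
    return np.array(pts)

def estimate_rigid_transform(points_cam, points_world):
    """Kabsch/SVD fit of R, t such that points_world ~= R @ points_cam + t."""
    c_cam = points_cam.mean(axis=0)
    c_wld = points_world.mean(axis=0)
    H = (points_cam - c_cam).T @ (points_world - c_wld)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # correct an improper rotation (reflection)
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_wld - R @ c_cam
    return R, t

# points_cam: Nx3 chessboard corners back-projected from depth (camera frame)
# points_world: Nx3 corresponding points in the robot/world frame (from the TCP poses)
# R, t is then the camera-to-world transform I compare against the simulation values.
```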
My setup is: the robot (UR5) at the origin and a front-view camera with pos=(1.9, 0, 1.2), rot=(0, -0.24, 0, 1).

The problem is that when I compute the camera transformation, the translation comes out almost the same as in the simulation:

camera to world: [ 1.87515023 -0.01142518  1.17738762]

However, the rotation matrix is wrong; it looks like there is some extra rotation or a change of axes:
Reference Rotation Matrix:
[[ 0.89107413 -0.         -0.45385779]
 [ 0.          1.         -0.        ]
 [ 0.45385779  0.          0.89107413]]

Camera rotation (from calibration):
[[ 0.01442082  0.99988884  0.        ]
 [ 0.49407605  0.         -0.86941025]
 [-0.86929908  0.01440935 -0.49407638]]

Camera quaternion: [ 0.61357978  0.60766878 -0.35358656  0.35949277]

which of course gives a completely different orientation (the camera pointing to the sky).
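
For debugging, I compare the two rotations like this (a quick sketch; I am assuming the configured rot is in (x, y, z, w) order here, which may itself be part of the problem):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Camera orientation as set in the sim config (assumed (x, y, z, w) ordering)
q_cfg = np.array([0.0, -0.24, 0.0, 1.0])
R_cfg = R.from_quat(q_cfg / np.linalg.norm(q_cfg)).as_matrix()

# Rotation recovered by the calibration (camera-to-world)
R_calib = np.array([
    [ 0.01442082,  0.99988884,  0.        ],
    [ 0.49407605,  0.        , -0.86941025],
    [-0.86929908,  0.01440935, -0.49407638],
])

# Relative rotation between the configured and the recovered camera frame.
# If this comes out as a constant axis permutation, the mismatch would be a
# camera-frame convention difference rather than a calibration error.
R_offset = R_cfg.T @ R_calib
angle = np.degrees(np.arccos(np.clip((np.trace(R_offset) - 1) / 2, -1.0, 1.0)))
print(np.round(R_offset, 3))
print("angle between frames (deg):", angle)
```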
So I am wondering what I am missing to make it work.
Thank you in advance