Hand-eye calibration: the 9-point calibration process and calculation

In industrial applications, a camera is often mounted on the robot arm so that vision can quickly guide the robot to its work position. The purpose of 9-point calibration is to convert image coordinates into robot coordinates.
What distinguishes one calibration file from another: whether the image coordinate system is matched to the robot coordinate system, and whether the single-pixel accuracy matches. Single-pixel accuracy is affected by the height difference between the camera and the measured object; if that height difference stays the same, the calibration file does not need to be changed.
The meaning of the 9-point calibration:
The calibrated area should cover more than 2/3 of the camera's field of view, so that lens distortion across the field of view is captured by the calibration and can be detected in time.
1. Calculate the pixel equivalent. From the 9-point calibration we obtain the conversion from pixel values to robot movement values. After calibration, the pixel equivalent is a fixed value (mm per pixel).
2. Align the image coordinate system with the robot coordinate system so that the two point in the same directions.
Why call it the same-direction coordinate system?
During calibration, the image coordinates must be placed in one-to-one correspondence with the robot coordinates. The image coordinate system usually takes the upper-left corner of the image as the origin (0, 0); the positive X axis points to the right of the image and the positive Y axis points down. The robot's coordinate system, on the other hand, is often oriented differently and is not fixed, and the robot is what actually does the work. Since the camera must guide the robot, even when the robot coordinate system differs from the image coordinate system, 9-point calibration links the two and computes the transform that brings them into the same-direction state.
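As a rough illustration of the pixel equivalent: the 20 mm robot step matches the procedure below, while the 400-pixel image displacement is a made-up value standing in for a real measurement.

```python
# Pixel equivalent: how many millimetres one pixel corresponds to.
# The robot step (20 mm) matches the calibration procedure; the measured
# pixel displacement (400 px) is a hypothetical value for illustration.
robot_step_mm = 20.0      # commanded robot movement between two points
pixel_step = 400.0        # measured displacement of the mark in the image
pixel_equivalent = robot_step_mm / pixel_step   # mm per pixel
print(pixel_equivalent)   # 0.05 mm/px
```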
Our calibration steps:
1. Start hand-eye calibration after installing the equipment.
2. Determine the coordinate system for the calculation. In actual production, the coordinate system used for vision calculations varies; select the working coordinate system according to the actual situation, and make sure teaching is done in that same coordinate system. I use the robot's Base0 coordinate system directly, because Base0 is the robot's fixed coordinate system: its origin lies on the robot base, and changes to it depend only on the robot's installation error.
3. Move the manipulator to the camera height chosen in the plan, adjust the camera's focus, open the aperture fully, and adjust the exposure time in the camera control software so that the object is clearly visible in the field of view with as little interference as possible.
4. Taking a KUKA robot as an example, select the robot's global coordinate system, i.e. the Base0 (base) coordinate system. Move the robot so that the camera's center is aligned with the detected object (or the center of the calibration target), and record the current robot coordinates X, Y, Z, A, etc.
5. Set the robot's step distance according to the actual situation. Usually all 9 points must lie within the image, and the area enclosed by the 9 points should cover about 2/3 of the field of view. Here I use 20 mm as the offset. The center of the field of view is taken as point 4, with robot coordinates (0, 0). The corresponding coordinates are then:
point 0: (-20, -20)   point 1: (-20, 0)   point 2: (-20, 20)
point 3: (0, 20)      point 4: (0, 0)     point 5: (0, -20)
point 6: (20, -20)    point 7: (20, 0)    point 8: (20, 20)
For example, point 0 has robot coordinates (-20, -20), so on the robot we move the X axis by -20 and the Y axis by -20, changing nothing else. The other points are handled the same way.
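The nine offsets can be generated mechanically. A quick sketch (plain Python, not part of the Halcon program) that produces them in the same serpentine order as the Ax/Ay tuples used later:

```python
# Generate the 9 robot offsets (mm) around the centre point, 20 mm apart.
# Every second column is reversed (serpentine order) so the robot path
# between consecutive points stays short.
step = 20
points = []
for i, x in enumerate((-step, 0, step)):
    ys = [-step, 0, step]
    if i % 2 == 1:
        ys.reverse()
    points += [(x, y) for y in ys]

print(points[4])   # point 4 is the centre of the field of view: (0, 0)
```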
6. Then generate the calibration file and verify it.
Reference Halcon program:

* The 9 known points in coordinate system A (robot coordinates)
Ax:=[-20,-20,-20,0,0,0,20,20,20]
Ay:=[-20,0,20,20,0,-20,-20,0,20]

* The B (image) coordinate points to be identified,
* in one-to-one correspondence with the A points above
Bx:=[]
By:=[]
dev_get_window (WindowHandle)
* Image Acquisition 01: Code generated by Image Acquisition 01
list_files ('E:/762halcon/标定', ['files','follow_links'], ImageFiles)
tuple_regexp_select (ImageFiles, ['\\.(tif|tiff|gif|bmp|jpg|jpeg|jp2|png|pcx|pgm|ppm|pbm|xwd|ima|hobj)$','ignore_case'], ImageFiles)
for Index := 0 to |ImageFiles| - 1 by 1
    read_image (Image, ImageFiles[Index])
    * Image Acquisition 01: Do something
    dev_display (Image)
    * The following finds the 9 A points in coordinate system B
    draw_rectangle1 (WindowHandle, Row1, Column1, Row2, Column2)
    gen_rectangle1 (Rectangle, Row1, Column1, Row2, Column2)
    reduce_domain (Image, Rectangle, ImageReduced)
    threshold (ImageReduced, Regions, 0, 211)   
    * The point (Row, Column) in coordinate system B has been found
    area_center (Regions, Area, Row, Column)
    Bx:=[Bx,Column]
    By:=[By,Row]
endfor
* Compute the target transformation matrix HomMat2D
vector_to_hom_mat2d (Bx, By, Ax, Ay, HomMat2D)
 
* Save the transformation matrix
serialize_hom_mat2d (HomMat2D, SerializedItemHandle)
open_file ('my_vector.mat', 'output_binary', FileHandle) 
fwrite_serialized_item (FileHandle, SerializedItemHandle) 
close_file (FileHandle)
 
stop ()
* Read the transformation matrix back and test it
open_file ('my_vector.mat', 'input_binary', FileHandle) 
fread_serialized_item (FileHandle, SerializedItemHandle) 
deserialize_hom_mat2d (SerializedItemHandle, HomMat2D_9p) 
close_file (FileHandle)
tx:=20
ty:=30
affine_trans_point_2d (HomMat2D_9p, tx, ty, Qx, Qy)
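For reference, the mathematics behind vector_to_hom_mat2d and affine_trans_point_2d can be reproduced with a least-squares fit. The sketch below uses NumPy and synthetic pixel coordinates (a made-up 0.05 mm/px scale and image offset) in place of the real area_center measurements:

```python
import numpy as np

def fit_affine(bx, by, ax, ay):
    """Least-squares 2x3 affine transform mapping (bx, by) -> (ax, ay),
    analogous to Halcon's vector_to_hom_mat2d."""
    M = np.column_stack([bx, by, np.ones(len(bx))])
    px, *_ = np.linalg.lstsq(M, ax, rcond=None)   # row producing the X output
    py, *_ = np.linalg.lstsq(M, ay, rcond=None)   # row producing the Y output
    return np.vstack([px, py])                    # shape (2, 3)

def apply_affine(H, x, y):
    """Analogue of affine_trans_point_2d."""
    return H[0] @ [x, y, 1.0], H[1] @ [x, y, 1.0]

# Robot coordinates of the 9 points (mm), same values as the Halcon tuples.
Ax = np.array([-20, -20, -20, 0, 0, 0, 20, 20, 20], dtype=float)
Ay = np.array([-20, 0, 20, 20, 0, -20, -20, 0, 20], dtype=float)

# Synthetic pixel coordinates: 0.05 mm/px with an image-origin offset,
# standing in for the real Bx/By gathered by area_center.
Bx = Ax / 0.05 + 800.0
By = Ay / 0.05 + 600.0

H = fit_affine(Bx, By, Ax, Ay)
qx, qy = apply_affine(H, 800.0, 600.0)   # image centre -> robot (0, 0)
print(round(qx, 6), round(qy, 6))
```

With real data the 9 measured blob centers simply replace the synthetic Bx/By, and any rotation or scale between camera and robot is absorbed into H.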

7. The calibrated coordinate system must also match how the program uses coordinates. If the coordinates in the program are the taught coordinates, then the coordinate system we calibrate needs to be opposite in direction to the robot coordinate system.
Reference KUKA robot calibration move program:
LIN_REL {X -15}
LIN_REL {Y -15}  ; the first point
LIN_REL {X 15}
LIN_REL {X 15}
LIN_REL {Y 15}
LIN_REL {X -15}
LIN_REL {X -15}
LIN_REL {Y 15}
LIN_REL {X 15}
LIN_REL {X 15}   ; the last point
LIN_REL {X -15}
LIN_REL {Y -15}  ; back to the origin
After this calibration, the image coordinate system is opposite to the robot coordinate system, which suits programs where the current coordinates are the taught coordinates.

Calibration problem:

1. The images taken as the robot moves appear slanted.
As long as the robot moves along its own coordinate system, it does not matter whether the line connecting the 9 image points is slanted; accuracy is unaffected, because the calibration transform absorbs the rotation.
2. There is an error in the calibration result.
This is a calibration coordinate system problem. If you calibrate in Base0, the error in the result is the robot's installation error, and it is recommended to reinstall the robot base correctly. If you calibrate in another robot coordinate system, the error arises because the robot's Z axis is actually transformed: the Z value shows no change in the current coordinate system, but the actual Z axis under Base0 does change. Switch the coordinate system to Base0 for calibration.
3. Why is the robot coordinate of point 4 (the center of the field of view) set to (0, 0) during calibration?
The advantage of choosing (0, 0) for the center of the field of view is that, when teaching, the point at the center of the field of view can be recorded directly as the teaching point. The mapped offset can then simply be added to the taught coordinates to obtain the target in the current coordinate system, which simplifies the calculation.
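A minimal sketch of that bookkeeping (all numbers hypothetical):

```python
# Because point 4 (field-of-view centre) was calibrated as (0, 0), the
# offset produced by the 9-point transform can be added directly to the
# taught robot position. All values below are hypothetical.
teach_x, teach_y = 512.30, -108.70   # taught robot position (mm)
offset_x, offset_y = 3.25, -1.10     # offset from the calibration transform
target_x = teach_x + offset_x
target_y = teach_y + offset_y
print(target_x, target_y)
```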

Reference calibration pictures (images omitted).

Source: blog.csdn.net/m0_51559565/article/details/128695107