ZED2 camera calibration: stereo camera, IMU, and joint calibration

To obtain the parameters required in the VINS configuration file, and to make the actual output of the stereo camera and the IMU more accurate, the ZED2 camera was calibrated: camera calibration, IMU calibration, and joint camera-IMU calibration.

1. Install calibration tools

1. Use the kalibr tool to calibrate the ZED2 stereo camera.
2. Use imu_utils to calibrate the IMU; install and compile code_utils first and then imu_utils.

These tools were installed earlier; for the detailed process, refer to the blog: https://blog.csdn.net/xiaojinger_123/article/details/120849737?spm=1001.2014.3001.5501

However, since the system has changed from Ubuntu 18.04 to 20.04, pay attention to dependency issues and adjust the packages to the corresponding ROS/Ubuntu version.

1 Download and compile kalibr

sudo apt update
sudo apt-get install python3-setuptools python3-rosinstall ipython3 libeigen3-dev libboost-all-dev doxygen libopencv-dev ros-noetic-vision-opencv ros-noetic-image-transport-plugins ros-noetic-cmake-modules python3-software-properties software-properties-common libpoco-dev python3-matplotlib python3-scipy python3-git python3-pip libtbb-dev libblas-dev liblapack-dev libv4l-dev python3-catkin-tools python3-igraph libsuitesparse-dev 
pip3 install wxPython
sudo pip3 install python-igraph --upgrade
mkdir ~/kalibr_ws/src
cd ~/kalibr_ws/src
git clone --recursive https://github.com/ori-drs/kalibr

cd ~/kalibr_ws
source /opt/ros/noetic/setup.bash
catkin init
catkin config --extend /opt/ros/noetic
catkin config --merge-devel # Necessary for catkin_tools >= 0.4.
catkin config --cmake-args -DCMAKE_BUILD_TYPE=Release

catkin build -DCMAKE_BUILD_TYPE=Release -j4
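
After the build finishes, source the workspace overlay in each terminal you use for calibration so that rosrun can find the kalibr tools:

source ~/kalibr_ws/devel/setup.bash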

2 Download and compile code_utils, imu_utils
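
The detailed steps are in the blog linked above; a rough sketch is shown below (the workspace name is arbitrary, and code_utils must be built before imu_utils because imu_utils depends on it):

mkdir -p ~/imu_calib_ws/src
cd ~/imu_calib_ws/src
git clone https://github.com/gaowenliang/code_utils.git
cd ~/imu_calib_ws && catkin_make
cd ~/imu_calib_ws/src
git clone https://github.com/gaowenliang/imu_utils.git
cd ~/imu_calib_ws && catkin_make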
1) Error: Could not find a package configuration file provided by "Ceres"
Solution: Install ceres https://blog.csdn.net/weixin_48083022/article/details/118282363

2) ceres compile error: 'integer_sequence' is not a member of 'std'.
When compiling some projects that use Ceres, the error 'integer_sequence' is not a member of 'std' is reported. This happens because newer versions of Ceres require a newer C++ standard than the project requests.

In the CMakeLists.txt of the project reporting the error, change

set(CMAKE_CXX_FLAGS "-std=c++11")

to

set(CMAKE_CXX_STANDARD 14)

Reference for other issues: https://github.com/gaowenliang/imu_utils/issues/32

2. Select a calibration board

The most commonly used calibration targets are the checkerboard and the aprilgrid. (The camera should be 1-2 m from the target, and the target should occupy more than 60% of the field of view.) Because every tag on an aprilgrid carries an ID, the detector knows which corner is which, which prevents jumps in the computed pose, so the aprilgrid is recommended for calibration.

Note: during calibration the target should not leave the camera's field of view; start and stop the motion smoothly, and try to move the target into every corner of the field of view.

checkerboard:
targetCols and targetRows count the number of interior corner points

target_type: 'checkerboard' #gridtype
targetCols: 6               #number of internal chessboard corners
targetRows: 8               #number of internal chessboard corners
rowSpacingMeters: 0.17      #size of one chessboard square [m]
colSpacingMeters: 0.17      #size of one chessboard square [m]

aprilgrid:
tagSpacing = (gap between two tags) / (tag side length, tagSize)

target_type: 'aprilgrid' #gridtype
tagCols: 6               #number of apriltags
tagRows: 6               #number of apriltags
tagSize: 0.088           #size of apriltag, edge to edge [m]
tagSpacing: 0.3          #ratio of space between tags to tagSize

Calibration board download: https://github.com/ethz-asl/kalibr/wiki/downloads#calibration-targets

3. ZED2 calibration data recording

The current ZED2 resolution can be found in common.yaml in the ZED2_WS/src/zed-ros-wrapper/zed_wrapper/params folder. Here resolution is set to 3, which is VGA mode, so the actual image size is 672*376.
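
For reference, the relevant entry in common.yaml looks roughly like this (the exact parameter name and layout depend on the zed-ros-wrapper version):

general:
    resolution: 3    # 0: HD2K, 1: HD1080, 2: HD720, 3: VGA (672x376)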

Start the ZED2 ROS node:

roscore
roslaunch zed_wrapper zed2.launch

List the topics and open the image viewers, and make sure the calibration target is visible in both the left and right images:

rostopic list
rosrun image_view image_view image:=/zed2/zed_node/left/image_rect_color
rosrun image_view image_view image:=/zed2/zed_node/right/image_rect_color

1) Error: Command 'rosrun' not found
Solution: sudo apt install ros-noetic-rosbash

2) Error: [rospack] Error: package 'image_view' not found
Solution: sudo apt-get install ros-noetic-image-view

Because the ZED camera normally publishes images at about 60 Hz, and kalibr works best when the frame rate is not too high (4 Hz is recommended; too many images make the optimization very slow), the original topics have to be republished at a lower rate: the images are throttled to 4 Hz (some blogs use 20 Hz, which also works but takes longer to process) and the IMU data to 200 Hz. In ROS this is done by subscribing to a topic and republishing it at the reduced rate.

Here is the command pattern for lowering the publishing rate:

rosrun topic_tools throttle messages old_topic 4.0 new_topic   # kalibr's recommended frame rate is 4 Hz

Another point: the ZED node publishes several topics derived from the original image, such as /zed2/zed_node/left/image_rect_color. This rectified topic can also be used; the resolution is unchanged, but the original image has been cropped and stretched at the edges during rectification, which may lose some information.

Reduce the image frequency and view the current frequency:

rosrun topic_tools throttle messages /zed2/zed_node/imu/data_raw  200 /zed2/zed_node/imu/data_raw2
rosrun topic_tools throttle messages /zed2/zed_node/left/image_rect_color 4.0 /zed2/zed_node/left/image_rect_color2
rosrun topic_tools throttle messages /zed2/zed_node/right/image_rect_color 4.0 /zed2/zed_node/right/image_rect_color2

rostopic hz /zed2/zed_node/left/image_rect_color2
rostopic hz /zed2/zed_node/right/image_rect_color2

Start recording the calibration bag file; see the YouTube video https://youtu.be/puNXsnrYWTY?t=57 for reference.
Note: during recording, keep the calibration board entirely inside the image, keep the images sharp, avoid violent motion, and try to excite the IMU about all axes and in all directions.

rosbag record -O Kalib_data_vga.bag /zed2/zed_node/imu/data_raw2 /zed2/zed_node/left/image_rect_color2 /zed2/zed_node/right/image_rect_color2

After the recording is completed, a Kalib_data_vga.bag file will be obtained.

View logged data
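For example, rosbag info lists the topics, message counts, and duration of the recording:

rosbag info Kalib_data_vga.bag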

4. Start camera calibration

The aprilgrid calibration board is used here. My parameters in the corresponding april.yaml (derived from april_6x6_80x80cm.yaml, apparently printed at a reduced size) are below; note that tagSpacing is a ratio, so for this board the gap between tags is 0.285714 × 0.021 m ≈ 0.006 m:

target_type: 'aprilgrid' #gridtype
tagCols: 6               #number of apriltags
tagRows: 6               #number of apriltags
tagSize: 0.021           #size of apriltag, edge to edge [m]
tagSpacing: 0.285714          #ratio of space between tags to tagSize

Perform the calibration in the kalibr folder; april.yaml is the parameter file downloaded together with the calibration board.

Monocular calibration:

source ~/kalibr_ws/devel/setup.bash
rosrun kalibr kalibr_calibrate_cameras --bag Kalib_data_vga.bag --topics /zed2/zed_node/left/image_rect_color2 --models pinhole-radtan --target april.yaml 

When running the monocular calibration (and the monocular + IMU joint calibration), the following error is reported:

Cameras are not connected through mutual observations, please check the dataset. Maybe adjust the approx. sync. tolerance.
Traceback (most recent call last):
  File "/home/ipsg/tool/kalibr_ws/devel/bin/kalibr_calibrate_cameras", line 15, in <module>
    exec(compile(fh.read(), python_script, 'exec'), context)
  File "/home/ipsg/tool/kalibr_ws/src/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 447, in <module>
    main()
  File "/home/ipsg/tool/kalibr_ws/src/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 204, in main
    graph.plotGraph()
  File "/home/ipsg/tool/kalibr_ws/src/aslam_offline_calibration/kalibr/python/kalibr_camera_calibration/MulticamGraph.py", line 311, in plotGraph
    edge_label=self.G.es["weight"],
KeyError: 'Attribute does not exist'

Solution:

Find the following code in src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras under the workspace, and comment it out (around line 201):

if not graph.isGraphConnected():
    obsdb.printTable()
    print("Cameras are not connected through mutual observations, please check the dataset. Maybe adjust the approx. sync. tolerance.")
    graph.plotGraph()
    sys.exit(-1)

After commenting it out, the calibration result can be obtained.

Stereo calibration:

source ~/kalibr_ws/devel/setup.bash
rosrun kalibr kalibr_calibrate_cameras --bag Kalib_data_vga.bag --topics /zed2/zed_node/left/image_rect_color2 /zed2/zed_node/right/image_rect_color2 --models pinhole-radtan pinhole-radtan --target april.yaml 

You can add --show-extraction to visualize the corner detection while the calibration runs and check whether it is good (in my case it revealed serious corner reprojection errors). --approx-sync 0.04 controls the approximate time synchronization of the camera topics; the 0.04 can be raised to 0.1 depending on the situation.
The --bag-from-to parameter removes the beginning and end of the dataset if the camera was being picked up or put down there; those motions have some impact on the calibration, so add it only if your bag contains them. A sketch with these options added is shown below.
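
For example, the stereo calibration above could be run as follows (the --bag-from-to start/end times in seconds are illustrative; choose them to skip the pick-up and put-down motion in your own bag):

rosrun kalibr kalibr_calibrate_cameras --bag Kalib_data_vga.bag --topics /zed2/zed_node/left/image_rect_color2 /zed2/zed_node/right/image_rect_color2 --models pinhole-radtan pinhole-radtan --target april.yaml --approx-sync 0.04 --show-extraction --bag-from-to 5 90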

Error 1):

[FATAL] [1636711641.843028]: No corners could be extracted for camera /zed2/zed_node/left/image_rect_color2! Check the calibration target configuration and dataset.
Traceback (most recent call last):
  File "/home/sjj/kalibr_workspace/devel/lib/kalibr/kalibr_calibrate_cameras", line 15, in <module>
    exec(compile(fh.read(), python_script, 'exec'), context)
  File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 447, in <module>
    main()
  File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 185, in main
    if not cam.initGeometryFromObservations(observations):
  File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_camera_calibration/CameraCalibrator.py", line 56, in initGeometryFromObservations
    success = self.geometry.initializeIntrinsics(observations)
RuntimeError: [Exception] /home/sjj/kalibr_workspace/src/kalibr/aslam_cv/aslam_cameras/include/aslam/cameras/implementation/PinholeProjection.hpp:716: initializeIntrinsics() assert(observations.size() != 0) failed: Need min. one observation

This problem is suspected to be caused by the calibration board being too small for the camera resolution. After switching to a larger board, the corner information is detected successfully.

Error 2):
During the calibration process, if kalibr reports that the initial focal length cannot be obtained, set export KALIBR_MANUAL_FOCAL_LENGTH_INIT=1 and run the program again. When the automatic initialization fails it will print "Initialization of focal length failed. Provide manual initialization:" and wait for input; enter a focal length manually, for example 400. Convergence can also be reached by giving a relatively large value.
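
For example (same stereo calibration command as above, with the environment variable set first):

export KALIBR_MANUAL_FOCAL_LENGTH_INIT=1
rosrun kalibr kalibr_calibrate_cameras --bag Kalib_data_vga.bag --topics /zed2/zed_node/left/image_rect_color2 /zed2/zed_node/right/image_rect_color2 --models pinhole-radtan pinhole-radtan --target april.yaml

When it prints "Initialization of focal length failed. Provide manual initialization:", enter an approximate focal length in pixels, e.g. 400.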

After the calibration is completed, three files are generated in the directory:

.yaml  mainly used later for the joint IMU + camera calibration
.pdf   shows the calibration results graphically
.txt   contains the camera intrinsics and the reprojection errors

Attachment: introduction to the fields of the .yaml file

camera_model         camera projection type (pinhole / omni)
T_cam_imu            IMU extrinsics: transformation from IMU to camera coordinates (T_c_i)
intrinsics           camera intrinsics: vector containing the intrinsic parameters for the given projection type; elements are as follows:
                     pinhole: [fu fv pu pv]
                     omni: [xi fu fv pu pv]
distortion_model     lens distortion type (radtan / equidistant)
distortion_coeffs    distortion parameters: parameter vector for the distortion model
T_cn_cnm1            relative pose of the left and right cameras: camera extrinsic transformation, always with respect to the last camera in the chain
                     (e.g. cam1: T_cn_cnm1 = T_c1_c0, takes cam0 to cam1 coordinates)
timeshift_cam_imu    timeshift between camera and IMU timestamps in seconds (t_imu = t_cam + shift)
rostopic             topic of the camera's image stream
resolution           camera resolution [width, height]
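
For orientation, a camera-only camchain yaml written by kalibr_calibrate_cameras is structured roughly like the sketch below; the numbers are placeholders rather than real calibration results, and T_cam_imu / timeshift_cam_imu only appear in the camchain-imucam file produced by the later joint calibration.

cam0:
  camera_model: pinhole
  intrinsics: [fu, fv, pu, pv]          # replaced by the calibrated values
  distortion_model: radtan
  distortion_coeffs: [k1, k2, p1, p2]
  rostopic: /zed2/zed_node/left/image_rect_color2
  resolution: [672, 376]
cam1:
  camera_model: pinhole
  intrinsics: [fu, fv, pu, pv]
  distortion_model: radtan
  distortion_coeffs: [k1, k2, p1, p2]
  T_cn_cnm1:                            # pose of cam0 in cam1 (T_c1_c0), 4x4 homogeneous matrix
  - [1.0, 0.0, 0.0, -0.12]              # placeholder values
  - [0.0, 1.0, 0.0, 0.0]
  - [0.0, 0.0, 1.0, 0.0]
  - [0.0, 0.0, 0.0, 1.0]
  rostopic: /zed2/zed_node/right/image_rect_color2
  resolution: [672, 376]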

Sanity checks on the calibration result:

The distortion_coeffs values should be small, even if not exactly zero
The last value of the first row of the T_cn_cnm1 matrix should be close to the negative of the stereo baseline in meters (the ZED2's baseline is 12 cm, so expect a value near -0.12; the often-quoted -0.06 corresponds to the smaller-baseline ZED Mini)
The last value of the second row of the T_cn_cnm1 matrix should be near zero
The last value of the third row of the T_cn_cnm1 matrix should be near zero
The values on the diagonal of the T_cn_cnm1 matrix should be very close to 1.0
The remaining values of the T_cn_cnm1 matrix should be near zero
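
To eyeball these numbers quickly without opening the PDF report, you can print the extrinsics block straight from the generated camchain file (the file name follows the bag name):

grep -A 5 "T_cn_cnm1" camchain-Kalib_data_vga.yaml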

5. IMU parameter calibration

The IMU parameters can be obtained in two ways:
1) Record static IMU data (more than 2 hours), modify the imu_utils launch file, run imu_utils to obtain the calibration result file, and create the corresponding imu-params.yaml from it (fill in the averaged Acc and Gyr values from the result file); a rough sketch of this workflow is shown after this list.
2) Instead of your own calibrated IMU parameters, you can also directly use the parameters provided on the official website; it has been verified that the system also runs with those.
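
A rough sketch of the first workflow, for reference (the launch file name here is an assumption; create it as described in the imu_utils blog linked earlier, pointing it at the ZED2 IMU topic):

rosbag record -O zed2_imu_static.bag /zed2/zed_node/imu/data_raw
roslaunch imu_utils zed2_imu.launch
rosbag play -r 200 zed2_imu_static.bag

Record with the camera completely still for a bit more than two hours; playing the bag back at 200x speed lets imu_utils process the long recording quickly.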

Here the second method is used to fill in imu-params.yaml; the official parameters are reasonably trustworthy, and if the later experimental results are poor you can recalibrate manually. Reference: https://blog.csdn.net/sinat_16643223/article/details/115416277?spm=1001.2014.3001.5506

Create imu-params.yaml

gedit imu-params.yaml

And fill in the following content:

#Accelerometers
accelerometer_noise_density: 1.4e-03   #Noise density (continuous-time)
accelerometer_random_walk:   8.0e-05   #Bias random walk
 
#Gyroscopes
gyroscope_noise_density:     8.6e-05   #Noise density (continuous-time)
gyroscope_random_walk:       2.2e-06   #Bias random walk
 
rostopic:                    /zed2/zed_node/imu/data_raw2      #the IMU ROS topic
update_rate:                 200.0     #Hz (for discretization of the values above)

6. Camera-IMU joint calibration

Joint calibration mainly obtains the transformation between the camera and IMU coordinate frames:

rosrun kalibr kalibr_calibrate_imu_camera --bag Kalib_data_vga.bag --cam camchain-Kalib_data_vga.yaml --imu imu-params.yaml --target april.yaml

Error reported:

[ERROR] [1637115378.140707]: Optimization failed!
Traceback (most recent call last):
  File "/home/sjj/kalibr_workspace/devel/lib/kalibr/kalibr_calibrate_imu_camera", line 15, in <module>
    exec(compile(fh.read(), python_script, 'exec'), context)
  File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_imu_camera", line 246, in <module>
    main()
  File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_imu_camera", line 209, in main
    iCal.optimize(maxIterations=parsed.max_iter, recoverCov=parsed.recover_cov)
  File "/home/sjj/kalibr_workspace/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_imu_camera_calibration/icc_calibrator.py", line 179, in optimize
    raise RuntimeError("Optimization failed!")
RuntimeError: Optimization failed!

Even with this error, kalibr reads the corresponding IMU data and camera data successfully, so the data itself is most likely fine. I finally found a solution in kalibr's issues.
If you run into problems while using kalibr, search the project's issue tracker; most problems you encounter have generally already been answered there.

Solution:
Open the kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_imu_camera script and search for timeOffsetPadding. Its value was 0.03 by default; increase it. I first raised it to 0.3, which did not work, then to 3, which made the optimization so slow that it appeared stuck, so lower it again and tune it for your own data. (PS: do not make it too large; I set it to 100 at first and immediately ran out of memory.)
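
Depending on the kalibr version, the same padding can often be set from the command line instead of editing the script; if your build exposes a --timeoffset-padding option (check rosrun kalibr kalibr_calibrate_imu_camera --help), the call would look something like:

rosrun kalibr kalibr_calibrate_imu_camera --bag Kalib_data_vga.bag --cam camchain-Kalib_data_vga.yaml --imu imu-params.yaml --target april.yaml --timeoffset-padding 0.3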

Re-running the calibration then produces the joint calibration result file camchain-imucam-Kalib_data_vga.yaml and a complete PDF report.
