Weekend study summary (LIO calibration + KITTI dataset + evo tool + open-source datasets + SSH)

LIO calibration

The most commonly used open-source calibration packages are lidar_IMU_calib, open-sourced by Zhejiang University, and lidar_align, open-sourced by the Autonomous Systems Lab at ETH Zurich. It is said online that the latter cannot calibrate pure IMU + LiDAR, because accurate odometry data cannot be obtained from IMU integration alone. I haven't looked at the code in detail yet; I will update this after reading it.

lidar_IMU_calib

github code
paper

The installation process can be found here

The first thing you need to understand is what is calibration?
Refer here for the answer

1. LiDAR extrinsic parameters

Generally, manufacturers have already done this calibration, so you can leave it alone for the time being.

2. IMU intrinsic parameters

According to the IMU measurement model:

$$a_m = R_G^I\,(a - g) + b_a(t) + n_a$$

$$\omega_m = \omega + b_g(t) + n_g$$

The left-hand side of each equation is the measured value; $a$ and $\omega$ are the true values, and $g$ is the acceleration of gravity (you can see that a stationary IMU outputs +1 g on the z-axis when that axis points up, and the output is normalized, i.e. expressed in multiples of g). $b_a$ and $b_g$ are random-walk biases that drift over time, and $n_a$, $n_g$ are Gaussian white measurement noise. After the IMU starts up, the rotation matrix $R_G^I$ relating the world/global/Earth frame to the IMU needs to be determined, and the two biases need to be estimated in real time.
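To make the model concrete, here is a minimal numerical sketch of these two equations for a stationary, level IMU with its z-axis up. The bias and noise magnitudes are made-up illustrative values, not taken from any real datasheet:

```python
import numpy as np

rng = np.random.default_rng(0)

# True state: stationary IMU, z-axis up, accelerations in units of g
a_true = np.zeros(3)            # no linear acceleration
w_true = np.zeros(3)            # no rotation
g = np.array([0.0, 0.0, -1.0])  # gravity, normalized to 1 g
R_GI = np.eye(3)                # world-to-IMU rotation (identity here)

# Illustrative bias / noise magnitudes (not real datasheet values)
b_a = np.array([0.01, -0.02, 0.005])
b_g = np.array([0.001, 0.002, -0.001])
n_a = 0.002 * rng.standard_normal(3)
n_g = 0.0005 * rng.standard_normal(3)

# Measurement model: a_m = R_GI (a - g) + b_a + n_a,  w_m = w + b_g + n_g
a_m = R_GI @ (a_true - g) + b_a + n_a
w_m = w_true + b_g + n_g

print(a_m)  # z component near +1 g, as the text says
```

Note that even at rest the outputs are not exactly $[0,0,1]$ and $[0,0,0]$: the biases and noise are what the calibration has to estimate.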

3. Transformation between LiDAR and IMU

The transformation between LiDAR and IMU includes two parts: the 6-DoF rigid-body transformation, and the time offset caused by sensor delays. First, the time offset. Since the sensors are not synchronized when sampling, and since there is a delay from each sensor to the system, there is a time offset between the two signal streams, which affects the accuracy of interpolating measurement values. As shown in the figure, $t_a$ and $t_b$ are the delays of the IMU and the camera respectively; the resulting difference is $t_d$, so $t_d$ must be added to a camera sample's timestamp to obtain its moment in the IMU time frame (papers generally use IMU time as the system reference time).
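As a sketch of why $t_d$ matters, the following shifts hypothetical camera timestamps into the IMU time frame and then interpolates the IMU stream at the corrected times. The signal shapes, sample rates, and the 12 ms offset are all made-up illustrative values:

```python
import numpy as np

# Hypothetical IMU stream (timestamps in seconds) and one gyro axis
t_imu = np.arange(0.0, 1.0, 0.005)        # 200 Hz
w_imu = np.sin(2 * np.pi * t_imu)         # fake gyro signal

# Camera timestamps and the (assumed known) time offset t_d
t_cam = np.arange(0.0, 0.9, 0.05)         # 20 Hz
t_d = 0.012                               # 12 ms, illustrative only

# Shift camera stamps into the IMU time frame, then interpolate
t_cam_in_imu = t_cam + t_d
w_at_cam = np.interp(t_cam_in_imu, t_imu, w_imu)
```

Interpolating at `t_cam` directly (without adding `t_d`) would pair each camera frame with gyro values from the wrong instant, which is exactly the error the temporal calibration removes.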

(figure: timing diagram showing the sensor delays $t_a$, $t_b$ and the resulting offset $t_d$)
The rigid-body transformation, that is, the transformation from LiDAR to IMU. In papers it is common, for convenience, to take the IMU frame as the robot body frame and transform the LiDAR point cloud into the IMU frame. Therefore the rigid-body transformation from LiDAR to IMU must be obtained, i.e. the parameters $R, t$.
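Applying such an extrinsic calibration is just $p_I = R\,p_L + t$ per point; a minimal sketch with made-up values for $R$ and $t$:

```python
import numpy as np

# Hypothetical LiDAR-to-IMU extrinsics (illustrative values only)
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])   # 90-degree yaw
t = np.array([0.10, 0.0, -0.05])   # lever arm in meters

# A few LiDAR points (N x 3); apply p_I = R p_L + t to each row
points_lidar = np.array([[1.0, 0.0, 0.0],
                         [0.0, 2.0, 0.5]])
points_imu = points_lidar @ R.T + t
```

This is the transformation the calibration packages estimate; once $R, t$ are known, every scan can be expressed in the IMU/body frame before being fed to a LIO pipeline.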

Since I am using an RS-16 LiDAR and this code only supports the VLP-16, I need to make some adjustments myself.
Someone online has already modified it for the RS-32; refer here.

For the compilation process and code understanding, you can take a look here

lidar_IMU_calib dataset

The data obtained from calibration can be used to run some LIO algorithms, such as LeGO-LOAM. The program with IMU data added can be found here. For problems encountered when running LeGO-LOAM, please refer here.
Note: rosbag play --clock *.bag

lidar_align

GitHub code link

It is said that lidar_align is not suitable for calibrating pure IMU + LiDAR: accurate odometry data cannot be obtained from pure IMU integration.
For lidar_align compilation and some errors encountered, see here, as well as "LiDAR and Odom extrinsic calibration: lidar_align code learning".

Ubuntu 16.04: building a map from a self-recorded dataset with the lio_sam algorithm

Code explanation can be found here

Use of the KITTI dataset

For converting the .bin files into a rosbag, there is corresponding code; please refer here. The accompanying README has detailed steps.

There is also a way to publish the .bin files directly as PointCloud topics from a program, so there is no need to generate a dedicated rosbag that takes up space. That author modified it based on the program above; please refer here for details.
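For reference, a KITTI Velodyne .bin file is a flat array of float32 records (x, y, z, intensity), so reading one yourself is a one-liner. The file name below is a made-up demo, not a real KITTI path:

```python
import numpy as np

def load_kitti_bin(path):
    """Load a KITTI Velodyne .bin scan: float32 records of x, y, z, intensity."""
    return np.fromfile(path, dtype=np.float32).reshape(-1, 4)

# Demo with a tiny synthetic scan (real files live under velodyne_points/data/)
demo = np.array([[1.0, 2.0, 3.0, 0.5],
                 [-4.0, 0.0, 1.5, 0.9]], dtype=np.float32)
demo.tofile("demo_scan.bin")

scan = load_kitti_bin("demo_scan.bin")  # N x 4 array of points
```

Both of the conversion programs mentioned above are essentially doing this read, then wrapping the result in a sensor_msgs/PointCloud2 message.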

Introductions and explanations of the KITTI dataset are available here, here, and here, but I found them of limited use. You can also check the website; I think the official site is clearer and provides several tools. Just a working knowledge of the dataset is enough. Specific usage examples are given here.

Open source dataset

1. KITTI dataset (RGB + LiDAR + GPS + IMU) (scenes: urban, rural, highway)
2. ASL EuRoC dataset (stereo RGB + IMU) (scenes: a micro aerial vehicle in two different rooms and a large industrial environment)
3. TUM VI benchmark (fisheye + IMU) (should be indoors)

Reference: SLAM Dataset
SLAM Related Dataset Survey
SLAM Learning – Open Source Test Data Collection

How to use the evo tool

evo is an evaluation tool that can associate trajectories by timestamp, scale and align trajectories of different scales to a reference trajectory you specify, and compute metrics such as root-mean-square error to evaluate the performance of a SLAM algorithm.
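This is not evo's implementation, but as a sketch of what the translational error metric means, assuming the two trajectories are already timestamp-associated and aligned (evo handles both steps for you):

```python
import numpy as np

def translational_rmse(traj_est, traj_ref):
    """RMSE of per-pose translational error between two N x 3 position arrays,
    assuming they are already timestamp-associated and aligned."""
    err = np.linalg.norm(traj_est - traj_ref, axis=1)
    return np.sqrt(np.mean(err ** 2))

# Toy example: the estimate drifts 0.1 m in x at every pose
ref = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
est = ref + np.array([0.1, 0.0, 0.0])
print(translational_rmse(est, ref))  # ~0.1
```

evo additionally reports mean, median, and standard deviation of the same per-pose errors, which is why the statistics it prints all agree in units.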

It is recommended to use quick installation for evo installation.

pip install evo --upgrade --no-binary evo

I tried to install from source but gave up; maybe my skills just weren't up to it. Either the Python version was insufficient or the version of the rosbags library was insufficient. After nearly a day of struggling I still couldn't get it working, so given time constraints I chose the quick installation. For installation methods, please refer to "evaluation tool evo installation and use", as well as "problems encountered and solutions and evo use", and "evo installation and use".

I also encountered some problems with the quick installation. It seemed a library was missing, which I solved with a Baidu search: "ModuleNotFoundError: No module named '_tkinter' solution".

These are mainly used when learning LeGO-LOAM. For evo usage, you can also refer to "how to use the evo tool to evaluate the results of running LeGO-LOAM on the KITTI dataset", as well as "PCL gadget 2, which may be useful: using KITTI's GT (ground truth) to create a laser point cloud map".

Use of SSH

I tried many online tutorials for using SSH from VSCode, but none of them worked; I could never connect until I found this one: "Environment setup for VSCode connecting to Ubuntu via SSH". It solves connecting VSCode on Ubuntu to the Raspberry Pi's Ubuntu, but it still could not solve connecting my VSCode on Windows to Ubuntu on the Raspberry Pi. The problem I hit when connecting: I entered the Raspberry Pi's password three times, and each time the response was "Permission denied"; after three attempts the connection failed, until I referred to "VSCode ssh-remote plug-in: solved, using incorrect password". If that doesn't solve it, "no response after entering the password remotely via VSCode SSH, keeps prompting for the password" should.

Updated on 2022/9/25

Recently I have found it useful to additionally install the Xshell software to manage SSH. The specific method is here: "Xshell connects to the host via SSH".

Updated on 2023/7/11
When using the Raspberry Pi yourself, you first need to copy the id_rsa.pub from under the C drive on Windows into the .ssh folder on the Raspberry Pi. The id_rsa should be

If an "XHR failed" error occurs when connecting for the first time, it is because VSCode needs internet access: it automatically downloads something onto the Raspberry Pi in order to connect. If there is no internet connection, the download fails, so this error is reported.

Origin blog.csdn.net/weixin_41756645/article/details/126673894