Reading classic literature: LIW-OAM (LiDAR-IMU-encoder fusion SLAM)

0. Introduction

We have often seen LiDAR and inertial measurement units (IMUs) fused for their complementary information. In practice, however, severe bumps can corrupt the IMU measurements. Moreover, the widely used Iterative Closest Point (ICP) algorithm provides constraints only on the pose, while velocity is constrained only through IMU preintegration, so the velocity estimate tends to be dragged along by the pose result. A recent paper, "LIW-OAM: Lidar-Inertial-Wheel Odometry and Mapping", addresses this by fusing LiDAR, IMU and wheel-encoder measurements. It proposes an accurate and robust LiDAR-inertial-wheel odometry and mapping system in which measurements from all three sensors are integrated in a bundle adjustment (BA) based optimization framework. The wheel encoder contributes velocity measurements as an important additional observation, which helps the system make more accurate state predictions; constraining the velocity variable with wheel-encoder observations during optimization further improves the accuracy of state estimation. Experimental results on two public datasets show that the system outperforms state-of-the-art LI-OAM systems in terms of absolute trajectory error (ATE), and that embedding wheel encoders significantly improves the performance of BA-based LI-OAM. The project is open source on GitHub.

1. Article contribution

In this paper, we use an additional wheel-encoder sensor, whose cost is much lower than that of an AHRS. Experimental results on the public datasets NCLT [3] and KAIST [7] show that our system outperforms existing state-of-the-art LI-OAM systems in terms of absolute trajectory error (ATE). There are two main contributions:

  1. We propose a novel BA-based LIW-OAM system that embeds the wheel encoder velocity observations into BA-based LI optimization. Our LIW-OAM system outperforms most state-of-the-art LI-OAM systems in terms of accuracy.

  2. We have released the source code of this work to facilitate the development of the community.


2. Preliminary knowledge

2.1 Coordinate system

We use $(·)^w$, $(·)^l$, $(·)^o$ and $(·)^k$ to denote a three-dimensional point in the world, LiDAR, IMU and odometer (i.e., wheel encoder) coordinate systems, respectively. The world frame coincides with $(·)^l$ at the starting position. In all frames, the x-axis points forward, the y-axis points to the left, and the z-axis points upward. We use $l_i$ to denote the LiDAR frame of the $i$-th scan taken at time $t_i$, and $o_i$ the corresponding IMU frame at $t_i$. The transformation (i.e., extrinsics) from the LiDAR frame $l_i$ to the IMU frame $o_i$ is then written $T^{o_i}_{l_i} ∈ SE(3)$:

$$T^{o_i}_{l_i} = \begin{bmatrix} R^{o_i}_{l_i} & t^{o_i}_{l_i} \\ \mathbf{0} & 1 \end{bmatrix}$$

where $T^{o_i}_{l_i}$ consists of the rotation matrix $R^{o_i}_{l_i} ∈ SO(3)$ and the translation vector $t^{o_i}_{l_i} ∈ \mathbb{R}^3$. The extrinsic parameters are usually calibrated offline and remain constant during online pose estimation, so we can simply write $T^o_l$ for $T^{o_i}_{l_i}$. Similarly, the transformation from the odometer frame to the IMU frame is denoted $T^o_k$, which consists of $R^o_k$ and $t^o_k$.

We represent rotations using both a rotation matrix $R$ and a Hamilton quaternion $q$. In state vectors we mostly use quaternions, but rotation matrices are also convenient for rotating 3D vectors. $⊗$ denotes the multiplication of two quaternions. Finally, we use $\hat{(·)}$ to denote a noisy measurement or an estimate of a quantity. In addition to the pose, we also estimate the velocity $v$, the accelerometer bias $b_a$ and the gyroscope bias $b_ω$; all of these are collected into a single state vector:

$$x = \left[ t^T,\; q^T,\; v^T,\; b_a^T,\; b_ω^T \right]^T$$
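As a quick illustration of the homogeneous-transform convention above, the following sketch (Python with NumPy; the numeric extrinsics are made up for illustration, not taken from the paper) assembles a $T^o_l$ from a rotation and a translation and maps a LiDAR point into the IMU frame:

```python
import numpy as np

def se3_matrix(R, t):
    """Assemble a 4x4 homogeneous transform in SE(3) from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical extrinsics: LiDAR rotated 180 deg about z and mounted 0.1 m ahead of the IMU.
R_ol = np.array([[-1.0,  0.0, 0.0],
                 [ 0.0, -1.0, 0.0],
                 [ 0.0,  0.0, 1.0]])
t_ol = np.array([0.1, 0.0, 0.0])
T_ol = se3_matrix(R_ol, t_ol)

# Map a homogeneous LiDAR point into the IMU frame: p^o = T^o_l p^l
p_l = np.array([1.0, 2.0, 0.5, 1.0])
p_o = T_ol @ p_l   # -> [-0.9, -2.0, 0.5, 1.0]
```

Because the extrinsics are fixed after offline calibration, `T_ol` can be computed once and reused for every scan.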

2.2 Scan state representation (need to review CT-ICP)

Inspired by CT-ICP [5], we represent a scan $S$ with two states:

1) the state of $S$ at its start time $t_b$ (i.e., $x_b$);

2) the state of $S$ at its end time $t_e$ (i.e., $x_e$).

In this way, the state at any time in $[t_b, t_e]$ can be expressed as a function of $x_b$ and $x_e$. For example, for a point $p$ belonging to $S$ collected at time $t_p ∈ [t_b, t_e]$, the state at $t_p$ can be computed as:

$$q^w_p = \mathrm{slerp}\!\left(q^w_b,\, q^w_e,\, α\right), \qquad t^w_p = (1-α)\, t^w_b + α\, t^w_e, \qquad α = \frac{t_p - t_b}{t_e - t_b}$$

Among them, slerp(·) is the spherical linear interpolation operator of quaternion.
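The per-point state interpolation can be sketched as follows (a minimal Python version with quaternions in (w, x, y, z) order; the function names are ours, not from the LIW-OAM code):

```python
import numpy as np

def slerp(q0, q1, alpha):
    """Spherical linear interpolation between unit quaternions q0, q1 (w, x, y, z)."""
    dot = np.dot(q0, q1)
    if dot < 0.0:          # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:       # nearly parallel: fall back to normalized lerp
        q = (1 - alpha) * q0 + alpha * q1
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - alpha) * theta) * q0 + np.sin(alpha * theta) * q1) / np.sin(theta)

def interpolate_state(t_p, t_b, t_e, q_b, q_e, trans_b, trans_e):
    """Pose at time t_p in [t_b, t_e], interpolated from the scan's begin/end states."""
    alpha = (t_p - t_b) / (t_e - t_b)
    q_p = slerp(q_b, q_e, alpha)
    trans_p = (1 - alpha) * trans_b + alpha * trans_e
    return q_p, trans_p
```

For example, halfway between an identity pose and a pose rotated 90° about z and translated 1 m forward, the interpolated pose is a 45° rotation at 0.5 m.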

2.3 IMU-odometer measurement model (skip)

The IMU-odometer consists of a wheel encoder and an IMU, the latter comprising an accelerometer and a gyroscope. The raw accelerometer and gyroscope measurements from the IMU, $\hat{a}_t$ and $\hat{ω}_t$, are respectively:

$$\hat{a}_t = a_t + R^{o_t}_w\, g^w + b_{a_t} + n_a, \qquad \hat{ω}_t = ω_t + b_{ω_t} + n_ω$$

IMU measurements are expressed in the IMU coordinate system; they combine the force counteracting gravity with the platform dynamics, and are affected by the accelerometer bias $b_{a_t}$, the gyroscope bias $b_{ω_t}$, and additive noise. As in VINS-Mono [12], the additive noise in the accelerometer and gyroscope measurements is modeled as Gaussian white noise, $n_a ∼ N(0, σ_a^2)$, $n_ω ∼ N(0, σ_ω^2)$. The accelerometer and gyroscope biases are modeled as random walks whose derivatives are Gaussian: $\dot{b}_{a_t} = n_{b_a} ∼ N(0, σ_{b_a}^2)$, $\dot{b}_{ω_t} = n_{b_ω} ∼ N(0, σ_{b_ω}^2)$.

The wheel encoder obtains the rotational speed $τ$ of the shaft from the pulses received by its counter, and then computes the speeds of the left and right rear wheels from $τ$ and the wheel radius:

$$\hat{τ}_{left} = τ_{left} + n_{τ_{left}}, \qquad \hat{τ}_{right} = τ_{right} + n_{τ_{right}}, \qquad \hat{v}_{left} = \hat{τ}_{left}\, r, \qquad \hat{v}_{right} = \hat{τ}_{right}\, r$$

where $n_{τ_{left}}$ and $n_{τ_{right}}$ are the zero-mean white Gaussian noises of $τ_{left}$ and $τ_{right}$, and $\hat{v}_{left}$, $\hat{v}_{right}$ are the measured linear velocities of the two wheels computed from $\hat{τ}_·$ and the wheel radius $r$. The final measurement model of the wheel-encoder odometer, expressed in the odometer coordinate system, can then be defined as:

$$\hat{v}^k = \frac{\hat{v}_{left} + \hat{v}_{right}}{2} \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} = v^k + n_v$$

where the x-component $[n_v]_x$ of $n_v ∼ N(0, σ_v^2)$, being the average of two zero-mean Gaussian noises, is still zero-mean Gaussian.
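A minimal sketch of this wheel-encoder pipeline (pulse count → shaft rate → wheel speed → odometer-frame velocity). The function names, pulses-per-revolution figure and wheel radius are illustrative assumptions, and the noise terms are omitted:

```python
import numpy as np

def shaft_rate(pulse_count, pulses_per_rev, dt):
    """Shaft rotational speed tau (rad/s) from the pulses counted over an interval dt."""
    return 2.0 * np.pi * pulse_count / (pulses_per_rev * dt)

def wheel_speed(tau, wheel_radius):
    """Linear wheel speed v = tau * r."""
    return tau * wheel_radius

def odom_velocity(v_left, v_right):
    """Velocity in the odometer frame (x forward); lateral/vertical components are zero."""
    return np.array([0.5 * (v_left + v_right), 0.0, 0.0])

# Illustrative numbers: a 1024-count encoder, 0.15 m wheels, one revolution per second.
tau = shaft_rate(1024, 1024, 1.0)       # 2*pi rad/s
v_l = v_r = wheel_speed(tau, 0.15)      # ~0.94 m/s
v_k = odom_velocity(v_l, v_r)
```

Averaging the two rear wheels cancels the differential component during turns, leaving only the forward speed of the axle midpoint.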

3. System Overview

Figure 1 illustrates our LIW-OAM framework, which consists of four main modules: preprocessing, initialization, state estimation, and point cloud registration. The preprocessing module downsamples the input raw points and pre-integrates the IMU-odometer measurements at the same frequency as the input scans. The initialization module estimates some state parameters, including the gravitational acceleration, accelerometer bias, gyroscope bias, and initial velocity. The state estimation module first integrates the IMU-odometer measurements from the last state to predict the current state, and then performs BA-based LIW optimization to refine the state of the current scan. Finally, the point cloud registration module adds new points to the map and removes distant points from it.


Figure 1. Overview of LIW-OAM, including four main modules: preprocessing module, initialization module, state estimation module and point registration module.

4. Preprocessing (skimmed)

4.1 Downsampling

Processing a large number of 3D points incurs high computational cost. To reduce computational complexity, we downsample the input points as follows. The points of the current input scan $S_{i+1}$ are placed into voxels of size $0.5 × 0.5 × 0.5$ (unit: meter), keeping only one point per voxel, to obtain the downsampled scan $P_{i+1}$. This downsampling strategy ensures that the density distribution of points is uniform in 3D space after downsampling.
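The one-point-per-voxel downsampling can be sketched as follows (a simple Python version; the paper's implementation may differ, e.g. in which point each voxel keeps):

```python
import numpy as np

def voxel_downsample(points, voxel_size=0.5):
    """Keep a single point per voxel_size^3 cell (here: the first point seen in each cell)."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    first_in_cell = {}
    for idx, key in enumerate(map(tuple, keys)):
        if key not in first_in_cell:
            first_in_cell[key] = idx
    return points[sorted(first_in_cell.values())]

pts = np.array([[0.1, 0.1, 0.1],   # voxel (0, 0, 0)
                [0.2, 0.2, 0.2],   # same voxel -> dropped
                [0.6, 0.1, 0.1]])  # voxel (1, 0, 0)
down = voxel_downsample(pts)       # keeps 2 of the 3 points
```

Flooring the coordinates by the voxel size gives each point an integer cell key, so duplicate-cell detection is a simple hash lookup.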

4.2 Pre-integration

Typically, the IMU-odometer outputs data at a higher frequency than the LiDAR. Pre-integrating all IMU-odometer measurements between two consecutive scans $S_i$ and $S_{i+1}$ nicely summarizes the dynamics of the hardware platform from time $t_{e_i}$ to $t_{e_{i+1}}$, where $e_i$ and $e_{i+1}$ are the end timestamps of $S_i$ and $S_{i+1}$, respectively. In this work, we adopt a quaternion-based discrete-time IMU-odometer pre-integration method [10] and use the method in [12] to handle the IMU bias. Specifically, the pre-integrations between $S_i$ and $S_{i+1}$ are computed in the corresponding IMU frames $o_{e_i}$ and $o_{e_{i+1}}$, i.e., $\hat{α}^{e_i}_{e_{i+1}}$, $\hat{η}^{e_i}_{e_{i+1}}$, $\hat{β}^{e_i}_{e_{i+1}}$ and $\hat{γ}^{e_i}_{e_{i+1}}$, where $\hat{α}^{e_i}_{e_{i+1}}$, $\hat{β}^{e_i}_{e_{i+1}}$ and $\hat{γ}^{e_i}_{e_{i+1}}$ are the translation, velocity and rotation pre-integrated from the IMU measurements, and $\hat{η}^{e_i}_{e_{i+1}}$ is the translation pre-integrated from gyroscope and wheel-encoder measurements. In addition, the pre-integrated Jacobians with respect to the biases are computed from the error-state kinematics, namely $J^α_{b_a}$, $J^α_{b_ω}$, $J^β_{b_a}$, $J^β_{b_ω}$, $J^γ_{b_ω}$, $J^η_{b_a}$ and $J^η_{b_ω}$.
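A simplified sketch of discrete quaternion-based preintegration of $α$, $β$, $γ$ (IMU-only). This is a plain forward-Euler stand-in for the method of [10], and the wheel-based $η$ term, noise-covariance propagation and bias Jacobians are all omitted:

```python
import numpy as np

def quat_mul(q, p):
    """Hamilton product of quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = p
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([[1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
                     [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
                     [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def preintegrate(accs, gyrs, dt, ba, bg):
    """Accumulate translation alpha, velocity beta and rotation gamma in the frame of the
    first IMU pose; gravity is handled later when the terms are used, as in [12]."""
    alpha, beta = np.zeros(3), np.zeros(3)
    gamma = np.array([1.0, 0.0, 0.0, 0.0])
    for a_hat, w_hat in zip(accs, gyrs):
        a = quat_to_rot(gamma) @ (a_hat - ba)     # bias-corrected accel in the start frame
        alpha = alpha + beta * dt + 0.5 * a * dt**2
        beta = beta + a * dt
        half = 0.5 * (w_hat - bg) * dt            # small-angle quaternion increment
        dq = np.array([1.0, half[0], half[1], half[2]])
        gamma = quat_mul(gamma, dq)
        gamma = gamma / np.linalg.norm(gamma)
    return alpha, beta, gamma

# Constant 1 m/s^2 forward acceleration, no rotation, 10 steps of 0.1 s:
accs = [np.array([1.0, 0.0, 0.0])] * 10
gyrs = [np.zeros(3)] * 10
alpha, beta, gamma = preintegrate(accs, gyrs, 0.1, np.zeros(3), np.zeros(3))
# alpha -> [0.5, 0, 0], beta -> [1.0, 0, 0], gamma -> identity
```

Because the accumulation is done relative to the first IMU pose, the result is independent of the absolute state, which is exactly what makes preintegration reusable across optimization iterations.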

5. Initialization (need to review SR-LIO)

The initialization module is designed to estimate all necessary values, including initial attitude, velocity, gravitational acceleration, accelerometer bias, and gyroscope bias, for subsequent state estimation. Similar to our previous work SR-LIO [19], we adopt motion initialization and static initialization for handheld devices and vehicle-mounted devices respectively. For more details about our initialization module, see [19].


6. State Estimation

6.1 State Prediction

Whenever a new downsampled scan $P_{i+1}$ arrives, we use the IMU-odometer measurements to predict the states at its start timestamp (i.e., $x^w_{b_{i+1}}$) and end timestamp (i.e., $x^w_{e_{i+1}}$), to provide a motion prior for the LIW optimization. Specifically, the predicted state $x^w_{b_{i+1}}$ (i.e., $t^w_{b_{i+1}}$, $R^w_{b_{i+1}}$, $v^w_{b_{i+1}}$, $b_{a_{b_{i+1}}}$, $b_{ω_{b_{i+1}}}$) is assigned the value:

*(state propagation equations; image not preserved in this copy)*

where $\hat{ω}_·$, $\hat{a}_·$ and $\hat{v}_·$ are measurements from the IMU gyroscope, the IMU accelerometer and the wheel encoder, $g^w$ is the gravitational acceleration in the world frame, $n$ and $n+1$ are two time instants between $t_{e_i}$ and $t_{e_{i+1}}$ at which IMU-odometer measurements are obtained, and $δ_t$ is the interval between $n$ and $n+1$. We iterate $n$ from 0 to $(t_{e_{i+1}} - t_{e_i}) / δ_t$ to obtain $x^w_{e_{i+1}}$; when $n = 0$, $x^w_n = x^w_{e_i}$. For $b_{a_{e_{i+1}}}$ and $b_{ω_{e_{i+1}}}$, we simply set $b_{a_{e_{i+1}}} = b_{a_{e_i}}$ and $b_{ω_{e_{i+1}}} = b_{ω_{e_i}}$.
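Since the exact update equations sit in the omitted image, the iterative prediction loop below is only a rough sketch: it propagates translation with the wheel velocity rotated into the world frame and rotation with the gyroscope rate, which captures the spirit of wheel-aided prediction but not necessarily the paper's exact formulas:

```python
import numpy as np

def exp_so3(theta):
    """Rodrigues' formula: rotation matrix for a rotation vector theta."""
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        return np.eye(3)
    k = theta / angle
    K = np.array([[0.0,  -k[2],  k[1]],
                  [k[2],  0.0,  -k[0]],
                  [-k[1], k[0],  0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def predict_pose(t_w, R_w, gyr_meas, wheel_vels, dt):
    """Step through all measurements in a scan: translation from the wheel velocity
    rotated into the world frame, rotation from the gyroscope rate."""
    for w_hat, v_hat in zip(gyr_meas, wheel_vels):
        t_w = t_w + R_w @ v_hat * dt
        R_w = R_w @ exp_so3(w_hat * dt)
    return t_w, R_w

# Driving straight ahead at 1 m/s for 1 s (100 steps of 10 ms), no rotation:
gyrs = [np.zeros(3)] * 100
vels = [np.array([1.0, 0.0, 0.0])] * 100
t_w, R_w = predict_pose(np.zeros(3), np.eye(3), gyrs, vels, 0.01)
# t_w -> [1.0, 0.0, 0.0]
```

In the real system the accelerometer (with gravity removed via $g^w$) would also contribute to the velocity prediction; here the wheel velocity stands in for it to keep the sketch short.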

6.2 BA-based LIW-optimization (main innovation)

We jointly use the LiDAR, inertial and wheel-encoder measurements to optimize the start state (i.e., $x^w_{b_{i+1}}$) and the end state (i.e., $x^w_{e_{i+1}}$) of the current scan $P_{i+1}$, where the variable vector is expressed as:

…For details, please refer to the full article on Guyuehome (古月居).


Origin blog.csdn.net/lovely_yoshino/article/details/130087918