Building 2D laser SLAM from scratch --- using an IMU and wheel odometer for motion distortion correction of a single-line lidar

The previous articles used only the 2D lidar. This article introduces two new sensors: the IMU and the wheel odometer.

First, the two sensors are briefly introduced. Then their data is given some simple processing, the IMU, wheel odometer, and lidar data are time-synchronized, and finally the two sensors are used to correct the motion distortion of the single-line lidar.

The code in this article is based on the motion distortion correction part of LIO-SAM.

1 Introduction to the IMU and wheel odometer

1.1 IMU

1.1.1 Introduction

IMU is short for Inertial Measurement Unit, a sensor mainly used to detect and measure acceleration and rotational motion.

An IMU generally provides three kinds of measurements:

  • 3-axis accelerometer: measures the acceleration along the x, y, and z axes; when the IMU is level, the z-axis reading is the magnitude of gravity
  • 3-axis gyroscope: measures the angular velocity of the IMU around the x, y, and z axes
  • 3-axis magnetometer: measures the direction of the earth's magnetic field to determine the IMU's heading, but is strongly affected by nearby magnets and metal

A device combining the first two is called a 6-axis IMU; one combining all three is called a 9-axis IMU.

The rate of an IMU can be quite high, typically anywhere from 10 Hz to 400 Hz.

1.1.2 Message format in ROS

The IMU message format in ROS is shown below. Some IMUs provide an integrated attitude value (the orientation field), and some do not; in the latter case the attitude has to be obtained by integrating the angular velocity.

The two most commonly used fields are the angular velocity and the acceleration.

$ rosmsg show sensor_msgs/Imu

std_msgs/Header header		# message header
  uint32 seq
  time stamp				# timestamp
  string frame_id			# coordinate frame
geometry_msgs/Quaternion orientation		# current attitude of the IMU, as a quaternion
  float64 x
  float64 y
  float64 z
  float64 w
float64[9] orientation_covariance			# covariance of the attitude
geometry_msgs/Vector3 angular_velocity		# 3-axis angular velocity of the IMU
  float64 x
  float64 y
  float64 z
float64[9] angular_velocity_covariance		# covariance of the 3-axis angular velocity
geometry_msgs/Vector3 linear_acceleration	# 3-axis acceleration of the IMU
  float64 x
  float64 y
  float64 z
float64[9] linear_acceleration_covariance	# covariance of the 3-axis acceleration
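
For IMUs that do not fill in orientation, the attitude can be obtained by integrating the angular velocity, as mentioned above. Below is a minimal sketch of integrating the z-axis angular velocity into a yaw angle; the class and member names are made up for illustration and are not part of this article's code.

#include <sensor_msgs/Imu.h>

// Illustrative sketch: integrate the gyro z-axis angular velocity into a
// yaw estimate when the IMU driver does not provide orientation.
class SimpleYawIntegrator
{
public:
    void ImuCallback(const sensor_msgs::Imu::ConstPtr &msg)
    {
        double t = msg->header.stamp.toSec();
        if (last_time_ > 0.0)
            yaw_ += msg->angular_velocity.z * (t - last_time_); // yaw += wz * dt
        last_time_ = t;
    }

    double yaw_ = 0.0;        // integrated yaw in radians; drifts over time
    double last_time_ = -1.0; // timestamp of the previous imu message
};

Note that such an integrated yaw drifts over time, since the gyro bias is also integrated.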

1.2 Wheel odometer

1.2.1 Introduction

The wheel odometer is an encoder mounted on the motor. It computes the distance traveled and the angle turned by the robot from the number of motor revolutions. In ROS it is represented by the Odometry message, and it is usually just called the odometer.

Later, as SLAM developed, "odometry" took on broader meanings, such as lidar odometry and visual odometry. These mean much the same thing as the wheel odometer: the distance and angle moved over a period of time, obtained by various means from a lidar or a camera.

The rate of the wheel odometer is generally not very high, typically 20 Hz to 50 Hz.

The odometer used in this article refers to the wheel odometer, that is, the distance and angle obtained from the encoders.

1.2.2 Message format in ROS

$ rosmsg show nav_msgs/Odometry 
std_msgs/Header header
  uint32 seq
  time stamp				# timestamp
  string frame_id			# name of the odometry frame, usually odom
string child_frame_id		# name of the child frame the odometry frame points to, usually base_link or base_footprint
geometry_msgs/PoseWithCovariance pose		# pose with covariance
  geometry_msgs/Pose pose
    geometry_msgs/Point position			# position: x y z
      float64 x
      float64 y
      float64 z
    geometry_msgs/Quaternion orientation	# attitude as a quaternion
      float64 x
      float64 y
      float64 z
      float64 w
  float64[36] covariance
geometry_msgs/TwistWithCovariance twist		# velocity with covariance
  geometry_msgs/Twist twist
    geometry_msgs/Vector3 linear			# linear velocity
      float64 x
      float64 y
      float64 z
    geometry_msgs/Vector3 angular			# angular velocity
      float64 x
      float64 y
      float64 z
  float64[36] covariance

The odometry message also carries linear and angular velocity, but these are obtained by dividing distance by time, so they are less accurate than the IMU's measurements.

In most cases, the angular velocity from the IMU and the translation from the wheel odometer are preferred.
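
As a small example, here is how the planar pose (x, y, yaw) can be extracted from an Odometry message with the same tf helpers used later in this article; the function GetPose2D is an illustrative helper, not part of this article's code.

#include <nav_msgs/Odometry.h>
#include <tf/tf.h>

// Extract the planar pose (x, y, yaw) from an odometry message.
// GetPose2D is an illustrative helper, not from the article's code.
void GetPose2D(const nav_msgs::Odometry &odom, double &x, double &y, double &yaw)
{
    x = odom.pose.pose.position.x;
    y = odom.pose.pose.position.y;

    tf::Quaternion q;
    tf::quaternionMsgToTF(odom.pose.pose.orientation, q);

    double roll, pitch;
    tf::Matrix3x3(q).getRPY(roll, pitch, yaw); // a planar robot mostly needs yaw
}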

2 Code explanation

2.1 Get the code

The code is available on GitHub at
https://github.com/xiangli0608/Creating-2D-laser-slam-from-scratch

It is recommended to download it with git clone: the code is still being updated, and a checkout made with git clone can be updated easily with git pull.

The code corresponding to this article is in Creating-2D-laser-slam-from-scratch/lesson5/src/lidar_undistortion.cc and Creating-2D-laser-slam-from-scratch/lesson5/include/lesson5/lidar_undistortion.h.

2.2 Callback functions

Since three threads are used, the imu and odom data must be saved under a lock, to prevent imu_queue_ and odom_queue_ from being modified in two places at the same time.

// imu callback
void LidarUndistortion::ImuCallback(const sensor_msgs::Imu::ConstPtr &imuMsg)
{
    std::lock_guard<std::mutex> lock(imu_lock_);
    imu_queue_.push_back(*imuMsg);
}

// odom callback
void LidarUndistortion::OdomCallback(const nav_msgs::Odometry::ConstPtr &odometryMsg)
{
    std::lock_guard<std::mutex> lock(odom_lock_);
    odom_queue_.push_back(*odometryMsg);
}

// scan callback
void LidarUndistortion::ScanCallback(const sensor_msgs::LaserScan::ConstPtr &laserScanMsg)
{
    // cache the laser scan
    if (!CacheLaserScan(laserScanMsg))
        return;

    // if the imu is used, prune its queue for time synchronization
    // and compute the rotation accumulated during this scan
    if (use_imu_)
    {
        if (!PruneImuDeque())
            return;
    }

    // if the odom is used, prune its queue for time synchronization
    // and compute the translation accumulated during this scan
    if (use_odom_)
    {
        if (!PruneOdomDeque())
            return;
    }

    // remove the motion distortion of every point in the scan
    CorrectLaserScan();

    // publish the corrected scan as a PointCloud2
    PublishCorrectedPointCloud();

    // reset parameters
    ResetParameters();
}
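
The three threads mentioned above come from the way the node is spun up. One common way to run callbacks on three threads in ROS is a multi-threaded spinner; a minimal sketch of a main() doing this is shown below (the repository's actual main may differ):

#include <ros/ros.h>
// #include "lesson5/lidar_undistortion.h"

int main(int argc, char **argv)
{
    ros::init(argc, argv, "lidar_undistortion_node");

    // the class subscribes to the scan, imu and odom topics in its constructor
    LidarUndistortion lidar_undistortion;

    // run the callbacks on 3 threads, so the scan, imu and odom callbacks
    // can execute concurrently -- which is why the queues must be locked
    ros::MultiThreadedSpinner spinner(3);
    spinner.spin();

    return 0;
}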

2.3 Caching the laser scans

The laser scans are stored in a double-ended queue, which always holds at least two scans; this prevents the imu or odom data from failing to cover the full time span of the scan being processed.

After that, the timestamp of current_laserscan_ and the timestamp of the last point of the scan are obtained.

// cache the laser scan
bool LidarUndistortion::CacheLaserScan(const sensor_msgs::LaserScan::ConstPtr &laserScanMsg)
{
    if (first_scan_)
    {
        first_scan_ = false;

        // the angle between beams is fixed, so the cos and sin of each beam
        // angle can be cached once instead of being recomputed every scan
        CreateAngleCache(laserScanMsg);

        scan_count_ = laserScanMsg->ranges.size();
    }

    corrected_pointcloud_->points.resize(laserScanMsg->ranges.size());

    // cache the scan
    laser_queue_.push_back(*laserScanMsg);

    // keep two scans buffered, in case the imu or odom data
    // does not yet cover the scan's time span
    if (laser_queue_.size() < 2)
        return false;

    // take the first scan in the queue
    current_laserscan_ = laser_queue_.front();
    laser_queue_.pop_front();

    // get the start and end time of this scan
    current_laserscan_header_ = current_laserscan_.header;
    current_scan_time_start_ = current_laserscan_header_.stamp.toSec(); // the header stamp is taken as the start time of the scan
    current_scan_time_increment_ = current_laserscan_.time_increment;
    current_scan_time_end_ = current_scan_time_start_ + current_scan_time_increment_ * (scan_count_ - 1);
    
    return true;
}
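
CreateAngleCache() itself is not listed in this article. A minimal sketch that is consistent with how a_cos_ and a_sin_ are used later (one cached cos/sin pair per beam angle) might look as follows; the repository implementation may differ:

// Cache the cos and sin of every beam angle once, since the angular
// layout of a scan is fixed. (Sketch; may differ from the repository.)
void LidarUndistortion::CreateAngleCache(const sensor_msgs::LaserScan::ConstPtr &scan_msg)
{
    a_cos_.clear();
    a_sin_.clear();

    for (unsigned int i = 0; i < scan_msg->ranges.size(); i++)
    {
        double angle = scan_msg->angle_min + i * scan_msg->angle_increment;
        a_cos_.push_back(std::cos(angle));
        a_sin_.push_back(std::sin(angle));
    }
}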

2.4 Time synchronization and angle integration of the IMU

Since the IMU data is stored in a double-ended queue, the IMU samples in the queue are contiguous in time.

At the same time, the IMU runs at 100 Hz while the lidar runs at 10 Hz. Because the two rates differ, their timestamps will not line up, so time synchronization is required.

The idea of the time synchronization is to first delete the samples in the imu deque that are too old.

Then find the first IMU sample whose timestamp is earlier than the start time of the current scan.

Taking this IMU sample as the starting point, integrate over the subsequent samples to obtain the rotation between each pair of consecutive samples. imu_rot_x_ stores the accumulated rotation at each sample time relative to that first sample.

This is repeated until the IMU timestamp is later than the end time of the current scan, so that the IMU samples fully cover the time it takes the lidar to complete one revolution.

// prune the imu queue to get the imu samples and rotations covering the current scan's time span
bool LidarUndistortion::PruneImuDeque()
{
    std::lock_guard<std::mutex> lock(imu_lock_);

    // the timestamps at the head and tail of the imu queue must bracket the scan's time span
    if (imu_queue_.empty() ||
        imu_queue_.front().header.stamp.toSec() > current_scan_time_start_ ||
        imu_queue_.back().header.stamp.toSec() < current_scan_time_end_)
    {
        ROS_WARN("Waiting for IMU data ...");
        return false;
    }

    // prune the imu queue until the imu time is close to this scan's time
    while (!imu_queue_.empty())
    {
        if (imu_queue_.front().header.stamp.toSec() < current_scan_time_start_ - 0.1)
            imu_queue_.pop_front();
        else
            break;
    }

    if (imu_queue_.empty())
        return false;

    current_imu_index_ = 0;

    sensor_msgs::Imu tmp_imu_msg;
    double current_imu_time;

    for (int i = 0; i < (int)imu_queue_.size(); i++)
    {
        tmp_imu_msg = imu_queue_[i];
        current_imu_time = tmp_imu_msg.header.stamp.toSec();

        if (current_imu_time < current_scan_time_start_)
        {
            // the initial angle is 0
            if (current_imu_index_ == 0)
            {
                imu_rot_x_[0] = 0;
                imu_rot_y_[0] = 0;
                imu_rot_z_[0] = 0;
                imu_time_[0] = current_imu_time;
                ++current_imu_index_;
            }
            continue;
        }

        // stop once the imu time is later than the scan's end time
        if (current_imu_time > current_scan_time_end_)
            break;

        // get angular velocity
        double angular_x, angular_y, angular_z;
        angular_x = tmp_imu_msg.angular_velocity.x;
        angular_y = tmp_imu_msg.angular_velocity.y;
        angular_z = tmp_imu_msg.angular_velocity.z;

        // integrate the angular velocity:
        // current angle = previous angle + current angular velocity * (current imu time - previous imu time)
        double time_diff = current_imu_time - imu_time_[current_imu_index_ - 1];
        imu_rot_x_[current_imu_index_] = imu_rot_x_[current_imu_index_ - 1] + angular_x * time_diff;
        imu_rot_y_[current_imu_index_] = imu_rot_y_[current_imu_index_ - 1] + angular_y * time_diff;
        imu_rot_z_[current_imu_index_] = imu_rot_z_[current_imu_index_ - 1] + angular_z * time_diff;
        imu_time_[current_imu_index_] = current_imu_time;
        ++current_imu_index_;
    }

    // after this decrement, current_imu_index_ points to the last imu sample within this scan's time span
    --current_imu_index_;

    return true;
}

2.5 Time synchronization and translation computation of the wheel odometer

The time synchronization of the wheel odometer is basically the same as for the IMU, but since the wheel odometry is a pose in a fixed coordinate frame, there is no need to integrate each message ourselves.

Simply find one odom message from just before the start of the scan and one from just after the lidar completes its revolution, compute the coordinate transform between these two messages, and this gives the total translation over the time of one lidar revolution.

// prune the odom queue to get the odom translation covering the current scan's time span
bool LidarUndistortion::PruneOdomDeque()
{
    std::lock_guard<std::mutex> lock(odom_lock_);

    // the timestamps at the head and tail of the odom queue must bracket the scan's time span
    if (odom_queue_.empty() ||
        odom_queue_.front().header.stamp.toSec() > current_scan_time_start_ ||
        odom_queue_.back().header.stamp.toSec() < current_scan_time_end_)
    {
        ROS_WARN("Waiting for Odometry data ...");
        return false;
    }

    // prune the odom queue until the odom time is close to this scan's time
    while (!odom_queue_.empty())
    {
        if (odom_queue_.front().header.stamp.toSec() < current_scan_time_start_ - 0.1)
            odom_queue_.pop_front();
        else
            break;
    }

    if (odom_queue_.empty())
        return false;

    // get the start odometry at the beginning of the scan
    double current_odom_time;

    for (int i = 0; i < (int)odom_queue_.size(); i++)
    {
        current_odom_time = odom_queue_[i].header.stamp.toSec();

        if (current_odom_time < current_scan_time_start_)
        {
            start_odom_msg_ = odom_queue_[i];
            continue;
        }

        if (current_odom_time <= current_scan_time_end_)
        {
            end_odom_msg_ = odom_queue_[i];
        }
        else
            break;
    }

    // if (int(round(start_odom_msg_.pose.covariance[0])) != int(round(end_odom_msg_.pose.covariance[0])))
    //     return false;

    start_odom_time_ = start_odom_msg_.header.stamp.toSec();
    end_odom_time_ = end_odom_msg_.header.stamp.toSec();

    tf::Quaternion orientation;
    double roll, pitch, yaw;

    // get the translation and rotation of the start odom message
    tf::quaternionMsgToTF(start_odom_msg_.pose.pose.orientation, orientation);
    tf::Matrix3x3(orientation).getRPY(roll, pitch, yaw);

    Eigen::Affine3f transBegin = pcl::getTransformation(
        start_odom_msg_.pose.pose.position.x,
        start_odom_msg_.pose.pose.position.y,
        start_odom_msg_.pose.pose.position.z,
        roll, pitch, yaw);

    // get the translation and rotation of the end odom message
    tf::quaternionMsgToTF(end_odom_msg_.pose.pose.orientation, orientation);
    tf::Matrix3x3(orientation).getRPY(roll, pitch, yaw);

    Eigen::Affine3f transEnd = pcl::getTransformation(
        end_odom_msg_.pose.pose.position.x,
        end_odom_msg_.pose.pose.position.y,
        end_odom_msg_.pose.pose.position.z,
        roll, pitch, yaw);

    // the transform between the two poses
    Eigen::Affine3f transBt = transBegin.inverse() * transEnd;

    // extract odom_incre_x_ etc. from transBt: the odom increment over one scan period
    float rollIncre, pitchIncre, yawIncre;
    pcl::getTranslationAndEulerAngles(transBt,
                                      odom_incre_x_, odom_incre_y_, odom_incre_z_,
                                      rollIncre, pitchIncre, yawIncre);
    return true;
}

2.6 Distortion correction for each point of the scan

How the distortion arises

A single-line lidar has only one single-point laser emitter; scanning over a 360-degree range is achieved by rotating a mirror, or the emitter itself, through a full circle.

Because the points of one revolution are not acquired at the same instant, the scan becomes distorted whenever the lidar is moving.

For example:

Consider a 360-degree single-line lidar. When the first point is emitted the lidar is at (1, 0), and the range measured to an object ahead is 1.3 m. Since the lidar is moving forward, by the time the last point is emitted, after one full revolution, the lidar has moved to, say, (1.1, 0), and the range measured to the same object is 1.2 m.

A typical lidar driver does not handle this, so the final data says: the lidar is at (1, 0) observing an object 1.3 m ahead, the first point of the scan returned 1.3 m, while the last point of the scan measured the same object at 1.2 m.

This is how the distortion arises: the lidar's coordinate frame changes between the times at which the individual range values are acquired.
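
Working through the numbers in this example: the object actually sits at (2.3, 0). Measured from (1, 0) the range is 2.3 - 1.0 = 1.3 m; measured from (1.1, 0) it is 2.3 - 1.1 = 1.2 m. Interpreting both ranges in the frame of the first point makes the same object appear at two different distances, which is exactly the distortion described above.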

How to perform distortion correction

Knowing how the distortion arises makes it clear how to correct it.

For every laser point, find the lidar coordinate frame at the instant that point was measured, and the coordinate transform between that frame and the lidar frame at the instant the first point was emitted. Transforming every point into the frame of the first point completes the distortion correction.

First, let the instant the lidar emits the first point be time_start, and let the lidar frame at that instant be frame_start.

For any other point, let its time be time_point, the lidar frame at that instant frame_point, and its coordinates in frame_point be point.

We only need the coordinate transform between frame_start and frame_point; then every point can be transformed into the frame_start coordinate frame.

After doing this for all points, the resulting coordinates are the undistorted coordinates.

This is the distortion correction process.
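
Written as a transform chain: if T_start and T_point are the lidar poses at the time of the first point and of the current point (both expressed relative to the same reference), and p is the point's coordinates in frame_point, then its corrected coordinates in frame_start are p' = T_start^-1 * T_point * p. This is exactly the transBt = transStartInverse * transFinal that is applied to each point in the code below.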

Code implementation of distortion correction

The implementation follows the text above directly, so it is not described in further detail.

// remove the motion distortion of every point in the scan
void LidarUndistortion::CorrectLaserScan()
{
    bool first_point_flag = true;
    double current_point_time = 0;
    double current_point_x = 0, current_point_y = 0, current_point_z = 1.0;

    Eigen::Affine3f transStartInverse, transFinal, transBt;

    for (int i = 0; i < scan_count_; i++)
    {
        // skip invalid points
        if (!std::isfinite(current_laserscan_.ranges[i]) ||
            current_laserscan_.ranges[i] < current_laserscan_.range_min ||
            current_laserscan_.ranges[i] > current_laserscan_.range_max)
            continue;

        // the point after distortion correction
        PointT &point_tmp = corrected_pointcloud_->points[i];

        current_point_time = current_scan_time_start_ + i * current_scan_time_increment_;

        // compute the x y coordinates of the point
        current_point_x = current_laserscan_.ranges[i] * a_cos_[i];
        current_point_y = current_laserscan_.ranges[i] * a_sin_[i];

        float rotXCur = 0, rotYCur = 0, rotZCur = 0;
        float posXCur = 0, posYCur = 0, posZCur = 0;

        // get the rotation and translation at this point's timestamp, relative to start_odom_time_
        if (use_imu_)
            ComputeRotation(current_point_time, &rotXCur, &rotYCur, &rotZCur);
        if (use_odom_)
            ComputePosition(current_point_time, &posXCur, &posYCur, &posZCur);

        // the rotation and translation at the first point's timestamp, relative to start_odom_time_;
        // it stays fixed for the rest of this scan's correction
        if (first_point_flag == true)
        {
            transStartInverse = (pcl::getTransformation(posXCur, posYCur, posZCur,
                                                        rotXCur, rotYCur, rotZCur))
                                    .inverse();
            first_point_flag = false;
        }

        // the rotation and translation at the current point's timestamp, relative to start_odom_time_
        transFinal = pcl::getTransformation(posXCur, posYCur, posZCur,
                                            rotXCur, rotYCur, rotZCur);

        // the transform from the lidar frame at the first point's timestamp
        // to the lidar frame at the current point's timestamp
        transBt = transStartInverse * transFinal;

        // apply the transform between the two frames to the current point,
        // giving its coordinates in the lidar frame at the first point's timestamp
        point_tmp.x = transBt(0, 0) * current_point_x + transBt(0, 1) * current_point_y + transBt(0, 2) * current_point_z + transBt(0, 3);
        point_tmp.y = transBt(1, 0) * current_point_x + transBt(1, 1) * current_point_y + transBt(1, 2) * current_point_z + transBt(1, 3);
        point_tmp.z = transBt(2, 0) * current_point_x + transBt(2, 1) * current_point_y + transBt(2, 2) * current_point_z + transBt(2, 3);
    }
}

2.6.1 Getting the rotation at a point's timestamp

The rotation value returned by ComputeRotation() is the total rotation at pointTime, relative to the first IMU sample earlier than the start of the current scan.

// assign each point, by its timestamp, a rotation obtained through interpolation
void LidarUndistortion::ComputeRotation(double pointTime, float *rotXCur, float *rotYCur, float *rotZCur)
{
    *rotXCur = 0;
    *rotYCur = 0;
    *rotZCur = 0;

    // find the first imu sample later than pointTime
    int imuPointerFront = 0;
    while (imuPointerFront < current_imu_index_)
    {
        if (pointTime < imu_time_[imuPointerFront])
            break;
        ++imuPointerFront;
    }

    // if the loop never advanced, or pointTime is beyond the last sample,
    // the nearest rotation can only be taken as an approximation
    if (pointTime > imu_time_[imuPointerFront] || imuPointerFront == 0)
    {
        *rotXCur = imu_rot_x_[imuPointerFront];
        *rotYCur = imu_rot_y_[imuPointerFront];
        *rotZCur = imu_rot_z_[imuPointerFront];
    }
    else
    {
        int imuPointerBack = imuPointerFront - 1;

        // linear interpolation of the rotation at pointTime
        double ratioFront = (pointTime - imu_time_[imuPointerBack]) / (imu_time_[imuPointerFront] - imu_time_[imuPointerBack]);
        double ratioBack = (imu_time_[imuPointerFront] - pointTime) / (imu_time_[imuPointerFront] - imu_time_[imuPointerBack]);

        *rotXCur = imu_rot_x_[imuPointerFront] * ratioFront + imu_rot_x_[imuPointerBack] * ratioBack;
        *rotYCur = imu_rot_y_[imuPointerFront] * ratioFront + imu_rot_y_[imuPointerBack] * ratioBack;
        *rotZCur = imu_rot_z_[imuPointerFront] * ratioFront + imu_rot_z_[imuPointerBack] * ratioBack;
    }
}

2.6.2 Getting the translation at a point's timestamp

Scan distortion is caused mainly by rotation; the distortion caused by translation is comparatively small. Moreover, the rate of the wheel odometer is generally not very high.

Therefore, the difference between the odometer poses just before and just after the scan is used directly as the translation for this span, without storing every intermediate value within the scan's time range.

The translation at pointTime is then computed by linear interpolation.

// assign each point, by its timestamp, a translation obtained through interpolation
void LidarUndistortion::ComputePosition(double pointTime, float *posXCur, float *posYCur, float *posZCur)
{
    *posXCur = 0;
    *posYCur = 0;
    *posZCur = 0;

    // linear interpolation of the translation at pointTime
    double ratioFront = (pointTime - start_odom_time_) / (end_odom_time_ - start_odom_time_);

    *posXCur = odom_incre_x_ * ratioFront;
    *posYCur = odom_incre_y_ * ratioFront;
    *posZCur = odom_incre_z_ * ratioFront;
}
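
In other words, the translation is assumed to grow linearly over the scan: pos(t) = odom_incre * (t - start_odom_time_) / (end_odom_time_ - start_odom_time_). This is a constant-velocity assumption over one scan period, which is reasonable given how little distortion translation contributes.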

2.7 Publishing the corrected point cloud

// publish the corrected scan as a PointCloud2
void LidarUndistortion::PublishCorrectedPointCloud()
{
    // ROS_INFO("publish point cloud");
    corrected_pointcloud_->width = scan_count_;
    corrected_pointcloud_->height = 1;
    corrected_pointcloud_->is_dense = false; // contains nans

    // copy the scan_msg header into the PointCloudT header
    pcl_conversions::toPCL(current_laserscan_header_, corrected_pointcloud_->header);

    // ROS converts pcl::PointCloud<PointT> to sensor_msgs/PointCloud2 automatically,
    // so the pcl::PointCloud<PointT> can be published directly
    corrected_pointcloud_publisher_.publish(corrected_pointcloud_);
}

3 Run

3.1 Run

The data bag for this article can be downloaded from the network disk:

Link: https://pan.baidu.com/s/1b5zSiFKBwQnkOLSAaoWwiQ
Extraction code: slam

After downloading, change bag_filename in the launch file to your actual directory.

Run the program for this article with the following command:

roslaunch lesson5 lidar_undistortion.launch

3.2 Results and analysis

After starting, the following screen appears in rviz.

The yellow point cloud in the figure is the point cloud after distortion correction.

To see the original point cloud, tick the checkbox next to LaserScan in the panel on the left; a red point cloud will appear, which is the raw data.

When the data was recorded, the robot was rotated for an extended time at the corridor intersection at an angular velocity of 0.8 rad/s, to test the effect of the distortion correction. The following pictures are screenshots taken during the rotation.

Red is the raw lidar data; yellow is the distortion-corrected point cloud.

The robot moves quite fast; if you cannot see it clearly, pause the bag playback to inspect the state of the lidar data.
As can be seen, the distortion correction has an obvious effect: the lidar data becomes much neater during rotation.

When the robot is mainly translating, the distortion is relatively small, so the effect of the correction is less obvious.

4 Summary and next article

This article first briefly introduced the IMU and the wheel odometer, then showed in code how to do simple processing of their data, how to time-synchronize the IMU and odom data with the lidar data, and how to use these two sensors to correct the motion distortion of the lidar data.

Carefully observing the point clouds before and after correction shows that lidar distortion is especially obvious during rotation.

If the lidar only translates, the data before and after correction differ little.

You may have had this experience: while mapping with an open-source algorithm, if the robot rotates quickly the map smears and overlaps. This is caused by the severe distortion of the lidar data during rotation.

The next article will use the IMU, the wheel odometer, and the lidar together to implement a front-end odometry. The code will be based on Karto, and a small local map will be maintained with a sliding window.

This will also be the last front-end implementation in this series.

Due to my limited ability, errors in the code are inevitable. If you find mistakes or things done badly, please contact me and I will correct them.


The articles are also published on the WeChat official account: SLAM from scratch. Everyone is welcome to follow it; you can add my WeChat through the account and join the laser SLAM discussion group to discuss SLAM together.

If you have suggestions about the articles, or want to see how a particular feature is implemented, reply directly in the official account; I will receive the messages and seriously consider your suggestions.



Originally published at blog.csdn.net/tiancailx/article/details/114924197