SLAM project: reproducing 2D laser odometry from scratch, running it with your own lidar or Gazebo, and explaining the principle and code implementation in detail

While learning SLAM and working on projects, I had always been reading code and calling code, but seldom actually writing SLAM code myself, so I gradually became a giant in theory and a rookie in practice. Having learned from the pain, I decided to implement a SLAM system step by step through small projects, to exercise my hands-on ability and to add a highlight to my future job resume.

Table of contents

Effect overview:

Step 1: Subscribing to the lidar data

Step 2: Lidar data initialization (preprocessing)

CreateCache()

GetBaseToLaserTf()

LaserScanToLDP()

Step 3: Pose matching with the ICP algorithm

GetPrediction()

CreateTfFromXYTheta()

sm_icp()

PublishTFAndOdometry()

Epilogue

Effect overview:

What this article does: implements a simple laser odometry in C++ in the ROS environment, which displays the changing pose and motion trajectory in real time in RVIZ, and runs end to end with a real lidar.

Environment: Ubuntu 20.04, C++, ROS

Lidar: LeiShen M10P

Results:

The yellow line in the figure is the motion trajectory; odom marks the starting point of the motion, and base_link represents the lidar we hold. As we carry the lidar around, the pose of base_link and the trajectory change accordingly, as shown in the figure below:

 The overall algorithm flow and core functions are shown in the figure below:

Next, let's follow this flow chart and work through the code to see, step by step, how to implement a lidar odometry.

Key point: the odometry implementation involves many coordinate transformations. You need to figure out the relationships between the different coordinate systems, which coordinate system each piece of data lives in, and why each transformation is performed; otherwise it quickly becomes a mess. Every time a coordinate transformation is involved, I will explain it explicitly, so please pay attention.

Coordinate transformation is a very important part of SLAM. Fully understanding the transformation relationships between the coordinate systems, and why the transformations are necessary, helps us understand the code quickly. For coordinate transformations, the blog post below is fairly detailed and easy to follow, so I won't introduce them further here:

Understanding the map, odom, base_link, and laser_link coordinate systems and their transformations in laser SLAM and ROS

Step 1: Subscribing to the lidar data

ros::NodeHandle node_handle_;
ros::Subscriber laser_scan_subscriber_;

// subscribe to the lidar topic; ScanCallback fires once per scan message
laser_scan_subscriber_ = node_handle_.subscribe(
        "/scan", 1, &ScanMatchPLICP::ScanCallback, this);

This step is routine ROS: first create a node handle, then create a subscriber for the lidar topic.
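
For completeness, here is a minimal sketch of the node entry point this post assumes (the class name ScanMatchPLICP comes from the snippets below; everything else in the sketch is illustrative):

#include <ros/ros.h>

int main(int argc, char **argv)
{
    ros::init(argc, argv, "laser_odometry_node");
    ScanMatchPLICP scan_match_plicp; // subscribes to /scan in its constructor
    ros::spin();                     // hand control over to the callbacks
    return 0;
}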

Step 2: Lidar data initialization (preprocessing)

CreateCache()

This function computes and stores the fixed parameters of the lidar data, which avoids recomputing them for every frame. Typical fixed parameters include the angle of each scan point and the minimum and maximum scan ranges. The code is as follows:

void ScanMatchPLICP::CreateCache(const sensor_msgs::LaserScan::ConstPtr &scan_msg)
{
    // Precompute the sin and cos of each beam angle; these values are used
    // later to convert the lidar data into the format the ICP algorithm needs
    a_cos_.clear();
    a_sin_.clear();
    double angle;

    for (unsigned int i = 0; i < scan_msg->ranges.size(); i++)
    {
        angle = scan_msg->angle_min + i * scan_msg->angle_increment;
        a_cos_.push_back(cos(angle));
        a_sin_.push_back(sin(angle));
    }

    // Store the minimum and maximum scan range; these two values are used
    // later to decide whether a lidar reading is valid
    input_.min_reading = scan_msg->range_min;
    input_.max_reading = scan_msg->range_max;
}

GetBaseToLaserTf()

This function obtains the coordinate transformation between base_link and laser_link:


bool ScanMatchPLICP::GetBaseToLaserTf(const std::string &frame_id)
{
    // Hand-held lidar: base_link and laser_link are treated as coincident,
    // so the transform is simply the identity
    geometry_msgs::TransformStamped transformStamped;
    transformStamped.header.stamp = ros::Time::now();
    transformStamped.header.frame_id = frame_id;
    transformStamped.child_frame_id = base_frame_;
    transformStamped.transform.translation.x = 0.0;
    transformStamped.transform.translation.y = 0.0;
    transformStamped.transform.translation.z = 0.0;
    transformStamped.transform.rotation.x = 0.0;
    transformStamped.transform.rotation.y = 0.0;
    transformStamped.transform.rotation.z = 0.0;
    transformStamped.transform.rotation.w = 1.0;
    tf2::fromMsg(transformStamped.transform, base_to_laser_);
    laser_to_base_ = base_to_laser_.inverse();
    return true;
}

This code stores a static identity transform between base_link and laser_link; the two frames are treated as coincident. Because we simply hold the lidar in hand for testing, rather than mounting it at some position on a robot, the lidar and the robot are considered one and the same, so their coordinate systems naturally coincide.
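
If the lidar were actually mounted somewhere on a robot, the identity transform would be wrong. Purely as a sketch (not part of this post's code; it assumes a tf2_ros::Buffer member named tf_buffer_), the mounting transform could instead be looked up from the TF tree:

try
{
    // ask TF for the actual base_link -> laser mounting transform
    geometry_msgs::TransformStamped t = tf_buffer_.lookupTransform(
        base_frame_, frame_id, ros::Time(0), ros::Duration(1.0));
    tf2::fromMsg(t.transform, base_to_laser_);
    laser_to_base_ = base_to_laser_.inverse();
}
catch (tf2::TransformException &ex)
{
    ROS_WARN("Could not get base_link -> laser transform: %s", ex.what());
    return false;
}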

LaserScanToLDP()

This function converts the subscribed lidar data into the data format (LDP) required by the ICP algorithm, so it can be fed to ICP for frame-to-frame registration to compute the pose.

void ScanMatchPLICP::LaserScanToLDP(const sensor_msgs::LaserScan::ConstPtr &scan_msg, LDP &ldp)
{
    unsigned int n = scan_msg->ranges.size();
    ldp = ld_alloc_new(n);

    for (unsigned int i = 0; i < n; i++)
    {
        // calculate position in laser frame
        double r = scan_msg->ranges[i];

        if (r > scan_msg->range_min && r < scan_msg->range_max)
        {
            // fill in a valid lidar reading
            ldp->valid[i] = 1;
            ldp->readings[i] = r;
        }
        else
        {
            ldp->valid[i] = 0;
            ldp->readings[i] = -1; // for invalid range
        }

        ldp->theta[i] = scan_msg->angle_min + i * scan_msg->angle_increment;
        ldp->cluster[i] = -1;
    }

    ldp->min_theta = ldp->theta[0];
    ldp->max_theta = ldp->theta[n - 1];

    ldp->odometry[0] = 0.0;
    ldp->odometry[1] = 0.0;
    ldp->odometry[2] = 0.0;

    ldp->true_pose[0] = 0.0;
    ldp->true_pose[1] = 0.0;
    ldp->true_pose[2] = 0.0;
}

At this point the lidar data preprocessing is complete. Let's review what we did:

  • Computed and stored the fixed parameters of the lidar data
  • Obtained the coordinate transformation between base_link and laser_link
  • Converted the lidar data into the format the ICP algorithm requires, ready to be fed to the algorithm

Note: GetBaseToLaserTf and CreateCache are called only once, when the first frame of lidar data arrives, and never again afterwards; LaserScanToLDP, however, is called on every frame to convert that frame into the data ICP requires, as in the sketch below.
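
A minimal sketch of how the callback might wire this together (member names are assumed to follow the snippets above; ScanMatchWithPLICP is a stand-in name for the Step 3 matching routine):

void ScanMatchPLICP::ScanCallback(const sensor_msgs::LaserScan::ConstPtr &scan_msg)
{
    current_time_ = scan_msg->header.stamp;

    // one-time initialization on the first frame
    if (!initialized_)
    {
        CreateCache(scan_msg);                        // fixed per-sensor values
        GetBaseToLaserTf(scan_msg->header.frame_id);  // base_link <-> laser_link
        initialized_ = true;
    }

    // every frame: convert to the LDP format and run the matching of Step 3
    LDP curr_ldp_scan;
    LaserScanToLDP(scan_msg, curr_ldp_scan);
    ScanMatchWithPLICP(curr_ldp_scan, current_time_);
}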

Step 3: Pose matching with the ICP algorithm

The third step looks simple, just run an algorithm, but it hides many details. First we use a uniform motion model to give the algorithm an initial pose prediction; this provides a good starting value, and as everyone knows, iterative algorithms such as ICP are hungry for initial values: with a good one, the iteration converges faster and better. We then call the ICP algorithm to perform frame-to-frame matching to compute the pose, and publish the accumulated odometry and a TF transform as the visualization output.

Note that this step performs coordinate transformations in two places: the initial pose prediction and the publishing of the motion trajectory.

The initial pose prediction needs a coordinate transformation because we predict, from the uniform motion model, where the robot may move at the next instant. In this process we observe the motion of the robot (base_link) from the world coordinate system (map), but the lidar that will iterate from this initial value lives in the laser_link frame, so the predicted pose must be converted from map to laser_link, i.e. map-odom-base_link-laser_link.

Publishing the motion trajectory also requires a transformation, because the motion estimated from the lidar is expressed in laser_link, while we want to observe the robot's motion in the world coordinate system, so the lidar's motion must be converted from laser_link to map. In ROS, however, the coordinate transforms form a tree structure, so this cannot be done in one jump; we chain several transformations, from laser_link to base_link to odom and finally to map. This is the most intricate part, but once it is clear, the code is easy to follow.
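
The pattern behind all of these chained transforms is a single rule. As a minimal sketch (a hypothetical helper, not part of this post's code): if T is the pose of the laser in the base frame, then a motion M measured in the laser frame is the same physical motion expressed in the base frame as T * M * T⁻¹, and the inverse conjugation converts in the other direction:

// Express a motion measured in the laser frame in the base frame instead.
// base_to_laser is the pose of the laser frame in the base frame.
tf2::Transform MotionLaserToBase(const tf2::Transform &base_to_laser,
                                 const tf2::Transform &motion_in_laser)
{
    return base_to_laser * motion_in_laser * base_to_laser.inverse();
}

This is exactly the shape of corr_ch = base_to_laser_ * corr_ch_l * laser_to_base_ further down, and the longer prediction_change_lidar line is the same conjugation applied in the opposite direction with T = base_in_odom_ * base_to_laser_.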

GetPrediction()

As the name implies, this predicts the pose at the next instant using the uniform motion model, providing the initial value for ICP:

void ScanMatchPLICP::GetPrediction(double &prediction_change_x,
                                   double &prediction_change_y,
                                   double &prediction_change_angle,
                                   double dt)
{
    // speeds with magnitude below 1e-6 are treated as standing still
    prediction_change_x = fabs(latest_velocity_.linear.x) < 1e-6 ? 0.0 : dt * latest_velocity_.linear.x;
    prediction_change_y = fabs(latest_velocity_.linear.y) < 1e-6 ? 0.0 : dt * latest_velocity_.linear.y;
    prediction_change_angle = fabs(latest_velocity_.angular.z) < 1e-6 ? 0.0 : dt * latest_velocity_.angular.z;

    // wrap the predicted angle change into [-pi, pi)
    if (prediction_change_angle >= M_PI)
        prediction_change_angle -= 2.0 * M_PI;
    else if (prediction_change_angle < -M_PI)
        prediction_change_angle += 2.0 * M_PI;
}

The uniform motion model assumes that the robot moves at constant velocity during the interval dt, so the position and angle at the next instant can be computed from the linear and angular velocities multiplied by dt. latest_velocity_ here is essentially a geometry_msgs::Twist; it is obtained by taking the Δx, Δy, Δθ produced by running ICP on the previous frame of lidar data and dividing by dt. Since the lidar is used as an odometer here, this position prediction is computed from odometry data and lives in the odom coordinate system. To feed it to ICP as the initial guess for matching, it must be converted from the odom frame to laser_link, as follows:

double dt = (time - last_icp_time_).toSec();
double pr_ch_x, pr_ch_y, pr_ch_a;
GetPrediction(pr_ch_x, pr_ch_y, pr_ch_a, dt);

// prediction_change stores the predicted motion as a tf transform
tf2::Transform prediction_change;
CreateTfFromXYTheta(pr_ch_x, pr_ch_y, pr_ch_a, prediction_change);

// express the motion in the odom frame relative to the keyframe pose
// base_in_odom_keyframe_
prediction_change = prediction_change * (base_in_odom_ * base_in_odom_keyframe_.inverse());

// convert the motion from the base_link frame into the lidar frame
tf2::Transform prediction_change_lidar;
prediction_change_lidar = laser_to_base_ * base_in_odom_.inverse() * prediction_change * base_in_odom_ * base_to_laser_;

// only after converting to laser_link can it be fed to ICP; input_ holds the
// ICP input parameters, and the lines below set the initial guess
input_.first_guess[0] = prediction_change_lidar.getOrigin().getX();
input_.first_guess[1] = prediction_change_lidar.getOrigin().getY();
input_.first_guess[2] = tf2::getYaw(prediction_change_lidar.getRotation());

CreateTfFromXYTheta()

This function builds a TF transform T from x, y, theta. The triple (x, y, theta) describes the transformation between two lidar frames, and the transform T in TF can describe the same transformation; but since in RVIZ we visualize by publishing TF, here we change the representation from (x, y, theta) into the transform T.

void ScanMatchPLICP::CreateTfFromXYTheta(double x, double y, double theta, tf2::Transform &t)
{
    t.setOrigin(tf2::Vector3(x, y, 0.0));
    tf2::Quaternion q;
    q.setRPY(0.0, 0.0, theta);
    t.setRotation(q);
}

sm_icp()

With all of this done, we only need to hand the data to the ICP algorithm; the single line sm_icp gets it done:

sm_icp(&input_, &output_);

input_ is the input data: the LaserScanToLDP function from the preprocessing step has already converted the current lidar frame into the ICP format and placed it in input_. input_ also contains some parameters with default values, but since we only do a simple implementation here and skip careful tuning, we leave them alone for now. The ICP algorithm stores its result in output_, including the pose change obtained by matching: Δx, Δy, Δθ.
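
For reference, here is a hedged sketch of a few sm_params fields of the csm library that are commonly set before calling sm_icp; the two LDP assignments use assumed member names, and the numeric values are illustrative, not tuning from this post:

input_.laser_ref  = prev_ldp_scan_;  // previous frame (reference), assumed member
input_.laser_sens = curr_ldp_scan;   // current frame to be matched

input_.max_iterations = 10;               // cap on ICP iterations
input_.max_correspondence_dist = 0.3;     // drop point pairs farther apart than this (m)
input_.max_angular_correction_deg = 45.0; // search window for rotation
input_.max_linear_correction = 0.5;       // search window for translation (m)
input_.epsilon_xy = 1e-6;                 // convergence threshold, translation
input_.epsilon_theta = 1e-6;              // convergence threshold, rotation
input_.use_point_to_line_distance = 1;    // 1 = PL-ICP point-to-line metric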

The last step is to publish the odometry data and TF, and to compute the linear and angular velocities, which will provide the initial value for the next ICP run. Coordinate transformation is involved here as well: the value computed by ICP is in the laser_link frame, while the odometry and TF we finally publish are in the odom frame, so we need to transform from laser_link to odom, as follows:

tf2::Transform corr_ch;

if (output_.valid)
{
    // the motion in the lidar frame
    tf2::Transform corr_ch_l;
    CreateTfFromXYTheta(output_.x[0], output_.x[1], output_.x[2], corr_ch_l);

    // convert the motion from the lidar frame into the base_link frame
    corr_ch = base_to_laser_ * corr_ch_l * laser_to_base_;

    // update the pose of base_link in the odom frame
    base_in_odom_ = base_in_odom_keyframe_ * corr_ch;

    // compute the linear and angular velocity; when the next frame arrives,
    // they are used to predict a pose, providing the initial value for the next ICP
    latest_velocity_.linear.x = corr_ch.getOrigin().getX() / dt;
    latest_velocity_.angular.z = tf2::getYaw(corr_ch.getRotation()) / dt;
}
else
{
    ROS_WARN("not Converged");
}

PublishTFAndOdometry()

Finally, publish the odometry data and TF; while the odometry is running you can watch it in real time in RVIZ.

void ScanMatchPLICP::PublishTFAndOdometry()
{
    geometry_msgs::TransformStamped tf_msg;
    tf_msg.header.stamp = current_time_;
    tf_msg.header.frame_id = odom_frame_;
    tf_msg.child_frame_id = base_frame_;
    tf_msg.transform = tf2::toMsg(base_in_odom_);

    // publish the odom -> base_link tf
    tf_broadcaster_.sendTransform(tf_msg);

    nav_msgs::Odometry odom_msg;
    odom_msg.header.stamp = current_time_;
    odom_msg.header.frame_id = odom_frame_;
    odom_msg.child_frame_id = base_frame_;
    tf2::toMsg(base_in_odom_, odom_msg.pose.pose);
    odom_msg.twist.twist = latest_velocity_;

    // publish the odometry topic
    odom_publisher_.publish(odom_msg);
}
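
The yellow trajectory from the overview is not produced by the code above. One way to draw it, sketched here with assumed members path_publisher_ and path_msg_, is to accumulate each new pose into a nav_msgs::Path published in the odom frame:

#include <nav_msgs/Path.h>

void ScanMatchPLICP::PublishPath()
{
    geometry_msgs::PoseStamped pose_stamped;
    pose_stamped.header.stamp = current_time_;
    pose_stamped.header.frame_id = odom_frame_;
    tf2::toMsg(base_in_odom_, pose_stamped.pose);

    path_msg_.header.stamp = current_time_;
    path_msg_.header.frame_id = odom_frame_;
    path_msg_.poses.push_back(pose_stamped); // grows without bound; fine for a demo
    path_publisher_.publish(path_msg_);
}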

At this point we have completed a 2D laser odometry. Before we celebrate, let's review what we did in this step:

  • Predicted the pose at the next instant with the uniform motion model, providing the initial value for ICP
  • Converted that initial value into the laser_link frame
  • Called the ICP algorithm for frame-to-frame matching to obtain the pose change
  • Converted the resulting pose change from laser_link to odom
  • Published TF and odometry based on the pose change

Epilogue

Put the code above in the lidar callback, connect the lidar, start the lidar node in ROS, then launch the odometry, and you can check the effect in RVIZ. Running the odometry with the LeiShen M10P, the rotation accuracy is very high, but there is still room for improvement in translation; we may study how to fix that in the future. If you don't have a lidar, you can use Gazebo for the simulation; the effect is the same, and I may also write a follow-up post on how to test this little odometry in Gazebo. The next step is to study mapping. I hope to accumulate knowledge and skills step by step, and finally build a complete self-developed SLAM system~


Source: blog.csdn.net/weixin_43890835/article/details/131544842