Running LIO-SAM on your own dataset, from 0 to 1

Foreword

The author stepped into a lot of pitfalls while learning LIO-SAM, and this article records the whole process from beginning to end. It draws on many articles by more experienced authors; I am just the porter. You can jump straight to Part 2 for the 0-to-1 walkthrough. If you have any questions, feel free to reach out; exchanges are welcome.

1. A brief introduction to LIO-SAM (you can skip this part; the practical content is in Part 2)

LIO-SAM is a tightly coupled lidar-inertial SLAM framework that achieves good mapping results both indoors and outdoors.

(Figure: LIO-SAM system overview)

(1) ImageProjection: laser motion distortion correction

  • Function introduction

1. Uses the IMU data between the start and end of the current laser frame to compute the rotation increment, and the IMU odometry data (from ImuPreintegration) to compute the translation increment, then corrects the motion distortion of each laser point in the frame: using the pose increment relative to the start of the frame, every point is transformed into the coordinate frame of the point at the start of the scan, which realizes the correction (a minimal sketch follows below this list).

2. At the same time, the attitude angles (roll, pitch, yaw) from the IMU data and the pose from the IMU odometry data are used to roughly initialize the pose of the current laser frame.
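The correction itself is just an interpolation of the pose increment over the scan. Below is a minimal sketch of the idea (a hypothetical helper with illustrative names, not LIO-SAM's actual code):

#include <Eigen/Geometry>

// Deskew one point: relTime in [0,1] is the point's time offset within the scan;
// rotIncr and transIncr are the IMU-derived pose increments over the whole scan.
// Assumes the point p is expressed in the lidar frame at its capture time.
Eigen::Vector3f deskewPoint(const Eigen::Vector3f& p,
                            const Eigen::Quaternionf& rotIncr,
                            const Eigen::Vector3f& transIncr,
                            float relTime)
{
    // Interpolate the pose increment to this point's capture time...
    Eigen::Quaternionf q = Eigen::Quaternionf::Identity().slerp(relTime, rotIncr);
    Eigen::Vector3f t = relTime * transIncr;
    // ...and map the point back into the frame of the first point of the scan.
    return q * p + t;
}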

  • Subscriptions

Subscribes to the raw IMU data;
subscribes to the IMU odometry data from ImuPreintegration, which gives the pose at each IMU timestamp;
subscribes to the raw laser point cloud data.

  • Publications

Publishes the valid point cloud of the current frame after motion distortion correction, for rviz display;
publishes the point cloud information of the current frame after motion distortion correction, including the point cloud data, initial pose, attitude angles, valid point cloud, etc., for feature extraction in FeatureExtraction.

(2) FeatureExtraction: point cloud feature extraction

  • Function introduction

For the current laser frame after motion distortion correction, computes the curvature of every point, then extracts corner (edge) points and planar points based on the curvature magnitude.
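Concretely, the smoothness measure is the classic LOAM curvature: for each point, the difference between ten neighboring ranges on the same ring and ten times its own range, squared. A short sketch of that computation (simplified from the idea of LIO-SAM's calculateSmoothness(); names are ours):

#include <vector>
#include <cstddef>

// Curvature of point i from the 5 neighbors on each side of the same scan ring.
// Large values are picked as edge (corner) features, small values as planar ones.
// Assumes 5 <= i < range.size() - 5.
float curvatureAt(const std::vector<float>& range, std::size_t i)
{
    float diff = 0.0f;
    for (int k = -5; k <= 5; ++k)
        diff += range[i + k];
    diff -= 11.0f * range[i];   // net effect: sum of 10 neighbors minus 10 * range[i]
    return diff * diff;
}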

  • Subscriptions

Subscribes to the motion-distortion-corrected point cloud information of the current laser frame, from ImageProjection.

  • Publications

Publishes the point cloud information of the current laser frame after feature extraction, including the motion-distortion-corrected point cloud data, initial pose, attitude angles, valid point cloud, corner point cloud, planar point cloud, etc., to MapOptimization;
publishes the corner point cloud extracted from the current laser frame, for rviz display;
publishes the planar point cloud extracted from the current laser frame, for rviz display.

(3) ImuPreintegration: IMU preintegration

TransformFusion class

  • Function introduction

The main function is to subscribe to the laser odometry (from MapOptimization) and the IMU odometry, and to compute the IMU odometry at the current moment as the most recent laser odometry pose composed with the IMU odometry increment from that moment to now; it also displays the (local) IMU odometry trajectory in rviz.
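The fused pose is just a composition of transforms. A minimal sketch with assumed variable names, using Eigen:

#include <Eigen/Geometry>

// IMU-rate pose = latest lidar odometry pose composed with the IMU odometry
// increment accumulated since that lidar frame.
Eigen::Affine3f fusePose(const Eigen::Affine3f& lidarOdom,      // latest lidar odometry pose
                         const Eigen::Affine3f& imuOdomAtLidar, // IMU odometry at that moment
                         const Eigen::Affine3f& imuOdomNow)     // IMU odometry now
{
    Eigen::Affine3f incre = imuOdomAtLidar.inverse() * imuOdomNow; // increment since then
    return lidarOdom * incre;
}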

  • Subscriptions

Subscribes to the laser odometry, from MapOptimization;
subscribes to the IMU odometry, from ImuPreintegration.

  • Publications

Publishes the IMU odometry, for rviz display;
publishes the IMU odometry trajectory, showing only the segment between the latest laser odometry moment and the current moment.

ImuPreintegration class

  • Function introduction

Uses the laser odometry and the IMU preintegrated measurements between two laser odometry frames to build a factor graph, and optimizes the state of the current frame (pose, velocity, bias); based on the optimized state, applies the IMU preintegrated measurements to obtain the IMU odometry at each moment.
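LIO-SAM builds this on GTSAM. A minimal sketch of the preintegration-to-factor flow (the noise values and keys below are placeholders, not the project's actual settings):

#include <gtsam/navigation/ImuFactor.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/inference/Symbol.h>

using gtsam::symbol_shorthand::X;  // pose keys
using gtsam::symbol_shorthand::V;  // velocity keys
using gtsam::symbol_shorthand::B;  // bias keys

// Accumulate the IMU samples between lidar keyframes k-1 and k, then emit one
// ImuFactor constraining their poses and velocities.
void addImuFactor(gtsam::NonlinearFactorGraph& graph,
                  const gtsam::imuBias::ConstantBias& priorBias, int k)
{
    auto params = gtsam::PreintegrationParams::MakeSharedU(9.80511); // gravity magnitude
    params->accelerometerCovariance = gtsam::I_3x3 * 1e-4;  // placeholder noise
    params->gyroscopeCovariance     = gtsam::I_3x3 * 1e-6;  // placeholder noise
    params->integrationCovariance   = gtsam::I_3x3 * 1e-8;  // placeholder noise

    gtsam::PreintegratedImuMeasurements preint(params, priorBias);
    // for each IMU sample (acc, gyr, dt) between the two lidar frames:
    //     preint.integrateMeasurement(acc, gyr, dt);
    graph.add(gtsam::ImuFactor(X(k - 1), V(k - 1), X(k), V(k), B(k - 1), preint));
}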

  • Subscriptions

Subscribes to the raw IMU data; starting from the factor-graph-optimized laser odometry, applies the IMU preintegrated measurements between two frames to predict the IMU odometry at each moment (at IMU frequency).
Subscribes to the laser odometry (from MapOptimization), and uses the IMU preintegrated measurements between two frames to build a factor graph and optimize the pose of the current frame (this pose is only used to update the per-moment IMU odometry and to seed the next round of factor graph optimization).

  • Publications

Publishes the IMU odometry.

(4) MapOptimization: factor graph optimization

  • Function introduction

1. Scan-to-map matching: extracts the feature points of the current laser frame (corner points, planar points) and the feature points of the local keyframe map, performs iterative scan-to-map optimization, and updates the current frame pose.
2. Keyframe factor graph optimization: adds keyframes to the factor graph, adds laser odometry factors, GPS factors, and loop closure factors, runs the factor graph optimization, and updates all keyframe poses (see the sketch after this list).
3. Loop closure detection: among the historical keyframes, finds a frame that is close in distance to the current frame but far apart in time, sets it as the match frame, extracts the local keyframe map around the match frame, performs scan-to-map matching to obtain a pose transform, builds the loop closure factor, and adds it to the factor graph optimization.
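The back end follows the usual GTSAM pattern: every new keyframe contributes factors and the graph is solved incrementally with iSAM2. A rough sketch (placeholder noise model and integer keys, not the project's exact code):

#include <gtsam/nonlinear/ISAM2.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/Values.h>
#include <gtsam/slam/BetweenFactor.h>
#include <gtsam/geometry/Pose3.h>

// Add a lidar odometry factor between consecutive keyframes and run one
// incremental optimization step; a detected loop closure would add another
// BetweenFactor between the two looped keys in the same way.
void addKeyframe(gtsam::ISAM2& isam, int k, const gtsam::Pose3& odomDelta,
                 const gtsam::Pose3& initialGuess)
{
    gtsam::NonlinearFactorGraph graph;
    gtsam::Values values;
    auto noise = gtsam::noiseModel::Diagonal::Sigmas(
        (gtsam::Vector(6) << 1e-4, 1e-4, 1e-4, 1e-2, 1e-2, 1e-2).finished());
    graph.add(gtsam::BetweenFactor<gtsam::Pose3>(k - 1, k, odomDelta, noise));
    values.insert(k, initialGuess);
    isam.update(graph, values);   // incremental smoothing and mapping step
}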

  • Subscriptions

1. Subscribes to the point cloud information of the current laser frame, from FeatureExtraction;
2. subscribes to the GPS odometry;
3. subscribes to loop closure data provided by an external loop detection program; this program does not provide it, so it is effectively unused here.

  • Publications

1. Publishes the historical keyframe odometry;
2. publishes the feature point cloud of the local keyframe map;
3. publishes the laser odometry, displayed as a coordinate axis in rviz;
4. publishes the laser odometry;
5. publishes the laser odometry path;
6. provides the map-saving service;
7. publishes the local keyframe map used for loop closure matching;
8. publishes the feature point cloud of the current keyframe after the loop-optimized pose transform;
9. publishes the loop closure edges, shown in rviz as lines connecting looped frames;
10. publishes the downsampled planar point set of the local map;
11. publishes the accumulated, downsampled corner and planar point sets of the historical frames;
12. publishes the current frame's raw point cloud after registration.

Features:

  1. Three sensors are used in total: IMU, lidar, and GPS (optional; here we mainly test the VLP-16 + IMU setup);
  2. IMU-rate odometry is published upon receiving each lidar odometry message, so the front end runs at a high frequency;
  3. the IMU odometry provides the initial estimates and the preintegration processing, while the raw IMU data performs motion compensation on the lidar (the two sensors fuse their data to better estimate the surrounding environment);
  4. the back-end factor graph optimization includes four kinds of factors: the IMU preintegration factor combined with the lidar odometry inter-frame constraint is maintained at the preintegration node, while the other three (the GPS factor, the lidar odometry factor, and the loop closure factor) are maintained at the back-end node;
  5. tight coupling: the IMU bias can be estimated, and the lidar odometry inter-frame constraints feed back to make the IMU solution more accurate and provide better initial values.

Supplementary note:

  • An IMU usually consists of three single-axis accelerometers and three single-axis gyroscopes. The accelerometers measure the acceleration of the carrier along the three axes, and the gyroscopes measure the angular velocity of the carrier relative to the navigation frame; from these signals the vehicle's attitude can be solved. The IMU update rate is relatively high, typically several hundred Hz up to 1 kHz. Integrating the three acceleration values twice yields displacement for position estimation, and integrating the angular velocities yields attitude; combined, they give the actual state of the object (a toy sketch of this double integration follows at the end of this note). Note that the IMU provides only relative positioning information: it measures the path traveled relative to the starting point, so it cannot tell you your absolute location.

    (A 9-axis IMU has three single-axis accelerometers, three single-axis gyroscopes, and three single-axis magnetometers.)

    For detailed theory and source code walkthroughs, you can search online (there is plenty of public material); the author focuses only on how to run LIO-SAM on your own dataset.
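As promised above, here is a toy sketch of dead reckoning by double integration (illustrative only; real systems must model bias and noise, which is exactly why pure IMU integration drifts quickly):

// One axis of naive strapdown integration: acceleration -> velocity -> position.
struct State { double v = 0.0; double p = 0.0; };

void integrate(State& s, double accel, double gravity, double dt)
{
    double a = accel - gravity;           // remove gravity on the vertical axis
    s.p += s.v * dt + 0.5 * a * dt * dt;  // integrate velocity -> position
    s.v += a * dt;                        // integrate acceleration -> velocity
}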

2. From 0 to 1 (you can start reading here)

1. Result

(Figure: mapping result in rviz)

Let's start with a picture of the result, which gives a rough idea of how well it runs; not bad. When the author recorded all topics it took a lot of disk space: a 40-second recording took 4 GB. This is unnecessary. With lidar + IMU, we only need to record the lidar and IMU topics (/velodyne_points and /imu/data) during the actual run.

2. Hardware introduction

The author uses a Velodyne 16-line lidar (VLP-16 for short) + a 9-axis IMU (N100).

Many bloggers write that a 9-axis IMU is required to run LIO-SAM, but a 6-axis IMU can also be used; I have heard Gao Bo mention that LIO-SAM works with a 6-axis IMU as well. A 9-axis IMU is worthwhile if GPS integration is needed or the budget allows; to avoid any controversy, it is safer to use 9 axes.

3. Preliminary work

1) Lidar data format

LIO-SAM has fairly strict requirements on the lidar data format, something earlier lidar-only mapping algorithms did not care about. Those generally require the XYZI (x, y, z, intensity) format, but LIO-SAM requires XYZIRT (x, y, z, intensity, ring, timestamp): the algorithm uses the lidar's ring (channel index) field and the per-point timestamp field, and checks for both fields on startup. So how do you obtain these two fields? If your lidar does not output them (for example a RoboSense lidar), update the driver to the RSLidar_SDK version before recording; see the author's other tutorial: Install the latest RoboSense driver RSLidar_SDK on Ubuntu 18.04 to collect laser point clouds in XYZIRT format (Ferryman's Blog, CSDN).
The recorded data is still not directly usable, though: the point definition of the RoboSense format differs from the Velodyne format. This can be solved with a conversion node found on GitHub that converts RoboSense-format points into Velodyne-format points; for the installation steps, see the author's other tutorial: Ubuntu 18 ROS node to convert the RoboSense point cloud format to the Velodyne format (Ferryman's Blog, CSDN; credit to the original author).

If you use the same type of lidar as the author (a VLP-16), you do not need to worry about any of this: the VLP-16 driver already outputs XYZIRT (x, y, z, intensity, ring, timestamp), so the ring and per-point timestamp fields the algorithm needs are present.
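For reference, the expected point layout corresponds to a PCL point type like the following (this mirrors the VelodynePointXYZIRT struct defined in LIO-SAM's imageProjection.cpp):

#include <cstdint>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>

// XYZIRT point: position, intensity, ring index, and per-point time offset.
struct VelodynePointXYZIRT
{
    PCL_ADD_POINT4D
    PCL_ADD_INTENSITY;
    uint16_t ring;   // which of the lidar's channels the point came from
    float time;      // time offset within the scan, used for deskewing
    EIGEN_MAKE_ALIGNED_OPERATOR_NEW
} EIGEN_ALIGN16;

POINT_CLOUD_REGISTER_POINT_STRUCT(VelodynePointXYZIRT,
    (float, x, x)(float, y, y)(float, z, z)(float, intensity, intensity)
    (uint16_t, ring, ring)(float, time, time))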

2) Consistency of the coordinate system

(This part is adapted from another author's article: Solving the problems encountered when running LIO-SAM on your own bags, Ferryman's Blog, CSDN; credit to the original author.) LIO-SAM has requirements on the frame_id, and you can find and modify the corresponding source code. If you do not want to change the source, you have to fix things externally. This involves the internal coordinate frame mapping; if the frame_ids differ, errors can make the mapping fail. The frame_ids in the dataset provided by the original work correspond to:

// Lidar data
/points_raw ---------- frame_id: "velodyne"
// IMU data
/imu_raw ------------- frame_id: "imu_link"
// GPS data
/gps/fix ------------- frame_id: "navsat_link"

View the frame_id of your own bag:

# While playing the bag, check a topic's frame_id with:
rostopic echo /topic_name | grep frame_id

If your frame_id differs from the original, it needs to be changed. One way is to download the bag_tools package from the ROS wiki, which contains many tools for manipulating bags; the official tutorial is available online, and you can find and install it yourself. Here is the conversion command for reference:

rosrun bag_tools change_frame_id.py -t /topic_to_change -f new_frame_id -i old.bag -o new.bag
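If bag_tools is not an option, the same rewrite is easy with the rosbag C++ API. A hedged sketch (the topic and frame names are examples; adjust them to your bag):

#include <rosbag/bag.h>
#include <rosbag/view.h>
#include <sensor_msgs/Imu.h>

// Copy old.bag to new.bag, rewriting the frame_id of the IMU topic only.
int main()
{
    rosbag::Bag in("old.bag", rosbag::bagmode::Read);
    rosbag::Bag out("new.bag", rosbag::bagmode::Write);
    for (const rosbag::MessageInstance& m : rosbag::View(in)) {
        if (m.getTopic() == "/imu/data") {
            sensor_msgs::Imu::Ptr imu = m.instantiate<sensor_msgs::Imu>();
            imu->header.frame_id = "imu_link";        // the frame_id LIO-SAM expects
            out.write(m.getTopic(), m.getTime(), imu);
        } else {
            out.write(m.getTopic(), m.getTime(), m);  // copy every other topic as-is
        }
    }
    return 0;
}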

4. IMU frequency setting (important)

The IMU frequency should preferably be above 100 Hz. When the author used less than 100 Hz, the map would swing along with the car (strictly speaking, with the IMU) during turns when playing back bags. At first I did not notice the IMU frequency at all and thought the extrinsics were not set accurately enough. In fact, if you have read the original paper, you will have noticed that the author simply sets the extrinsics (translation and rotation matrices) to the identity, as shown in the blue circled part of the figure below:

(Figure: extrinsics set to identity in the original paper)

But the official setup photo below shows that the translation between the VLP-16 and the IMU cannot really be zero. So running LIO-SAM clearly does not require particularly accurate extrinsics; if you do not believe it, calibrate with a tool and check how close the result is to the identity matrix.

(Figure: the official VLP-16 + IMU hardware setup)

When the author started out, I insisted on setting the extrinsics and the IMU intrinsics precisely. Only when the map still drifted even though my extrinsics were very accurate did I realize something else was wrong. Special thanks to the two experienced authors for their suggestions.

5. Placement of the IMU and VLP-16 hardware (important)

  1. Mount the IMU and the lidar as close together as possible, so that setting the translation vector to zero causes no problems.

  2. The coordinate frames of the IMU and the lidar should be exactly the same; if that is not possible, make sure they differ by exactly 90° or 180°.

    A digression: when I installed the IMU, I did not figure out its coordinate frame at first and made a mess of it. After asking the vendor, I learned that the IMU's physical coordinate frame had already been remapped in the ROS output, while I was still configuring according to the physical frame, so the resulting map was absurd. Make sure you know the actual orientation of the IMU data you receive and of the lidar frame. Here is a photo of the author's installation:

(Figure: the author's IMU and lidar installation)

The author's IMU and lidar frames are mounted exactly aligned (Z axis pointing straight up) and very close together, so the author later simply set the translation to zero and the rotation to the identity matrix (the easiest case). The extrinsics are the transform between the IMU and the lidar; solving this problem at the source, at installation time, makes everything much more convenient.

In the LIO-SAM parameter file, one of the IMU intrinsic parameters is the gravitational acceleration of the IMU, and its sign indicates direction:

if the Z axes of the IMU and the VLP-16 both point up, set it positive; otherwise set it negative.

Also pay attention to matching the baud rate and the IMU frequency: at 115200 baud the frequency should not exceed 50 Hz, while 921600 baud supports 100 Hz and above.

6. LiDAR and IMU external parameter calibration

You can jump straight to this link: Solving the problems encountered when running LIO-SAM on your own bags (Ferryman's Blog, CSDN); that blogger wrote it up in great detail.

There are many open source calibration tools; the author calibrated successfully with lidar_align.

As discussed above, particularly precise extrinsics are not needed. But if you see warnings like "large bias" or "large velocity" when the program starts, the extrinsic setting is wrong, i.e. the transform is wrong; you can use this calibration to check the transform.

(1) Install lidar_align, the joint lidar-IMU extrinsic calibration program

  1. Download lidar_align

cd <your_workspace>/src
git clone https://github.com/ethz-asl/lidar_align.git
cd ..          # build from the workspace root
catkin_make

It is recommended to download the corrected code: lidar_align_wwtx

Create your own ROS workspace, put the source code in src, and run catkin_make.

  • The first compilation will report an error because the NLOPT library is missing:

    Could not find NLOPTConfig.cmake when compiling

    Solution:

    sudo apt-get install libnlopt-dev
    

    Find the NLOPTConfig.cmake file and move it to lidar_align/src/ (it may then compile directly). If the same error is still reported, add these lines to CMakeLists.txt:

    list(APPEND CMAKE_FIND_ROOT_PATH ${PROJECT_SOURCE_DIR})
    set (CMAKE_PREFIX_PATH "/usr/local/lib/cmake/nlopt")
    

    As shown in the picture below (I am away from home and cannot reproduce it, so here is a screenshot from another post: Jointly calibrating lidar and IMU and running LIO-SAM, Luokei_Jianxin's Blog, CSDN; credit to the original author):

(Figure: where to place NLOPTConfig.cmake)

Then do a second compilation

  • The second compilation will still report an error (an lz4 conflict); run the following commands in order to resolve it:

    sudo mv /usr/include/flann/ext/lz4.h /usr/include/flann/ext/lz4.h.bak

    sudo mv /usr/include/flann/ext/lz4hc.h /usr/include/flann/ext/lz4hc.h.bak

    sudo ln -s /usr/include/lz4.h /usr/include/flann/ext/lz4.h

    sudo ln -s /usr/include/lz4hc.h /usr/include/flann/ext/lz4hc.h

    catkin_make
    
    

    The third compilation passed

(2) Rewrite the IMU interface

Original link: https://blog.csdn.net/weixin_42141088/article/details/118000544 (credit to the original author; will remove upon request)

This tool was originally written to calibrate a lidar against an odometry source, not against an IMU, so the IMU interface below replaces the odometer interface. That makes the tool somewhat inaccurate for this sensor pair, but inaccurate does not mean unusable. I could not track down the author who rewrote the interface; credit to them (will remove upon request). Here is how to do it:
Open the loader.cpp file.

Find the following odom section; it can be commented out or deleted:
/*  types.push_back(std::string("geometry_msgs/TransformStamped"));
  rosbag::View view(bag, rosbag::TypeQuery(types));

  size_t tform_num = 0;
  for (const rosbag::MessageInstance& m : view) {
    std::cout << " Loading transform: \e[1m" << tform_num++
              << "\e[0m from ros bag" << '\r' << std::flush;

    geometry_msgs::TransformStamped transform_msg =
        *(m.instantiate<geometry_msgs::TransformStamped>());

    Timestamp stamp = transform_msg.header.stamp.sec * 1000000ll +
                      transform_msg.header.stamp.nsec / 1000ll;

    Transform T(Transform::Translation(transform_msg.transform.translation.x,
                                       transform_msg.transform.translation.y,
                                       transform_msg.transform.translation.z),
                Transform::Rotation(transform_msg.transform.rotation.w,
                                    transform_msg.transform.rotation.x,
                                    transform_msg.transform.rotation.y,
                                    transform_msg.transform.rotation.z));
    odom->addTransformData(stamp, T);
  }
*/

Replace the section above with:

	types.push_back(std::string("sensor_msgs/Imu"));
	rosbag::View view(bag, rosbag::TypeQuery(types));
	size_t imu_num = 0;
	double shiftX=0,shiftY=0,shiftZ=0,velX=0,velY=0,velZ=0;
	ros::Time time;
	double timeDiff,lastShiftX,lastShiftY,lastShiftZ;
	for (const rosbag::MessageInstance& m : view){
	  std::cout <<"Loading imu: \e[1m"<< imu_num++<<"\e[0m from ros bag"<<'\r'<< std::flush;

	  sensor_msgs::Imu imu=*(m.instantiate<sensor_msgs::Imu>());

	  Timestamp stamp = imu.header.stamp.sec * 1000000ll +imu.header.stamp.nsec / 1000ll;
	  if(imu_num==1){
		 time=imu.header.stamp;
			 Transform T(Transform::Translation(0,0,0),Transform::Rotation(1,0,0,0));
		 odom->addTransformData(stamp, T);
	 }
	 else{
		 timeDiff=(imu.header.stamp-time).toSec();
		 time=imu.header.stamp;
		 // integrate acceleration into velocity (gravity removed on Z)
		 velX=velX+imu.linear_acceleration.x*timeDiff;
		 velY=velY+imu.linear_acceleration.y*timeDiff;   // fixed: the original had velY=velX+... by mistake
		 velZ=velZ+(imu.linear_acceleration.z-9.801)*timeDiff;

		 lastShiftX=shiftX;
		 lastShiftY=shiftY;
		 lastShiftZ=shiftZ;
		 shiftX=lastShiftX+velX*timeDiff+imu.linear_acceleration.x*timeDiff*timeDiff/2;
		 shiftY=lastShiftY+velY*timeDiff+imu.linear_acceleration.y*timeDiff*timeDiff/2;
		 shiftZ=lastShiftZ+velZ*timeDiff+(imu.linear_acceleration.z-9.801)*timeDiff*timeDiff/2;

		 Transform T(Transform::Translation(shiftX,shiftY,shiftZ),
				Transform::Rotation(imu.orientation.w,
						 imu.orientation.x,
						 imu.orientation.y,
						 imu.orientation.z));
		 odom->addTransformData(stamp, T);
	 }
	}

And add the header include at the top of the file:

#include <sensor_msgs/Imu.h>

Compile a fourth time.

Before that, you need to install some other libraries, such as Eigen, METIS, GTSAM, and Ceres. In short, install whatever is reported missing, and pay close attention here: the versions must match.

(3) Record data and calibrate

Record the lidar and IMU topics (/velodyne_points and /imu/data) simultaneously for about two minutes. The bag should preferably stay under 2 GB, otherwise unknown errors may occur and calibration can fail. The two minutes of data should contain as much rotation and translation as possible (drive straight for a while, then make a couple of turns); the calibration error is then said to converge to a few hundred or less, which is already quite good. The calibration process is fairly long, so wait patiently. It is also best to have no transparent glass or moving obstacles nearby.

The following commands record only the /velodyne_points and /imu/data topics:

cd <directory_to_store_the_bag>   # if omitted, the bag is saved under ~/
rosbag record -O lidar_imu.bag /velodyne_points /imu/data

(4) Modify the launch file

Open the lidar_align.launch file and fill in the path to the two-minute bag, as shown below:

(Figure: bag path setting in lidar_align.launch)

Start the calibration:

cd lidar_align
source devel/setup.bash 
roslaunch lidar_align lidar_align.launch

The calibration takes about an hour and iterates 200 to 300 times; wait patiently.

After the calibration completes, the result is displayed in the terminal and the corresponding calibration file is generated under the result folder.

7. Download imu_utils

This step calibrates the IMU intrinsics. If the vendor provides the intrinsics, you can skip it, but make sure the intrinsics are correct. The step requires a two-hour recording; the author simply used the intrinsics provided by the vendor.

For the specific steps, refer to this tutorial: install and use imu_utils to calibrate the IMU, then use the result for the joint camera-IMU calibration in kalibr.

Two hours of IMU data must be recorded here. Attention! The IMU must stay completely static while recording: do not shake it, and record only the IMU topic. Do not worry about disk space; two hours take only a little over 300 MB. The bag is best placed in your imu_utils workspace (parallel to src). Also, it is best to start recording about 10 minutes after powering on the IMU, since recording immediately after power-on gives large errors.

Playback command for the calibration (note: this plays the recorded bag back at 200x speed while the imu_utils calibration node runs):

rosbag play -r 200 imu_utils/imu.bag

Note that while the calibration runs, the terminal does not print calibration information; do not worry, it is actually working and will finish before long.

A problem worth recording here:

(Figure: compile error at the METIS/Eigen interface)

error: cannot convert 'int*' to 'idx_t*' {aka 'long int*'} for argument '1' to 'int METIS_NodeND(idx_t*, idx_t*, idx_t*, idx_t*, idx_t*, idx_t*, idx_t*)'

output_error = METIS_NodeND(&m, m_indexPtr.data(), m_innerIndices.data(), NULL, NULL, perm.data(), iperm.data());

In the end this turned out to be a version mismatch between Ceres and Eigen (it delayed me for a long time!); downgrade the versions appropriately.

Here are the versions that worked for the author:

(Figure: library versions used by the author)

8. Modify the parameter file

Modify the LIO-SAM parameter file according to the extrinsics and intrinsics calibrated above:

The detailed modifications will not be repeated here; see the tutorial at https://blog.csdn.net/weixin_42141088/article/details/118000544, which is quite detailed.

Here is the author's own parameter file, for reference only:

lio_sam:

  # Topics
  pointCloudTopic: "velodyne_points"          # Point cloud data (replace with your own topic)
  imuTopic: "imu_data"                        # IMU data (replace with your own topic)
  odomTopic: "odometry/imu"                   # IMU pre-integration odometry, same frequency as IMU
  gpsTopic: "odometry/gpsz"                   # GPS odometry topic from navsat (no change needed without GPS), see module_navsat.launch file

  # Frames
  lidarFrame: "base_link"
  baselinkFrame: "base_link"
  odometryFrame: "odom"
  mapFrame: "map"

  # GPS Settings
  useImuHeadingInitialization: false          # if using GPS data, set to "true"
  useGpsElevation: false                      # if GPS elevation is bad, set to "false"
  gpsCovThreshold: 2.0                        # m^2, threshold for using GPS data
  poseCovThreshold: 25.0                      # m^2, threshold for using GPS data
  
  # Export settings
  savePCD: true                               # https://github.com/TixiaoShan/LIO-SAM/issues/3
  savePCDDirectory: "/rosbag/pcd"           # in your home folder, starts and ends with "/". Warning: the code deletes "LOAM" folder then recreates it. See "mapOptimization" for implementation

  # Sensor Settings
  sensor: velodyne                            # lidar sensor type, 'velodyne' or 'ouster' or 'livox'
  N_SCAN: 16                                  # number of lidar channel (i.e., Velodyne/Ouster: 16, 32, 64, 128, Livox Horizon: 6)
  Horizon_SCAN: 1800                          # lidar horizontal resolution (Velodyne:1800, Ouster:512,1024,2048, Livox Horizon: 4000)
  downsampleRate: 1                           # default: 1. Downsample your data if too many points. i.e., 16 = 64 / 4, 16 = 16 / 1
  lidarMinRange: 1.0                          # default: 1.0, minimum lidar range to be used
  lidarMaxRange: 1000.0                       # default: 1000.0, maximum lidar range to be used

  # IMU Settings (IMU intrinsics; the author used the values provided by the vendor)
  imuAccNoise: 1.0909715156015328e-02 
  imuGyrNoise: 8.0226069504443656e-04
  imuAccBiasN: 4.0347767978459928e-04
  imuGyrBiasN: 4.4153147263889589e-05
  imuGravity: 9.80511
  imuRPYWeight: 0.01

  # Extrinsics: T_lb (lidar -> imu)
  extrinsicTrans: [0.0, 0.0, 0.0]   # translation: set to zero here because the IMU and lidar are mounted very close together (explained above)
  extrinsicRot: [1, 0, 0,           # rotation: identity here; compare it with the extrinsics calibrated above
                  0, 1, 0,
                  0, 0, 1]
  extrinsicRPY: [1, 0, 0,
                  0, 1, 0,
                  0, 0, 1]
  # LOAM feature threshold
  edgeThreshold: 1.0
  surfThreshold: 0.1
  edgeFeatureMinValidNum: 10
  surfFeatureMinValidNum: 100


Only the main modifications are listed here; adjust the other parameters according to the comments and your own needs.
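For intuition about what these extrinsics do: LIO-SAM rotates every incoming IMU sample into the lidar frame before using it. A simplified sketch of that conversion (cf. the imuConverter() idea in the project's utility.h; abbreviated, with our own names):

#include <Eigen/Geometry>
#include <sensor_msgs/Imu.h>

// Rotate an IMU sample into the lidar frame using the configured extrinsics.
sensor_msgs::Imu convertImu(const sensor_msgs::Imu& in,
                            const Eigen::Matrix3d& extRot,      // from extrinsicRot
                            const Eigen::Quaterniond& extQRPY)  // from extrinsicRPY
{
    sensor_msgs::Imu out = in;
    Eigen::Vector3d acc(in.linear_acceleration.x, in.linear_acceleration.y,
                        in.linear_acceleration.z);
    acc = extRot * acc;                                // acceleration into the lidar frame
    out.linear_acceleration.x = acc.x();
    out.linear_acceleration.y = acc.y();
    out.linear_acceleration.z = acc.z();

    Eigen::Vector3d gyr(in.angular_velocity.x, in.angular_velocity.y,
                        in.angular_velocity.z);
    gyr = extRot * gyr;                                // angular velocity likewise
    out.angular_velocity.x = gyr.x();
    out.angular_velocity.y = gyr.y();
    out.angular_velocity.z = gyr.z();

    Eigen::Quaterniond q(in.orientation.w, in.orientation.x,
                         in.orientation.y, in.orientation.z);
    q = q * extQRPY;                                   // orientation adjusted by the RPY extrinsic
    out.orientation.w = q.w();
    out.orientation.x = q.x();
    out.orientation.y = q.y();
    out.orientation.z = q.z();
    return out;
}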

3. Implementation

With the preparation above complete, you can now run everything and look at the result.

  1. Bring up the base drivers

    roslaunch xxx_bringup xxx.launch   # varies per robot; this is a common way to start the lidar, IMU, and chassis
    
  2. Record a bag

    # run this in the directory where the bag should be saved
    rosbag record -O xxx.bag /velodyne_points /imu/data
    
  3. Control the movement of the car

  4. After the recording ends, run LIO-SAM

    roslaunch lio_sam run.launch
    cd <path_to_your_bag>
    rosbag play xxx.bag
    

    You can now watch the result in rviz. After the bag finishes playing, the PCD files are saved automatically.

5. Finally, a pcd folder appears under the configured save path (savePCDDirectory, here ~/rosbag/pcd), containing the saved point cloud map.

# install pcl-tools (skip if already installed)
sudo apt-get install pcl-tools
# open a terminal in the pcd directory, then view a map with:
pcl_viewer xxx.pcd

(Figure: a saved PCD map opened in pcl_viewer)

The picture above is one of my PCD maps.

4. Reference links

Solving the problems encountered when running LIO-SAM on your own bags (Ferryman's Blog, CSDN)

LIO-SAM: configuring the environment, installing and testing, adapting to self-collected datasets (CSDN)


Source: blog.csdn.net/qq_52383168/article/details/126300616