Run R3LIVE from scratch with your own data

1. Camera intrinsic calibration

The camera is a 4 mm wide-angle camera, and its intrinsic parameters are calibrated with the most common method, a checkerboard. First, install the calibration package that ships with ROS:

sudo apt install ros-melodic-camera-calibration

Calibrate after starting the camera with usb_cam.
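A minimal way to bring the camera up with usb_cam (the device path and pixel format here are assumptions; adjust them to your camera):

rosrun usb_cam usb_cam_node _video_device:=/dev/video0 _pixel_format:=yuyv

This publishes /usb_cam/image_raw, which the calibrator subscribes to. Then run the calibrator: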

rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.06 image:=/usb_cam/image_raw camera:=/camera --no-service-check

--size: the number of inner corners of the checkerboard, columns x rows

--square: the side length of one checkerboard square, in meters

image:=/... : the image topic to subscribe to

camera:=/camera: the camera name

After calibration, an intrinsic parameter file is generated; Intrinsic is the camera matrix and Distortion contains the distortion coefficients.

Intrinsic:
[430.229,0,306.853],[0,437.83,248.373],[0,0,1]

Distortion:
[0.156215,-0.156542,-0.004833,-0.001379]
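To double-check these numbers, you can undistort a saved frame with them. A minimal sketch with OpenCV and numpy (the image file name is a placeholder):

import cv2
import numpy as np

# Camera matrix and distortion coefficients from the calibration above
K = np.array([[430.229, 0.0, 306.853],
              [0.0, 437.83, 248.373],
              [0.0, 0.0, 1.0]])
dist = np.array([0.156215, -0.156542, -0.004833, -0.001379])  # k1, k2, p1, p2

img = cv2.imread("frame.png")              # any frame saved from /usb_cam/image_raw
undistorted = cv2.undistort(img, K, dist)  # straight edges should come out straight
cv2.imwrite("frame_undistorted.png", undistorted)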

2. Camera-to-LiDAR extrinsic calibration

I use an automatic + manual approach for the extrinsic calibration: first run an algorithm for automatic registration, then correct the result manually.

The automatic calibration uses the livox_camera_calib algorithm:

GitHub - hku-mars/livox_camera_calib: This repository is used for automatic calibration between high resolution LiDAR and camera in targetless scenes.

This method works well, but it depends heavily on the scene. In my tests, a scene with large depth variation and plenty of planar surfaces, such as a corridor, works best for data collection.

The automatic calibration is particularly sensitive to the scene the data is collected in; a good scene gives a good calibration result. I then refine the result manually with SensorsCalibration, a library that covers calibration between most sensor pairs:

https://github.com/PJLab-ADG/SensorsCalibration

The final result of the calibration is the extrinsic parameter matrix:

Extrinsic:
[-0.00650189,-0.999724,-0.0225309,0.0946092],[0.0035881,0.0225079,-0.999741,0.162664],[0.999972,-0.00658104,0.00344083,-0.00686066],[0,0,0,1]
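A quick sanity check is to project a few LiDAR points into the image with the calibrated matrices. A minimal sketch in numpy, assuming the 4x4 matrix maps LiDAR coordinates into the camera frame and using the intrinsics from section 1 (distortion ignored for simplicity):

import numpy as np

# Intrinsics from section 1
K = np.array([[430.229, 0.0, 306.853],
              [0.0, 437.83, 248.373],
              [0.0, 0.0, 1.0]])
# 4x4 extrinsic from the calibration above (assumed camera <- LiDAR)
T = np.array([[-0.00650189, -0.999724,   -0.0225309,   0.0946092],
              [ 0.0035881,   0.0225079,  -0.999741,    0.162664],
              [ 0.999972,   -0.00658104,  0.00344083, -0.00686066],
              [ 0.0,         0.0,         0.0,          1.0]])

def project(p_lidar):
    # Transform a LiDAR point (x, y, z) into the camera frame, then apply the pinhole model
    p_cam = T[:3, :3] @ np.asarray(p_lidar) + T[:3, 3]
    u, v, w = K @ p_cam
    return u / w, v / w

# A point a few meters in front of the LiDAR should land inside the 640x480 image
print(project([5.0, 0.0, 0.0]))

If points that are clearly in the camera's field of view project far outside the image, the extrinsic (or its assumed direction) is wrong.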

3. Record rosbag

After that, the data can be collected. One thing to note is the time synchronization between the sensors: by default, LIVOX timestamps are counted from LiDAR startup rather than taken from the ROS system time, so a bag recorded directly will not run in R3LIVE. Two solutions are given below.

  • Synchronize the network interface that the LiDAR is connected to with the system time:

Use ifconfig to find the network interface the LiDAR is connected to, for example eno1.

If the following flags appear, hardware time synchronization can be used:
 `hardware-raw-clock (SOF_TIMESTAMPING_RAW_HARDWARE)`
 `hardware-transmit (SOF_TIMESTAMPING_TX_HARDWARE)`
 `hardware-receive (SOF_TIMESTAMPING_RX_HARDWARE)`
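These flags come from the interface's hardware timestamping capabilities, which can be listed with ethtool (eno1 is the interface found above):

ethtool -T eno1

If they are present, install and run ptpd as a PTP master on that interface: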

sudo apt install ptpd
sudo ptpd -M -i eno1 -C 

When ptpd reports that it has become the best master, confirm with Wireshark:

Open Wireshark, capture on eno1, and filter for ptp; Sync messages should appear.

Then restart the LiDAR; its timestamps will now be synchronized with the local machine.

  • Directly use the LIVOX driver provided by the author of R3LIVE

GitHub - ziv-lin/livox_ros_driver_for_R2LIVE

The author changed the timestamp handling in the source code, so data can be recorded directly with this driver. Before recording, you need to enable the time synchronization option in the livox_lidar_config.json file and choose an interface name; here I use the camera's interface.
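For reference, in the stock livox_ros_driver this switch lives in a timesync_config block of livox_lidar_config.json; the key names below are from that driver and the device name is only an example, so check your own copy of the file (the R2LIVE fork may differ slightly):

"timesync_config": {
    "enable_timesync": true,
    "device_name": "/dev/ttyUSB0",
    "comm_device_type": 0,
    "baudrate_index": 2,
    "parity_index": 0
}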

When recording data, you can record the image topic in compressed form, e.g. /usb_cam/image_raw/compressed, so that the rosbag is smaller.
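For example (the topic names here assume the default livox_ros_driver and usb_cam topics; change them to match your setup):

rosbag record -O r3live_data.bag /livox/lidar /livox/imu /usb_cam/image_raw/compressed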

4. Install and compile R3LIVE

https://github.com/hku-mars/r3live

Just follow the README in the official repository; the author documented it in detail.

5. Running results

You mainly need to modify the yaml file, filling in the camera intrinsics, the extrinsics, and the distortion coefficients obtained above. Note that the 3x3 rotation matrix camera_ext_R is the transpose of the upper-left 3x3 block of the 4x4 transformation matrix, while camera_ext_t is the translation column used as-is. Then change the topics in the launch file to your own.
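A minimal sketch of that conversion in numpy, using the 4x4 extrinsic from section 2:

import numpy as np

# 4x4 extrinsic matrix from the camera-LiDAR calibration (section 2)
T = np.array([[-0.00650189, -0.999724,   -0.0225309,   0.0946092],
              [ 0.0035881,   0.0225079,  -0.999741,    0.162664],
              [ 0.999972,   -0.00658104,  0.00344083, -0.00686066],
              [ 0.0,         0.0,         0.0,          1.0]])

camera_ext_R = T[:3, :3].T   # transpose of the upper-left 3x3 rotation block
camera_ext_t = T[:3, 3]      # translation column, used as-is

print(camera_ext_R)
print(camera_ext_t)

The r3live_vio section of my yaml then looks like this: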

r3live_vio:
   image_width: 640
   image_height: 480
   camera_intrinsic:
      [460.799008, 0.000000, 280.624145,
       0.000000, 459.386391, 167.344099,
       0.0, 0.0, 1.0]
   camera_dist_coeffs: [-0.107372, 0.087416, 0.001676, -0.001356, 0.000000]  # k1, k2, p1, p2, k3
   # Fine extrinsic value, from camera-LiDAR calibration.
   # Transposed rotation matrix
   camera_ext_R:
      [-0.00650189, 0.0035881, 0.999972,
       -0.999724, 0.0225079, -0.00658104,
       -0.0225309, -0.999741, 0.00344083]
   # camera_ext_t: [0.050166, 0.0474116, -0.0312415]
   camera_ext_t: [0.0946092, 0.162664, -0.00686066]
And the modified launch file:

<launch>
    <!-- Subscribed topics -->
    <param name="LiDAR_pointcloud_topic" type="string" value= "/laser_cloud_flat" />
    <param name="IMU_topic" type="string" value= "/livox/imu" />
    <param name="Image_topic" type="string" value= "/usb_cam/image_raw" />
    <param name="r3live_common/map_output_dir" type="string" value="$(env HOME)/r3live_output" />
    <rosparam command="load" file="$(find r3live)/../config/my_data_r3live_config.yaml" />

    <node pkg="r3live" type="r3live_LiDAR_front_end" name="r3live_LiDAR_front_end"  output="screen" required="true"/>
    <node pkg="r3live" type="r3live_mapping" name="r3live_mapping" output="screen" required="true" />
    
    <arg name="rviz" default="1" />
    <group if="$(arg rviz)">
        <node name="rvizvisualisation" pkg="rviz" type="rviz" output="log" args="-d $(find r3live)/../config/rviz/r3live_rviz_config.rviz" />
    </group>
</launch>

Finally, launch R3LIVE and build the map:

roslaunch r3live my_data_r3live_bag.launch 
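Then play the bag back in another terminal. If only the compressed image topic was recorded, it can be republished as a raw topic so that it matches Image_topic in the launch file (the bag name follows the recording example above):

rosbag play r3live_data.bag
rosrun image_transport republish compressed in:=/usb_cam/image_raw raw out:=/usb_cam/image_raw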

 
