Cartographer mapping and localization on a robot

The Cartographer framework was open-sourced by Google in 2016. It can consume data from 2D lidar, 3D lidar, odometry, and IMU sensors, and output 2D or 3D maps. One distinctive feature is that it can update the map incrementally: when Cartographer runs in localization mode, it can incrementally update an existing map while localizing.

1 Cartographer installation

There are many tutorials on the Internet for installing Cartographer [2]. These tutorials, as well as the one on the official website, generally work without problems. The only issue is that cloning directly from GitHub can be very slow from within China; a domestic mirror of the code can be used instead.
Download the source code following the official website [1]. When downloading, you need to change the ceres-solver address to https://github.com/ceres-solver/ceres-solver.git (edit src/.rosinstall, e.g. with vim).
Alternatively, download from my repository and extract the three archives into the src directory.
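The .rosinstall edit above can also be scripted. The sketch below (plain Python; the exact ceres-solver URI in your checkout may differ, so it matches any ceres-solver entry) rewrites the uri field in place:

```python
# Illustrative sketch: point the ceres-solver entry of src/.rosinstall
# at the GitHub mirror. Run from the catkin workspace root.
import re
from pathlib import Path

def fix_ceres_uri(rosinstall_text: str) -> str:
    """Replace any ceres-solver uri with the GitHub mirror."""
    return re.sub(r'uri: \S*ceres-solver\S*',
                  'uri: https://github.com/ceres-solver/ceres-solver.git',
                  rosinstall_text)

path = Path('src/.rosinstall')
if path.exists():
    path.write_text(fix_ceres_uri(path.read_text()))
```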

2 2D mapping test

Here we first run the demo from the official website, and then test on our own robot. We use the dataset from the Pure Localization section of the official website.

2.1 Start 2D mapping demo

roslaunch cartographer_ros demo_backpack_2d.launch bag_filename:=/media/crp/0E3C06880E3C0688/b2-2016-04-05-14-44-52.bag

bag_filename specifies the ROS bag data file.

Note: here we replace the offline_backpack_2d.launch file used on the official website with demo_backpack_2d.launch, because we found that offline_backpack_2d.launch does not start the map-saving service.

Cartographer dataset: 2D mapping (8x speed, video)

First, save the map as a .pbstream file:

rosservice call /write_state ~/cartograph_test.pbstream

Use the conversion node that ships with Cartographer to convert the .pbstream file into pgm and yaml files:

rosrun cartographer_ros cartographer_pbstream_to_ros_map -pbstream_filename /home/crp/cartograph_test.pbstream -map_filestem /home/crp/cartograph_test

After running the node, you can see the generated pgm and yaml files in the folder. Note, however, that if you only need Cartographer for localization, there is no need to convert to pgm format.
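For reference, the generated .yaml file follows the standard ROS map_server metadata format; it looks roughly like this (the resolution, origin, and threshold values here are illustrative — the actual numbers depend on your map):

```yaml
image: cartograph_test.pgm
resolution: 0.05
origin: [-10.0, -10.0, 0.0]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
```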

2.2 Start 2D localization demo

Next, we use the existing map to localize:

roslaunch cartographer_ros demo_backpack_2d_localization.launch load_state_filename:=/home/crp/cartograph_test.pbstream bag_filename:=/media/crp/0E3C06880E3C0688/b2-2016-04-27-12-31-41.bag

Here cartograph_test.pbstream is the map file generated in the previous step, and bag_filename specifies the lidar data currently being played in.

Cartographer dataset: 2D localization with incremental map update (video)

The localization result is published as a TF transform.
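To make the TF output concrete, here is a minimal sketch (plain Python with made-up numbers, no ROS required) of how the map->odom transform published by Cartographer composes with the odom->base transform from odometry to give the robot's pose in the map frame:

```python
# Illustrative 2D transform composition; all numbers are made up.
import math

def compose_2d(t_ab, t_bc):
    """Compose two 2D rigid transforms given as (x, y, yaw)."""
    xa, ya, tha = t_ab
    xb, yb, thb = t_bc
    return (xa + xb * math.cos(tha) - yb * math.sin(tha),
            ya + xb * math.sin(tha) + yb * math.cos(tha),
            tha + thb)

map_to_odom = (1.0, 0.5, math.pi / 2)   # e.g. published by cartographer_node
odom_to_base = (2.0, 0.0, 0.0)          # e.g. published by wheel odometry
pose_in_map = compose_2d(map_to_odom, odom_to_base)
print(pose_in_map)  # approximately (1.0, 2.5, 1.5708)
```

In a live system the TF tree does this composition for you; a direct lookup of the map->base transform (e.g. `rosrun tf tf_echo map base_footprint`) returns the robot pose.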

2.3 Mapping on the kobuki robot

Here we configure the parameter file by referring to the demos. The main thing to pay attention to is the frame configuration in the lua file. From my own experiments:
a) lidar only: tracking_frame = "laser", published_frame = "laser"
b) odometer + lidar: tracking_frame = "base_link", published_frame = "odom"
c) IMU + lidar + odometer: tracking_frame = "imu_link", published_frame = "odom"

The remaining parameters can be configured by referring to the demos. The launch file (kobuki_robot.launch) and lua file (kobuki_robot.lua) I used are configured as follows:
kobuki_robot.launch

<launch>
  
  <arg name="urdf_file" default="$(find xacro)/xacro --inorder '$(find kobuki_description)/urdf/kobuki_standalone.urdf.xacro'"/>
  <param name="robot_description" command="$(arg urdf_file)"/>

  <node pkg="robot_state_publisher" type="robot_state_publisher" name="robot_state_publisher" output="screen">
    <param name="publish_frequency" type="double" value="5.0"/>
  </node>

  <node name="joint_state_publisher" pkg="joint_state_publisher" type="joint_state_publisher">
    <param name="use_gui" value="true"/>
  </node>


  <node name="cartographer_node" pkg="cartographer_ros"
      type="cartographer_node" args="
          -configuration_directory $(find cartographer_ros)/configuration_files
          -configuration_basename kobuki_robot.lua"
      output="screen">
    <remap from="scan" to="/scan" />
    <remap from="odom" to="/odom" />
  </node>
 

  <node name="cartographer_occupancy_grid_node" pkg="cartographer_ros"
      type="cartographer_occupancy_grid_node" args="-resolution 0.05" />

  <node name="rviz" pkg="rviz" type="rviz" required="true"
      args="-d $(find cartographer_ros)/configuration_files/demo_2d.rviz" />
 
</launch>

kobuki_robot.lua

include "map_builder.lua"
include "trajectory_builder.lua"

options = {
  map_builder = MAP_BUILDER,
  trajectory_builder = TRAJECTORY_BUILDER,
  map_frame = "map",
  tracking_frame = "base_footprint",
  published_frame = "odom",
  odom_frame = "odom",
  provide_odom_frame = false, -- whether Cartographer publishes its own odom frame
  publish_frame_projected_to_2d = false,
  use_odometry = true, -- use wheel odometry
  use_nav_sat = false,
  use_landmarks = false,

  num_laser_scans = 1,
  num_multi_echo_laser_scans = 0,
  num_subdivisions_per_laser_scan = 1,
  num_point_clouds = 0,
  lookup_transform_timeout_sec = 0.2,
  submap_publish_period_sec = 0.3,
  pose_publish_period_sec = 5e-3,
  trajectory_publish_period_sec = 30e-3,
  rangefinder_sampling_ratio = 1.,
  odometry_sampling_ratio = 1.,
  fixed_frame_pose_sampling_ratio = 1.,
  imu_sampling_ratio = 1.,
  landmarks_sampling_ratio = 1.,
}

MAP_BUILDER.use_trajectory_builder_2d = true

TRAJECTORY_BUILDER_2D.submaps.num_range_data = 35
TRAJECTORY_BUILDER_2D.min_range = 0.3
TRAJECTORY_BUILDER_2D.max_range = 8.
TRAJECTORY_BUILDER_2D.missing_data_ray_length = 1.
TRAJECTORY_BUILDER_2D.use_imu_data = false
TRAJECTORY_BUILDER_2D.imu_gravity_time_constant = 9.8 
TRAJECTORY_BUILDER_2D.use_online_correlative_scan_matching = true
TRAJECTORY_BUILDER_2D.real_time_correlative_scan_matcher.linear_search_window = 0.1
TRAJECTORY_BUILDER_2D.real_time_correlative_scan_matcher.translation_delta_cost_weight = 10.
TRAJECTORY_BUILDER_2D.real_time_correlative_scan_matcher.rotation_delta_cost_weight = 1e-1
POSE_GRAPH.optimization_problem.huber_scale = 1e2
POSE_GRAPH.optimize_every_n_nodes = 35
POSE_GRAPH.constraint_builder.min_score = 0.65

return options

Below is a video recorded in our laboratory:

kobuki Cartographer mapping (video)

2.4 Localization on an existing map with kobuki

Next, we use the map we just built to localize, while incrementally updating the map at the same time (note the area in the upper right corner of the map).
kobuki_localization.launch

<launch>

  <arg name="load_state_filename"/>
  <arg name="urdf_file" default="$(find xacro)/xacro --inorder '$(find kobuki_description)/urdf/kobuki_standalone.urdf.xacro'"/>
  <param name="robot_description" command="$(arg urdf_file)"/>

  <node pkg="robot_state_publisher" type="robot_state_publisher" name="robot_state_publisher" output="screen">
    <param name="publish_frequency" type="double" value="5.0"/>
  </node>

  <node name="joint_state_publisher" pkg="joint_state_publisher" type="joint_state_publisher">
    <param name="use_gui" value="true"/>
  </node>


  <node name="cartographer_node" pkg="cartographer_ros"
      type="cartographer_node" args="
          -configuration_directory $(find cartographer_ros)/configuration_files
          -configuration_basename kobuki_localization.lua
          -load_state_filename $(arg load_state_filename)"
      output="screen">
    <remap from="echoes" to="/scan" />
  </node>
 

  <node name="cartographer_occupancy_grid_node" pkg="cartographer_ros"
      type="cartographer_occupancy_grid_node" args="-resolution 0.05" />

  <node name="rviz" pkg="rviz" type="rviz" required="true"
      args="-d $(find cartographer_ros)/configuration_files/demo_2d.rviz" />
 
</launch>

kobuki_localization.lua is the lua configuration file for localization mode; it only adds two parameters on top of the mapping configuration:

include "kobuki_robot.lua"

TRAJECTORY_BUILDER.pure_localization = true
POSE_GRAPH.optimize_every_n_nodes = 20

return options

Video of localization with incremental map update:

kobuki Cartographer localization & incremental mapping (video)

3 3D mapping test

3.1 3D data set mapping

When using a 3D lidar to build maps, an IMU must be used together with it, because Cartographer relies on the gravity direction vector the IMU provides. Here we follow the steps on the official website [1] directly; first download the 3D dataset [4].

Second, we need to replace offline_backpack_3d.launch from the official website with demo_backpack_3d.launch; otherwise the map-saving service cannot be called.

Start 3D lidar mapping:

roslaunch cartographer_ros demo_backpack_3d.launch bag_filename:=/media/crp/0E3C06880E3C0688/b3-2016-04-05-13-54-42.bag

Cartographer 3D mapping (video)

Wait until the bag finishes playing, then call the write_state service to save the map:

rosservice call /write_state ~/3d_local.pbstream

Convert this pbstream file into a 3D .ply point cloud file:

roslaunch cartographer_ros assets_writer_my_rslidar_3d.launch bag_filenames:=b3-2016-04-05-13-54-42.bag pose_graph_filename:=~/3d_local.pbstream

Wait a while; the command exits automatically once processing completes. A file ending in .bag_points.ply is then generated next to the bag file; this is the point cloud file [6]. Finally, use the PCL tool to convert the ply file into a pcd file:

pcl_ply2pcd b3-2016-04-05-13-54-42.bag_points.ply test_3d.pcd
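As an optional sanity check on the conversion (a hypothetical helper, plain Python, no PCL required), the PCD header can be parsed directly to report how many points the file contains:

```python
# Illustrative sketch: read the POINTS field from a PCD v0.7 header.
def pcd_point_count(path):
    """Return the point count declared in an ASCII or binary PCD header."""
    with open(path, 'rb') as f:
        for raw in f:
            line = raw.decode('ascii', errors='ignore').strip()
            if line.startswith('POINTS'):
                return int(line.split()[1])
            if line.startswith('DATA'):  # end of header reached
                break
    return None

# e.g. pcd_point_count('test_3d.pcd')
```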

While it runs, the robot's pose is published on TF, so we can obtain the robot's position by reading the transform between the odom and map frames.

3.2 3D localization

For 3D localization, we use the ~/3d_local.pbstream generated in section 3.1 as the existing map, and match the incoming laser data against it to estimate the pose:

roslaunch cartographer_ros demo_backpack_3d_localization.launch load_state_filename:=/home/crp/3d_local.pbstream bag_filename:=/media/crp/0E3C06880E3C0688/b3-2016-04-05-15-52-20.bag

Cartographer 3D localization (video)

When running localization, the robot's pose is likewise published on TF, so again we can obtain the robot's position by reading the transform between the odom and map frames. (It is easy to see that the pose output rate during localization is much lower than during mapping.)

Reference materials:

[1] https://google-cartographer-ros.readthedocs.io/en/latest/compilation.html#building-installation
[2] https://www.cnblogs.com/hitcm/p/5939507.html
[3] https://zhuanlan.zhihu.com/p/64747755
[4] https://storage.googleapis.com/cartographer-public-data/bags/backpack_3d/b3-2016-04-05-13-54-42.bag
[5] https://storage.googleapis.com/cartographer-public-data/bags/backpack_3d/b3-2016-04-05-15-52-20.bag
[6] http://community.bwbot.org/topic/523/%E8%B0%B7%E6%AD%8Ccartographer%E4%BD%BF%E7%94%A8%E9%80%9F%E8%85%BE%E8%81%9A%E5%88%9B3d%E6%BF%80%E5%85%89%E9%9B%B7%E8%BE%BE%E6%95%B0%E6%8D%AE%E8%BF%9B%E8%A1%8C%E4%B8%89%E7%BB%B4%E5%BB%BA%E5%9B%BE/2

Friends working on autonomous navigation vehicles are welcome to discuss or leave a message: [email protected]

Origin blog.csdn.net/crp997576280/article/details/103279649