rosbag: recording and playing back data, and building maps and navigating with the GMapping algorithm

1. Introduction

From the previous chapters we understand the working principles of ROS and can remotely control the robot. So how does a robot navigate autonomously in a real environment? To do this, the robot must know where it is and where it needs to go.
This means the robot needs a map of its surroundings and must know its position within that map, so in this section we will get familiar with how to use the robot's sensor data to build a map.

2. rosbag tool

2.1. rosbag record

Here we first get familiar with a tool for recording and playing back ROS topics: rosbag. We still use the launch file from the previous article to start the robot:

roslaunch mywanderbot test.launch

To start recording topics:

rosbag record -a

This records all messages published on active topics. If there are many sensors, a lot of data will be recorded, so it is generally recommended to record only specified topics, such as scan and tf:

rosbag record scan tf

All messages published on the scan and tf topics are saved in a .bag file, named by default after the pattern "Year-Month-Day-Hour-Minute-Second", e.g. 2023-10-13-08-28-57.bag. Of course, you can also specify a name or add a prefix:

rosbag record -O test.bag scan tf

generates the file: test.bag, while

rosbag record -o test.bag scan tf

generates a name with the test prefix plus the time: test_2023-10-13-08-36-10.bag
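
The default timestamped name is easy to reproduce. Here is a small Python sketch of the naming convention (just an illustration of the pattern, not rosbag's own code):

```python
from datetime import datetime

def default_bag_name(now=None):
    """Build a bag file name in rosbag's Year-Month-Day-Hour-Minute-Second style."""
    now = now or datetime.now()
    return now.strftime("%Y-%m-%d-%H-%M-%S") + ".bag"

# For example, a recording started at 2023-10-13 08:28:57:
print(default_bag_name(datetime(2023, 10, 13, 8, 28, 57)))  # 2023-10-13-08-28-57.bag
```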
Recording a topic is essentially subscribing to it. We can view the topic information: rostopic info scan
and see that a record subscriber has appeared.


2.2. View rosbag

View the information of this bag file: rosbag info test.bag

path:        test.bag
version:     2.0
duration:    1.2s
start:       Jan 01 1970 08:05:48.80 (348.80)
end:         Jan 01 1970 08:05:50.03 (350.03)
size:        53.9 KB
messages:    83
compression: none [1/1 chunks]
types:       sensor_msgs/LaserScan [90c7ef2dc6895d81024acba2ac42f369]
             tf2_msgs/TFMessage    [94810edda583a504dfda3829e70d7eec]
topics:      scan    7 msgs    : sensor_msgs/LaserScan
             tf     76 msgs    : tf2_msgs/TFMessage    (2 connections)

You can see the duration, size, message types, topics and so on. Checking the bag file information is very helpful for confirming that the messages you want were actually recorded.
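
Since checking a bag's contents is so common, a small convenience sketch can pull the per-topic message counts out of rosbag info-style text (a helper written for this article, not part of the rosbag tool):

```python
import re

def topic_counts(info_text):
    """Extract {topic: message count} from the output of rosbag info."""
    counts = {}
    for m in re.finditer(r"(\S+)\s+(\d+)\s+msgs", info_text):
        counts[m.group(1)] = int(m.group(2))
    return counts

info = """topics:      scan    7 msgs    : sensor_msgs/LaserScan
             tf     76 msgs    : tf2_msgs/TFMessage    (2 connections)"""
print(topic_counts(info))  # {'scan': 7, 'tf': 76}
```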

2.3. rosbag playback

After recording, you can play it back: rosbag play --clock test.bag
where the --clock flag tells rosbag to publish clock (time) messages, which is very important for map building.
Note: with --clock, rosbag publishes time starting from the moment the bag was recorded. If another node is also publishing time, such as the Gazebo simulator, this causes many problems: with two sources publishing inconsistent times, the clock jumps back and forth, which confuses the mapping algorithm.
So when you run rosbag with the --clock parameter, make sure nothing else on the system is publishing time. A simple solution is to close any running simulator.
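
To see why two time publishers cause trouble, consider what a subscriber observes when timestamps from two unsynchronized clocks interleave; a minimal plain-Python illustration (no ROS needed):

```python
def is_monotonic(stamps):
    """True if timestamps never jump backwards."""
    return all(a <= b for a, b in zip(stamps, stamps[1:]))

# One clock source: fine.
rosbag_clock = [348.80, 348.90, 349.00, 349.10]

# Two sources: rosbag replays old bag time while the simulator publishes
# its own, much larger, clock. Interleaved, time jumps back and forth,
# which confuses the mapping algorithm.
sim_clock = [1000.00, 1000.10, 1000.20, 1000.30]
merged = [t for pair in zip(rosbag_clock, sim_clock) for t in pair]

print(is_monotonic(rosbag_clock))  # True
print(is_monotonic(merged))        # False
```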

3. Create a map

The importance of maps to robots is self-evident. Here we use the slam_gmapping node in the gmapping package to create the map. This node implements the GMapping algorithm, which uses a Rao-Blackwellized particle filter to track the robot's own pose, based on the robot's sensor data and the partial map built so far. Although the map can be generated from live sensor data while the robot is running, here we take another approach: we drive the robot around, record the sensor data with rosbag as described above, and then play the data back through the slam_gmapping node to obtain a map. The advantage of using rosbag is that you don't have to drive the robot again, which saves time, especially when you need to try many different parameter combinations.
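
GMapping's Rao-Blackwellized particle filter is far more involved, but the core particle-filter idea, weighting pose hypotheses by how well they explain a measurement and then resampling, can be sketched in a toy 1-D localization example (an illustration only, not the GMapping implementation):

```python
import math
import random

random.seed(42)  # deterministic for the example

def particle_filter_step(particles, measurement, noise=0.5):
    """Weight each particle by measurement likelihood, then resample."""
    weights = [math.exp(-((p - measurement) ** 2) / (2 * noise ** 2)) for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: hypotheses that explain the measurement well survive.
    return random.choices(particles, weights=weights, k=len(particles))

true_position = 5.0
particles = [random.uniform(0.0, 10.0) for _ in range(300)]  # initial belief: anywhere
for _ in range(10):
    z = true_position + random.gauss(0.0, 0.2)  # noisy range measurement
    particles = particle_filter_step(particles, z)

estimate = sum(particles) / len(particles)
# The particle cloud collapses around the true position of 5.0.
```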

3.1. Launch file not found

Let's start the robot and keyboard control:

roslaunch turtlebot_stage turtlebot_in_stage.launch
roslaunch turtlebot_teleop keyboard_teleop.launch

If an error like the following occurs:

RLException: [turtlebot_in_stage.launch] is neither a launch file in package [turtlebot_stage] nor is [turtlebot_stage] a launch file name
The traceback for the exception was written to the log file

RLException: [keyboard_teleop.launch] is neither a launch file in package [turtlebot_teleop] nor is [turtlebot_teleop] a launch file name
The traceback for the exception was written to the log file

Such an error usually means the package is not installed, here because of a version change. We first try to install it:
sudo apt-get install ros-melodic-turtlebot-teleop

Note that the package name depends on your Ubuntu/ROS version; on older setups it was ros-melodic-turtlebot-teleop. If an error like the following occurs:

 E: Unable to locate package ros-melodic-turtlebot-teleop

We can press the Tab key twice to get completion suggestions. On my system the available package is:
sudo apt-get install ros-melodic-turtlebot3-teleop
You can see that turtlebot has been upgraded to turtlebot3.

After installation, we start it. The correct commands are as follows (the names can also be found by typing part of a name and pressing Tab twice):
roslaunch turtlebot3_gazebo turtlebot3_stage_1.launch
roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch
 

After starting, as shown in the figure:

The keyboard keys w and x accelerate and decelerate the linear velocity; a and d accelerate and decelerate the angular velocity; s stops the robot.
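
The key handling can be sketched as a tiny update function (an illustration of the w/x/a/d/s behavior described above, with a made-up step size, not the actual turtlebot3_teleop source):

```python
def teleop_step(key, linear, angular, step=0.1):
    """Return updated (linear, angular) velocities for one key press."""
    if key == 'w':
        linear += step      # speed up forward
    elif key == 'x':
        linear -= step      # slow down / reverse
    elif key == 'a':
        angular += step     # turn faster to the left
    elif key == 'd':
        angular -= step     # turn faster to the right
    elif key == 's':
        linear, angular = 0.0, 0.0  # full stop
    return linear, angular

v, w = 0.0, 0.0
for key in "wwa":           # two speed-ups, one left turn
    v, w = teleop_step(key, v, w)
# v is now about 0.2 m/s and w about 0.1 rad/s; pressing 's' zeroes both.
```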

3.2. Record data

After becoming proficient in operating the robot, we record the messages published on scan and tf. Here we use -O to specify the bag file name:
rosbag record -O data.bag /scan /tf

Drive the robot around to traverse the surrounding environment until enough data has been collected.

3.3. GMapping algorithm

Before starting the mapping program, we must first terminate the simulation software. This is critical, because the simulator publishes LaserScan messages that would conflict with the messages replayed by rosbag. After closing it, we start roscore.
Then, in a new terminal, we tell ROS to use the timestamps recorded in the bag file and start the slam_gmapping node:
rosparam set use_sim_time true
rosrun gmapping slam_gmapping

If an error like the following occurs:

[rospack] Error: package 'gmapping' not found

Install gmapping , also choose according to the system version: sudo apt install ros-melodic-gmapping 

After installation, start: rosrun gmapping slam_gmapping.
Note that if your laser topic is not named scan, a remapping is needed, e.g. scan:=mylaser_scan. At this point the mapping program has started and is waiting for data, so the next step is to play back the recording:
rosbag play --clock data.bag

The map at this point has not yet been saved to disk. To save the map built by slam_gmapping, we use the map_saver node in the map_server package: rosrun map_server map_saver

[rospack] Error: package 'map_server' not found

Also check the version to install: sudo apt-get install ros-melodic-map-server  

When saving the map, the following warning may also occur:

[ WARN] [1697176900.416482590, 1697172684.125851250]: MessageFilter [target=odom ]: Dropped 100.00% of messages so far. Please turn the [ros.gmapping.message_filter] rosconsole logger to DEBUG for more information.

We can view the tf tree to examine the relationships between frames: rosrun rqt_tf_tree rqt_tf_tree, as shown below:

Check whether the tree is complete. This warning is usually caused by required topics missing from the recording, so we can simply record all messages: rosbag record -O data_all.bag -a
This way there will be no problem.

3.4. The whole process

With the error handling above out of the way, here is a summary of the whole process. Open two terminals: one to start the robot, the other to record all data:

roslaunch mywanderbot test.launch
rosbag record -O data_all.bag -a
 

After recording, before proceeding, we need to stop the emulator, then open a new terminal and enter the following commands:

roscore
rosparam set use_sim_time true
rosrun gmapping slam_gmapping
rosbag play --clock data_all.bag
rosrun map_server map_saver

Start the GMapping algorithm, wait for the recorded data to be played back and processed, and finally save the map to disk.
This generates two files: map.pgm (the map) and map.yaml (the map metadata).
We can list the online topics: rostopic list, as shown below:


We can see in the list that there is an additional topic /tf_static. Let's view its information:
rostopic info tf_static

Type: tf2_msgs/TFMessage

Publishers: None

Subscribers: 
 * /slam_gmapping (http://YAB:37763/)

Its subscriber is slam_gmapping, so the mapping program needs the messages on this tf_static topic.

Check its type: rosmsg show tf2_msgs/TFMessage

geometry_msgs/TransformStamped[] transforms
  std_msgs/Header header
    uint32 seq
    time stamp
    string frame_id
  string child_frame_id
  geometry_msgs/Transform transform
    geometry_msgs/Vector3 translation
      float64 x
      float64 y
      float64 z
    geometry_msgs/Quaternion rotation
      float64 x
      float64 y
      float64 z
      float64 w

Recording all topics as above makes the file unnecessarily large, so we only need to record one extra topic, tf_static:
rosbag record -O data_three.bag /scan /tf /tf_static

Let's take a look at the generated map. You can view it with: eog map.pgm, or with: rosrun rviz rviz (add a Map display in rviz and select the map topic), as shown below:

View map metadata: cat map.yaml

image: map.pgm
resolution: 0.050000
origin: [-100.000000, -100.000000, 0.000000]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
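
These thresholds are how map_server turns pgm pixel brightness into occupancy values when loading the map. The rule can be sketched as follows (the 100/0/-1 outputs follow the occupancy-grid convention; the sketch mirrors the documented behavior for negate: 0):

```python
def pixel_to_occupancy(pixel, occupied_thresh=0.65, free_thresh=0.196, negate=0):
    """Convert a pgm pixel (0=black .. 255=white) to occupancy: 100, 0 or -1 (unknown)."""
    # Darkness is interpreted as occupancy probability (unless negate is set).
    p = pixel / 255.0 if negate else (255 - pixel) / 255.0
    if p > occupied_thresh:
        return 100   # occupied
    if p < free_thresh:
        return 0     # free
    return -1        # unknown

print(pixel_to_occupancy(0))    # black pixel -> 100 (wall)
print(pixel_to_occupancy(254))  # white pixel -> 0   (free space)
print(pixel_to_occupancy(205))  # grey pixel  -> -1  (unexplored)
```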

3.5. Improve map quality

For the Turtlebot's sensor, we actually use data from Microsoft's Kinect depth camera rather than a laser rangefinder. This "fake" laser rangefinder has a shorter range and a narrower field of view than a real laser sensor. slam_gmapping uses laser data to estimate the robot's motion, and long-range, wide-angle data makes this estimate more accurate.
We can change the parameters of the slam_gmapping node to improve the quality of the map:

rosparam set /slam_gmapping/angularUpdate 0.1
rosparam set /slam_gmapping/linearUpdate 0.1
rosparam set /slam_gmapping/lskip 10
rosparam set /slam_gmapping/xmax 10
rosparam set /slam_gmapping/xmin -10
rosparam set /slam_gmapping/ymax 10
rosparam set /slam_gmapping/ymin -10

These parameters control how far the robot must rotate (angularUpdate) or translate (linearUpdate) before a new scan is processed, how many beams to skip in each LaserScan message (lskip), and the extent of the map (xmin, xmax, ymin, ymax).
Changes to these parameters only affect slam_gmapping , so you can use the data collected above without re-collecting the data. This is also an advantage of using rosbag to record and then play back to create maps.

Some useful rosparam commands:

rosparam list : List all current parameters
rosparam get xxx : Display the xxx parameter value
rosparam set xxx v : Set the xxx parameter value v
rosparam dump file_name : Save the parameters to the file
rosparam load file_name : Read the parameters from the file
rosparam delete xxx : Delete the xxx parameters

4. New installation 

For readers who do not have the above packages installed, here is a fresh installation using the open-source turtlebot3:

4.1. Create a workspace

mkdir -p ~/mybot3_ws/src
cd ~/mybot3_ws/src
catkin_init_workspace

4.2. Compilation

cd ..
catkin_make
echo "source ~/mybot3_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc

4.3. Download package

cd ~/mybot3_ws/src
git clone https://github.com/ROBOTIS-GIT/turtlebot3.git

4.4. Compile again

cd ..
catkin_make

4.5. Missing packages

Something went wrong here; the error summary is as follows:

CMake Error at /opt/ros/melodic/share/catkin/cmake/catkinConfig.cmake:83 (find_package):
  Could not find a package configuration file provided by "turtlebot3_msgs"
  with any of the following names:

    turtlebot3_msgsConfig.cmake
    turtlebot3_msgs-config.cmake
Invoking "cmake" failed

The turtlebot3_msgs package is missing, so we continue to download and compile:

cd ~/mybot3_ws/src
git clone https://github.com/ROBOTIS-GIT/turtlebot3_msgs.git

If your network cannot reach GitHub (for example through a broken proxy), an error occurs:

gnutls_handshake() failed: The TLS connection was non-properly terminated.

Remove the proxy settings:

git config --global --unset https.https://github.com.proxy 
git config --global --unset http.https://github.com.proxy

If it still doesn't work, you can delete the configuration file: rm -rf ~/.gitconfig

For more information about package installation, please refer to: ROS new workspace (workspace) and package (package) compilation practice (C++ example)

4.6. Create a map

After compiling again without problems, we start the map creation and navigation operations (tip: press Tab twice for command completion):

Open the Gazebo simulated robot:
roslaunch turtlebot3_gazebo turtlebot3_world.launch
Start the mapping:
roslaunch turtlebot3_slam turtlebot3_slam.launch slam_methods:=gmapping
Enable keyboard control so the robot can be driven around to create the map:
roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch

As shown below:

Finally, it can be saved as a map: rosrun map_server map_saver -f ~/map

As shown below:

5. Navigation

5.1. Load the map

After the map is created, let's take a look at the robot's navigation in the map:

roslaunch turtlebot3_gazebo turtlebot3_world.launch
roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$HOME/map.yaml

The following errors occur:

ERROR: cannot launch node of type [amcl/amcl]: amcl
ERROR: cannot launch node of type [move_base/move_base]: move_base

AMCL (Adaptive Monte Carlo Localization) is a probabilistic localization method often used for robot positioning during navigation. Both amcl and move_base belong to the navigation stack; the installation command (depending on your ROS version) is:
sudo apt-get install ros-melodic-navigation

5.2. Version commands

lsb_release -a

No LSB modules are available.
Distributor ID:	Ubuntu
Description:	Ubuntu 18.04.6 LTS
Release:	18.04
Codename:	bionic

ROS version:

printenv ROS_DISTRO
melodic

Simulated robot version:

gazebo -version
Gazebo multi-robot simulator, version 9.0.0

5.3. Full version of tf tree

rosrun rqt_tf_tree rqt_tf_tree
as shown below:

5.4. Manual navigation

Click " 2D Nav Goal " on the rviz menu bar , we can select the destination on the map where the robot needs to drive, and then the robot will drive through the planned path, as shown below:

The principle of this navigation package is as follows:

1. The navigation destination, i.e. the clicked "2D Nav Goal", is sent to the navigation software. We make a ROS action call and pass in a MoveBaseGoal parameter that represents this destination. Let's look at the structure of this type: rosmsg info MoveBaseGoal

[move_base_msgs/MoveBaseGoal]:
geometry_msgs/PoseStamped target_pose
  std_msgs/Header header
    uint32 seq
    time stamp
    string frame_id
  geometry_msgs/Pose pose
    geometry_msgs/Point position
      float64 x
      float64 y
      float64 z
    geometry_msgs/Quaternion orientation
      float64 x
      float64 y
      float64 z
      float64 w

You can see that this parameter is a pose (position and orientation) in some coordinate frame (usually the map frame).
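
For a ground robot only the yaw (rotation about the z axis) matters, and converting a yaw angle into the (x, y, z, w) quaternion used in the pose is one line of trigonometry:

```python
import math

def yaw_to_quaternion(yaw):
    """Quaternion (x, y, z, w) for a pure rotation of `yaw` radians about z."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

print(yaw_to_quaternion(0.0))  # facing along x: (0.0, 0.0, 0.0, 1.0)
```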

2. The Global Planner uses path planning algorithms and maps to plan the shortest path from the current location to the target location.

3. This path is passed to the local planner (Local Planner), which tries to drive the robot along it. The local planner uses sensor information to avoid obstacles that are in front of the robot but not on the map (such as people walking by). If the local planner gets stuck and cannot find a way forward, it asks the global planner for a new path and tries to follow that instead.
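
The global planner's job can be illustrated with the simplest possible grid search: breadth-first search over an occupancy grid (a toy stand-in; the real global planner, such as navfn, runs Dijkstra or A* on a costmap):

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest 4-connected path on a grid of 0 (free) / 1 (occupied) cells."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:      # walk back along predecessors
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no path: in the stack, this triggers a re-plan request

grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = bfs_path(grid, (0, 0), (0, 2))
# The path goes down around the wall of 1s and back up: 7 cells.
```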

4. When the robot approaches the target pose, the action call completes.

5.5. Code navigation

Manual navigation works fine, so finally let's look at writing a node that performs navigation itself. As before, we start the robot and load the navigation map:

roslaunch turtlebot3_gazebo turtlebot3_world.launch
roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$HOME/map.yaml

We go into the turtlebot3_navigation package and create a new node:

cd ~/mybot3_ws/src/turtlebot3/turtlebot3_navigation
gedit patrol.py
#!/usr/bin/env python
import rospy
import actionlib

from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

# Each waypoint is [(x, y, z), (qx, qy, qz, qw)] in the map frame.
waypoints = [
    [(-2.5, -2.2, 0.0), (0.0, 0.0, 0.0, 1.0)],
    [(6.5, 4.43, 0.0), (0.0, 0.0, -1.2, 1.5)],
]

def goal_pose(pose):
    """Convert a waypoint into a MoveBaseGoal in the map frame."""
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.pose.position.x = pose[0][0]
    goal.target_pose.pose.position.y = pose[0][1]
    goal.target_pose.pose.position.z = pose[0][2]
    goal.target_pose.pose.orientation.x = pose[1][0]
    goal.target_pose.pose.orientation.y = pose[1][1]
    goal.target_pose.pose.orientation.z = pose[1][2]
    goal.target_pose.pose.orientation.w = pose[1][3]
    return goal

if __name__ == '__main__':
    rospy.init_node('patrol')
    # Action client for the move_base action server.
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()
    while not rospy.is_shutdown():
        for pose in waypoints:
            goal = goal_pose(pose)
            client.send_goal(goal)
            client.wait_for_result()  # block until this waypoint is reached

The code is relatively simple. First we define waypoints to describe the patrol path, then the function goal_pose converts each waypoint into a MoveBaseGoal. Finally, we create an action client, wait for the server to be ready, and cycle through the waypoints, sending each one to the navigation software as a destination.
Add executable permissions: chmod u+x patrol.py
We run the node: rosrun turtlebot3_navigation patrol.py
and can see the robot driving around the map.
For more usage of action calls, interested readers can check out: Practice of Action Service of ROS Communication Mechanism

Origin blog.csdn.net/weixin_41896770/article/details/134027989