Article Directory
Preface
Environment description:
This walkthrough uses Ubuntu 18.04 + ROS Melodic + Gazebo + YOLOv3.
1. Preliminary preparation
(1) Create a workspace
- Create the ROS workspace
mkdir -p ~/robot_positioning_ws/src
- Switch to the workspace directory
cd ~/robot_positioning_ws/src
- Initialize the ROS workspace
catkin_init_workspace
(2) The racecar function package
- Download the racecar package (used to build the track in Gazebo)
git clone https://github.com/xmy0916/racecar.git
- Install the related controller packages
sudo apt-get install ros-melodic-controller-manager
sudo apt-get install ros-melodic-gazebo-ros-control
sudo apt-get install ros-melodic-effort-controllers
sudo apt-get install ros-melodic-joint-state-controller
- Compile the racecar function package
cd ..
catkin_make
Problems encountered during compilation:
① Could not find a package configuration file provided by "driver_base" with any of the following names
Solution: sudo apt-get install ros-melodic-driver-base
② Could not find a package configuration file provided by "OpenCV" with any of the following names
Solution: run locate OpenCVConfig.cmake to find your OpenCV path, then edit the CMakeLists file:
sudo gedit ~/robot_positioning_ws/src/racecar_gazebo/CMakeLists.txt
Modify the path on line 7 to your own path, e.g.: set(OpenCV_DIR /opt/ros/kinetic/share/OpenCV-3.3.1-dev/)
③ ackermann_msgs/AckermannDrive.h: No such file or directory
Solution: sudo apt-get install ros-melodic-ackermann-msgs
Compiled successfully
2. Gazebo builds its own virtual environment
(1) Related settings
- Set environment variables
echo "source ~/robot_positioning_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc
(2) Running the car model
- Run the statement
roslaunch racecar_gazebo racecar.launch
- Operation effect
Note: the small tk control window drives the car: W means forward, S means backward, D means turn left, A means turn right.
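The key-to-command mapping above can be sketched as a small lookup table. This is a hypothetical illustration of the idea, not the racecar package's actual teleop script; the speed and steering values are made-up placeholders.

```python
# Hypothetical sketch of the tk teleop key mapping described above.
# Each key maps to a (speed, steering_angle) command pair; the numeric
# values are illustrative, not taken from the racecar package.
KEY_COMMANDS = {
    "w": (1.0, 0.0),    # forward
    "s": (-1.0, 0.0),   # backward
    "d": (1.0, 0.5),    # turn left (per the tutorial's description)
    "a": (1.0, -0.5),   # turn right
}

def command_for_key(key, default=(0.0, 0.0)):
    """Return the (speed, steering) command for a pressed key."""
    return KEY_COMMANDS.get(key.lower(), default)

print(command_for_key("W"))  # -> (1.0, 0.0)
```

Unknown keys fall back to a stop command, which is the usual safe default for teleop loops.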
(3) Manually build the environment
- Open Gazebo with an empty world
roslaunch gazebo_ros empty_world.launch
- Click Edit -> Building Editor, create a model, and save it
- Import the track model you just created
Run the car model: roslaunch racecar_gazebo racecar.launch
In Gazebo, use Insert -> select the saved model to place it in the environment, add obstacles, and save the scene as a world file.
(4) Running the created environment
- Create a launch file and configure the track parameters
cd ~/robot_positioning_ws/src/racecar/racecar_gazebo/launch
sudo gedit lyy.launch
lyy.launch content:
<?xml version="1.0"?>
<launch>
  <!-- Launch the racecar -->
  <include file="$(find racecar_gazebo)/launch/racecar.launch">
    <arg name="world_name" value="lyy"/>
  </include>
</launch>
- Run Gazebo
roslaunch racecar_gazebo lyy.launch
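Before launching, it can be worth sanity-checking that the hand-edited launch file is well-formed XML. A minimal stdlib-only check of the lyy.launch content shown above:

```python
# Verify lyy.launch is well-formed XML and read back the world_name arg,
# using only the Python standard library. The XML below is the launch
# file content from this tutorial.
import xml.etree.ElementTree as ET

LAUNCH_XML = """<?xml version="1.0"?>
<launch>
  <!-- Launch the racecar -->
  <include file="$(find racecar_gazebo)/launch/racecar.launch">
    <arg name="world_name" value="lyy"/>
  </include>
</launch>"""

# fromstring raises ParseError if the file is malformed.
root = ET.fromstring(LAUNCH_XML)
args = {a.get("name"): a.get("value") for a in root.iter("arg")}
print(args["world_name"])  # -> lyy
```

A typo such as a missing closing tag would raise `xml.etree.ElementTree.ParseError` immediately, which is faster to diagnose than a cryptic roslaunch failure.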
3. Perform gmapping mapping
- gmapping mapping
roslaunch racecar_gazebo slam_gmapping.launch
Problem description:
ERROR: cannot launch node of type [gmapping/slam_gmapping]: gmapping
Solution: sudo apt-get install ros-melodic-gmapping
- Drive the car one complete lap (via the tk control window); the car's starting point and end point are displayed in RViz
- Save the map created by gmapping
sudo apt-get install ros-melodic-map-server
cd ~/robot_positioning_ws/src/racecar/racecar_gazebo/map
rosrun map_server map_saver -f lyy_map
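map_saver writes two files: an image (lyy_map.pgm) and a metadata YAML (lyy_map.yaml) holding the resolution and origin. A sketch of reading that metadata and converting a map cell to world coordinates; the field names follow the map_server YAML format, while the numeric values are illustrative examples, not this tutorial's actual map:

```python
# Parse map_saver-style YAML metadata and convert a map cell to world
# coordinates. A tiny ad-hoc reader keeps this stdlib-only (no PyYAML).
# The metadata values below are example placeholders.
meta_text = """\
image: lyy_map.pgm
resolution: 0.050000
origin: [-100.000000, -100.000000, 0.000000]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
"""

def parse_meta(text):
    meta = {}
    for line in text.splitlines():
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()
    return meta

meta = parse_meta(meta_text)
res = float(meta["resolution"])
origin = [float(v) for v in meta["origin"].strip("[]").split(",")]

def cell_to_world(col, row_from_bottom):
    # The map origin is the world pose of the lower-left cell; each cell
    # is `res` metres wide, and we return the cell centre.
    return (origin[0] + (col + 0.5) * res,
            origin[1] + (row_from_bottom + 0.5) * res)

print(cell_to_world(0, 0))  # lower-left cell centre
```

This conversion is what RViz and move_base do internally when relating map pixels to robot poses.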
4. Autonomous positioning and navigation of the car
(1) The self-built environment
- Edit the launch file
cd ~/robot_positioning_ws/src/racecar/racecar_gazebo/launch
sudo gedit lyy_auto.launch
lyy_auto.launch content:
<?xml version="1.0"?>
<launch>
  <!-- Launch the racecar -->
  <include file="$(find racecar_gazebo)/launch/racecar.launch">
    <arg name="world_name" value="lyy"/>
  </include>

  <!-- Launch the built map -->
  <node name="map_server" pkg="map_server" type="map_server" args="$(find racecar_gazebo)/map/lyy_map.yaml" />

  <!-- Launch move_base with the timed-elastic-band local planner -->
  <param name="/use_sim_time" value="true"/>
  <node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
    <rosparam file="$(find racecar_gazebo)/config/costmap_common_params.yaml" command="load" ns="global_costmap" />
    <rosparam file="$(find racecar_gazebo)/config/costmap_common_params.yaml" command="load" ns="local_costmap" />
    <rosparam file="$(find racecar_gazebo)/config/local_costmap_params.yaml" command="load" />
    <rosparam file="$(find racecar_gazebo)/config/global_costmap_params.yaml" command="load" />
    <rosparam file="$(find racecar_gazebo)/config/teb_local_planner_params.yaml" command="load" />
    <param name="base_global_planner" value="global_planner/GlobalPlanner" />
    <param name="planner_frequency" value="0.01" />
    <param name="planner_patience" value="5.0" />
    <!--param name="use_dijkstra" value="false" /-->
    <param name="base_local_planner" value="teb_local_planner/TebLocalPlannerROS" />
    <param name="controller_frequency" value="5.0" />
    <param name="controller_patience" value="15.0" />
    <param name="clearing_rotation_allowed" value="false" />
  </node>
</launch>
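Note that costmap_common_params.yaml is loaded twice, once under ns="global_costmap" and once under ns="local_costmap": the same shared settings end up in both costmap namespaces of the move_base parameter tree. A sketch of how that namespacing composes (parameter names and values here are illustrative, not the package's actual config):

```python
# Illustrate how loading one common YAML under two namespaces duplicates
# its keys into both costmap subtrees of move_base's parameter tree.
# The keys/values are illustrative placeholders.
common = {"robot_radius": 0.3, "obstacle_range": 2.5}

params = {}
for ns in ("global_costmap", "local_costmap"):
    for key, value in common.items():
        params["/move_base/{}/{}".format(ns, key)] = value

print(params["/move_base/global_costmap/robot_radius"])  # -> 0.3
```

This is why a single edit to costmap_common_params.yaml affects both the global and the local costmap.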
-
Run the created environment
roslaunch racecar_gazebo lyy_auto.launch
- Start RViz
roslaunch racecar_gazebo racecar_rviz.launch
- Design the car's trajectory
Use 2D Nav Goal to publish the target.
Problem description: when publishing a goal with 2D Nav Goal, nothing ever happens.
Solution: sudo apt-get install ros-melodic-teb-local-planner
Since there is no wall separating the start point and the end point, move the end point to the opposite side.
- Start the path_pursuit.py script
rosrun racecar_gazebo path_pursuit.py
The actual result differs from the expected one; it is currently unclear what caused this.
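path_pursuit.py is the racecar package's path-tracking script; its internals are not shown in this tutorial. As background, a common tracking scheme for Ackermann-steered cars is pure pursuit, which steers toward a lookahead point on the path. The function below is a generic sketch of that law, with assumed parameter names; it is not the package's actual code:

```python
# Generic pure-pursuit steering law for an Ackermann car (background
# sketch, not the racecar package's implementation).
import math

def pure_pursuit_steering(alpha, lookahead, wheelbase):
    """alpha: angle to the lookahead point in the car frame (rad);
    lookahead: distance to that point (m);
    wheelbase: front-to-rear axle distance (m).
    Returns the steering angle (rad)."""
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)

# A lookahead point straight ahead needs no steering:
print(pure_pursuit_steering(0.0, 1.0, 0.335))  # -> 0.0
```

Intuitively, a larger lookahead distance smooths the trajectory but cuts corners; a smaller one tracks tightly but can oscillate.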
(2) The environment included in the downloaded function package
- Run the environment
roslaunch racecar_gazebo racecar_runway_navigation.launch
- Start RViz
roslaunch racecar_gazebo racecar_rviz.launch
- Design the car's trajectory using 2D Nav Goal to publish the target position
- Start the path_pursuit.py script
rosrun racecar_gazebo path_pursuit.py
Running results (part of the movement process)
5. Use YOLO to detect and identify marked objects
(1) Load YOLO
- SSH settings
ssh-keygen            # at the prompts, press Enter twice for the path and twice for the passphrase
ls ~/.ssh
eval `ssh-agent`
ssh-add ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub  # prints the public key
Log in to GitHub, open Settings -> SSH and GPG keys from the account drop-down menu, click New SSH key, enter a name for the key, paste the copied public key into the text box, and click Add SSH key. The key setup is now complete.
- Download darknet_ros
cd ~/robot_positioning_ws/src
git clone --recursive [email protected]:leggedrobotics/darknet_ros.git
- Compile
cd ~/robot_positioning_ws
catkin_make -DCMAKE_BUILD_TYPE=Release
Note:
This compiles the entire project. After compilation, the build checks whether the two model files yolov2-tiny.weights and yolov3.weights exist in ./darknet_ros/darknet_ros/yolo_network_config/weights. To keep the repository small, the downloaded code does not include these two model files by default, so they are downloaded automatically after compilation, which means a long wait. You can download them in advance and copy them into the folder above before building; they will then not be downloaded again.
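The pre-download check described above can be scripted: verify the two weight files exist before building so catkin_make does not trigger the long download. A minimal sketch (the workspace path is the one assumed throughout this tutorial; adjust it to your checkout):

```python
# Check whether the darknet_ros weight files are already in place before
# building. The directory path follows this tutorial's workspace layout.
import os

WEIGHTS_DIR = os.path.expanduser(
    "~/robot_positioning_ws/src/darknet_ros/darknet_ros/"
    "yolo_network_config/weights")
EXPECTED = ["yolov2-tiny.weights", "yolov3.weights"]

def missing_weights(weights_dir, expected=EXPECTED):
    """Return the expected weight files that are not present yet."""
    return [name for name in expected
            if not os.path.isfile(os.path.join(weights_dir, name))]

print(missing_weights(WEIGHTS_DIR))
```

If the printed list is empty, catkin_make will skip the download step entirely.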
(2) Implement detection of the marked objects
- Open Gazebo
roslaunch racecar_gazebo racecar_runway_navigation.launch
- Open the racecar.gazebo file to check the topic names
- Modify the subscribed topic in the ros.yaml file
Original file:
Modified file:
- Start YOLOv3
roslaunch darknet_ros darknet_ros.launch
This step hangs very easily, so several attempts may be needed.
In the end the recognition results were not very accurate. Possible reasons: the camera did not capture all of the objects, and some objects were recognized as other classes. The whole process took a long time and froze easily, so reaching the final result takes patience. When building the environment yourself, make sure something separates the starting point from the end point; otherwise no path will be planned, and nothing will be drawn later on.
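One simple way to reduce the misclassifications noted above is to discard low-confidence detections before using them. A minimal post-filter over (class, probability) pairs; this is an illustrative sketch with made-up data, not darknet_ros's own output handling:

```python
# Thin out noisy detections by dropping those below a confidence
# threshold. The detection tuples below are illustrative placeholders.
def filter_detections(detections, min_prob=0.5):
    """Keep (label, probability) pairs whose probability >= min_prob."""
    return [d for d in detections if d[1] >= min_prob]

raw = [("person", 0.92), ("dog", 0.31), ("car", 0.58)]
print(filter_detections(raw))  # -> [('person', 0.92), ('car', 0.58)]
```

Raising the threshold trades missed objects for fewer false labels, which may help when partial camera views cause spurious classes.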
Reference links
- Under Ubuntu 18.04, use Gazebo to build a track to complete ROS robot positioning and navigation simulation + load YOLO to detect and identify marked objects [smart car]
- Realize ROS smart car positioning and navigation simulation under Ubuntu 18.04
- Implementation of darknet_ros (YOLO V3) detection under ROS
- Embedded system application development and big job-ROS robot positioning and navigation simulation