Building a virtual environment in Gazebo for ROS robot positioning and navigation simulation, plus YOLO detection and recognition of marked objects


Preface

Environment description:
This tutorial uses Ubuntu 18.04 + ROS Melodic + Gazebo + YOLO v3


1. Preliminary preparation

(1) Create a workspace

  1. Create the ROS workspace

    mkdir -p ~/robot_positioning_ws/src
    
  2. Enter the workspace source directory

    cd ~/robot_positioning_ws/src
    
  3. Initialize the ROS workspace

    catkin_init_workspace
    

(2) The racecar function package

  1. Download the package for building a track in Gazebo

    git clone https://github.com/xmy0916/racecar.git
    


  2. Install controllers related

    sudo apt-get install ros-melodic-controller-manager
    sudo apt-get install ros-melodic-gazebo-ros-control
    sudo apt-get install ros-melodic-effort-controllers
    sudo apt-get install ros-melodic-joint-state-controller
    
  3. Compile the racecar package

    cd ..
    catkin_make
    

    Problems encountered during compilation
    Error: Could not find a package configuration file provided by "driver_base" with any of the following names
    Solution

    sudo apt-get install ros-melodic-driver-base
    

    Error: Could not find a package configuration file provided by "OpenCV" with any of the following names
    Solution

    locate OpenCVConfig.cmake
    sudo gedit ~/robot_positioning_ws/src/racecar_gazebo/CMakeLists.txt
    Modify the path on line 7 to the path that locate reported (the example below comes from a Kinetic install; use your own result): set(OpenCV_DIR /opt/ros/kinetic/share/OpenCV-3.3.1-dev/)

    Error: ackermann_msgs/AckermannDrive.h: No such file or directory
    Solution

    sudo apt-get install ros-melodic-ackermann-msgs
    

    Compilation succeeded

2. Building your own virtual environment in Gazebo

(1) Related settings

  1. Set environment variables
    echo "source ~/robot_positioning_ws/devel/setup.bash" >> ~/.bashrc
    source ~/.bashrc
    

(2) Running the car model

  1. Launch command
    roslaunch racecar_gazebo racecar.launch 
    
  2. Result
    The small tk control window drives the car: W = forward, S = backward, D = turn left, A = turn right.
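The key bindings above can be sketched as a simple mapping from pressed key to a (speed, steering) command. This is only an illustration of the idea; the actual tk teleop node shipped with the racecar package may use different values and key handling.

```python
# Sketch of the teleop key bindings described in this tutorial.
# Speed in m/s, steering angle in rad; the values are illustrative.
KEY_BINDINGS = {
    'W': (1.0, 0.0),    # forward
    'S': (-1.0, 0.0),   # backward
    'D': (1.0, 0.3),    # turn left (as described above)
    'A': (1.0, -0.3),   # turn right
}

def key_to_command(key):
    """Return (speed, steering_angle) for a pressed key; stop otherwise."""
    return KEY_BINDINGS.get(key.upper(), (0.0, 0.0))
```

In a real teleop node, the returned pair would be packed into an AckermannDriveStamped message and published to the car's command topic.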

(3) Manually build the environment

  1. Open gazebo

    roslaunch gazebo_ros empty_world.launch
    
  2. Click Edit -> Building Editor, create a model, and save it

  3. Import the track model you just created
    First, run the car model:

    roslaunch racecar_gazebo racecar.launch 
    

    Then import the environment model: choose Insert -> select the model you just saved, add obstacles, and save the scene as a world file.

(4) Run the created environment

  1. Create a launch file and configure track parameters
    cd ~/robot_positioning_ws/src/racecar/racecar_gazebo/launch
    sudo gedit lyy.launch
    
    lyy.launch content
    <?xml version="1.0"?>
    <launch>
      <!-- Launch the racecar -->
      <include file="$(find racecar_gazebo)/launch/racecar.launch">
        <arg name="world_name" value="lyy"/>
      </include>
    </launch>
    
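The launch file above does its work through roslaunch's include/arg mechanism: the world_name argument is passed down into racecar.launch. Since a launch file is plain XML, its structure can be sanity-checked without ROS; a minimal sketch (the XML string mirrors the lyy.launch content above):

```python
import xml.etree.ElementTree as ET

# The body of lyy.launch from this tutorial (XML declaration omitted).
launch_xml = """<launch>
  <include file="$(find racecar_gazebo)/launch/racecar.launch">
    <arg name="world_name" value="lyy"/>
  </include>
</launch>"""

root = ET.fromstring(launch_xml)
arg = root.find("include/arg")  # the argument forwarded to racecar.launch
print(arg.get("name"), "=", arg.get("value"))  # → world_name = lyy
```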
  2. Run gazebo
    roslaunch racecar_gazebo lyy.launch 
    

3. Perform gmapping mapping

  1. gmapping mapping

    roslaunch racecar_gazebo slam_gmapping.launch
    

    Problem: ERROR: cannot launch node of type [gmapping/slam_gmapping]: gmapping
    Solution

    sudo apt-get install ros-melodic-gmapping
    
  2. Drive the car one full lap (using the tk control window)
    The car's start point, end point, and the map being built are displayed in RViz.

  3. Save the map created by gmapping

    sudo apt-get install ros-melodic-map-server
    cd ~/robot_positioning_ws/src/racecar/racecar_gazebo/map
    rosrun map_server map_saver -f lyy_map
    

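map_server writes two files: lyy_map.pgm (the occupancy image) and lyy_map.yaml (the metadata that maps image pixels to world coordinates via resolution and origin). A sketch of that conversion; the field names follow the map_server YAML format, but the numbers below are made up for illustration:

```python
# Example metadata as written by map_server in lyy_map.yaml
# (your map's resolution and origin will differ).
map_meta = {
    "image": "lyy_map.pgm",
    "resolution": 0.05,             # meters per pixel
    "origin": [-10.0, -10.0, 0.0],  # world pose of the lower-left pixel
    "occupied_thresh": 0.65,
    "free_thresh": 0.196,
}

def pixel_to_world(px, py, meta):
    """Convert a map pixel (column, row counted from the bottom) to world x, y."""
    ox, oy, _ = meta["origin"]
    res = meta["resolution"]
    return ox + px * res, oy + py * res

print(pixel_to_world(200, 200, map_meta))  # → (0.0, 0.0)
```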

4. Autonomous positioning and navigation of the car

(1) The self-built environment

  1. Edit the launch file

    cd ~/robot_positioning_ws/src/racecar/racecar_gazebo/launch
    sudo gedit lyy_auto.launch
    

    lyy_auto.launch file content

    <?xml version="1.0"?>
    <launch>
      <!-- Launch the racecar -->
      <include file="$(find racecar_gazebo)/launch/racecar.launch">
        <arg name="world_name" value="lyy"/>
      </include>
      
      <!-- Launch the built-map -->
      <node name="map_server" pkg="map_server" type="map_server" args="$(find racecar_gazebo)/map/lyy_map.yaml" />
    
      <!--Launch the move base with time elastic band-->
      <param name="/use_sim_time" value="true"/>
      <node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
        <rosparam file="$(find racecar_gazebo)/config/costmap_common_params.yaml" command="load" ns="global_costmap" />
        <rosparam file="$(find racecar_gazebo)/config/costmap_common_params.yaml" command="load" ns="local_costmap" />
        <rosparam file="$(find racecar_gazebo)/config/local_costmap_params.yaml" command="load" />
        <rosparam file="$(find racecar_gazebo)/config/global_costmap_params.yaml" command="load" />
        <rosparam file="$(find racecar_gazebo)/config/teb_local_planner_params.yaml" command="load" />
    
        <param name="base_global_planner" value="global_planner/GlobalPlanner" />
        <param name="planner_frequency" value="0.01" />
        <param name="planner_patience" value="5.0" />
        <!--param name="use_dijkstra" value="false" /-->
        
        <param name="base_local_planner" value="teb_local_planner/TebLocalPlannerROS" />
        <param name="controller_frequency" value="5.0" />
        <param name="controller_patience" value="15.0" />
    
        <param name="clearing_rotation_allowed" value="false" />
      </node>
      
    </launch>
    


  2. Run the created environment

    roslaunch racecar_gazebo lyy_auto.launch
    
  3. Start rviz

    roslaunch racecar_gazebo racecar_rviz.launch
    


  4. Design the car's trajectory
    Use 2D Nav Goal in RViz to publish the target position.
    Problem: when publishing a goal with 2D Nav Goal, nothing happens.
    Solution

    sudo apt-get install ros-melodic-teb-local-planner
    

    Since no wall separates the start point from the end point, move the goal to the opposite side.

  5. Start the path_pursuit.py script file

    rosrun racecar_gazebo path_pursuit.py
    

    In practice the car failed to follow the planned path; the cause is still unclear.
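path_pursuit.py implements a pursuit-style path-following controller. The core steering computation can be sketched with the standard pure-pursuit formula for a bicycle model; this is a generic sketch, not the exact code from the racecar package, and the wheelbase value is illustrative:

```python
import math

def pure_pursuit_steering(goal_x, goal_y, wheelbase=0.335, lookahead=None):
    """Steering angle (rad) toward a lookahead point given in the car's
    own frame (x forward, y left), using delta = atan(2*L*sin(alpha)/Ld)."""
    ld = lookahead if lookahead is not None else math.hypot(goal_x, goal_y)
    alpha = math.atan2(goal_y, goal_x)  # bearing of the goal point
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)

# A goal straight ahead needs no steering:
print(round(pure_pursuit_steering(2.0, 0.0), 3))  # → 0.0
```

A controller node would pick the lookahead point from the global plan on each cycle, compute this angle, and publish it together with a target speed.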

(2) The environment included in the downloaded package

  1. Run the environment

    roslaunch racecar_gazebo racecar_runway_navigation.launch 	
    
  2. Start rviz

    roslaunch racecar_gazebo racecar_rviz.launch
    
  3. Design the car's trajectory
    Use 2D Nav Goal to publish the target position.

  4. Start the path_pursuit.py script file

    rosrun racecar_gazebo path_pursuit.py
    

    Running results (part of the movement process)

5. Use YOLO to detect and recognize marked objects

(1) Load YOLO

  1. ssh settings

    ssh-keygen             # press Enter at the prompts (file location and passphrase)
    ls ~/.ssh
    eval `ssh-agent`
    ssh-add ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub  # prints the public key
    

    Log in to GitHub, open Settings -> SSH and GPG keys from the account drop-down, click New SSH key, enter a title, paste the public key obtained above into the text box, and click Add SSH key. The key setup is now complete.

  2. Download darknet-ros

    cd ~/robot_positioning_ws/src
    git clone --recursive [email protected]:leggedrobotics/darknet_ros.git
    
  3. Compile

    cd ~/robot_positioning_ws
    catkin_make -DCMAKE_BUILD_TYPE=Release
    

    Note:
    This compiles the entire project. After compilation, the build checks whether the two model files yolov2-tiny.weights and yolov3.weights exist in ./darknet_ros/darknet_ros/yolo_network_config/weights. To keep the repository small, the downloaded code does not include these files, so they are downloaded automatically after compilation, which can take a long time. You can download them in advance and copy them into that folder; they will then not be downloaded again.
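A small helper to check in advance whether the weight files are already in place. The directory path follows the darknet_ros layout described above; this is a sketch, not part of darknet_ros itself:

```python
import os

# The two model files the darknet_ros build expects (per the note above).
REQUIRED_WEIGHTS = ["yolov2-tiny.weights", "yolov3.weights"]

def missing_weights(weights_dir):
    """Return the list of required weight files not yet present."""
    return [w for w in REQUIRED_WEIGHTS
            if not os.path.isfile(os.path.join(weights_dir, w))]

print(missing_weights(
    os.path.expanduser(
        "~/robot_positioning_ws/src/darknet_ros/darknet_ros/"
        "yolo_network_config/weights")))
```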

(2) Detect the marked objects

  1. Open gazebo

    roslaunch racecar_gazebo racecar_runway_navigation.launch
    
  2. Open the racecar.gazebo file to check the published topics

  3. Modify the subscribed topic in the ros.yaml file
    Change the camera topic subscribed in ros.yaml so that it matches the topic published in racecar.gazebo.

  4. Start YOLO V3

    roslaunch darknet_ros darknet_ros.launch
    

    This step hangs easily, so several attempts may be needed.
    The final recognition result is not very accurate. Possible reasons: the camera did not capture the objects completely, and some objects were recognized as others. This process takes a long time and hangs easily, so reaching the final result takes patience. When building your own environment, make sure something (such as a wall) separates the start point from the end point; otherwise no path will be drawn during path planning later.
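darknet_ros publishes its detections as a list of bounding boxes, and one simple way to suppress the misclassifications mentioned above is to drop low-confidence boxes before using them. A pure-Python sketch; the dict fields mirror the darknet_ros BoundingBox message (Class, probability, xmin/ymin/xmax/ymax), and the example detections are made up:

```python
def filter_detections(boxes, min_prob=0.5):
    """Keep only detections whose confidence meets the threshold."""
    return [b for b in boxes if b["probability"] >= min_prob]

# Example detections in the shape of darknet_ros BoundingBox messages.
detections = [
    {"Class": "stop sign", "probability": 0.91,
     "xmin": 10, "ymin": 20, "xmax": 60, "ymax": 80},
    {"Class": "person", "probability": 0.32,
     "xmin": 70, "ymin": 15, "xmax": 90, "ymax": 70},
]
print([d["Class"] for d in filter_detections(detections)])  # → ['stop sign']
```

In a ROS node, the same filter would run in the subscriber callback for the bounding-boxes topic.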


Reference links

  1. Under Ubuntu 18.04, use Gazebo to build a track to complete ROS robot positioning and navigation simulation + load YOLO to detect and identify marked objects [smart car]
  2. Realize ROS smart car positioning and navigation simulation under Ubuntu 18.04
  3. Implementation of darknet_ros (YOLO V3) detection under ROS
  4. Embedded systems application development course project: ROS robot positioning and navigation simulation

Origin blog.csdn.net/qq_43279579/article/details/115262778