ROS Learning [17]: Installing and Configuring ORB_SLAM2 on Ubuntu 16.04
- I. Understanding ORB_SLAM2
- II. Installing Pangolin
- III. Installing OpenCV
- IV. Installing the Eigen library
- V. Installing ORB_SLAM2
- 1. Clone the ORB_SLAM2 package locally on Ubuntu
- 2. Compile the ORB_SLAM2 package in the ROS workspace
- 3. Prepare to run ORB_SLAM2 in the ROS environment
- VI. Installing usb_cam
- VII. Running ORB_SLAM2 in monocular mode
An essential stage of ROS learning is studying ORB_SLAM2, a visual SLAM system that you build and run by compiling the source yourself. In this post, senior Lin Jun mainly walks everyone through the installation and configuration of ORB_SLAM2.
- Ubuntu system: Ubuntu Kylin 16.04
- OpenCV version: OpenCV 3.4.1
- ROS version: Kinetic
I. Understanding ORB_SLAM2
1. What is ORB_SLAM2
- ORB-SLAM is a visual SLAM system developed by Raúl Mur-Artal at the University of Zaragoza in Spain.
- The accompanying paper, "ORB-SLAM: a Versatile and Accurate Monocular SLAM System", was published in IEEE Transactions on Robotics in 2015.
- The open-source code includes the early ORB-SLAM and the later ORB-SLAM2. The first version supports only monocular SLAM, while the second supports three sensor interfaces: monocular, stereo, and RGB-D.
2. Features of ORB_SLAM2
- Feature extraction and tracking use ORB features. ORB extraction is very fast, which makes it suitable for real-time systems.
- Loop closure detection uses a bag-of-words model whose dictionary is a large pre-trained ORB vocabulary.
- The interfaces are rich: monocular, stereo, and RGB-D inputs are all supported, and ROS support is optional at compile time, which keeps deployment lightweight. The price of supporting so many interfaces is that the code logic is somewhat more complex.
- It runs in real time on a PC at roughly 30 ms per frame, but performs less well on embedded platforms.
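To make the bag-of-words idea behind loop closure concrete, here is a toy sketch (this is not ORB_SLAM2's actual DBoW2 code, and the 4-word vocabulary and word counts are made up for illustration): each image is reduced to a histogram of visual-word counts, and candidate loop closures are scored by comparing histograms.

```cpp
#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

// Toy bag-of-words scoring: each image is summarized as a histogram of
// "visual word" counts (a real ORB vocabulary has on the order of 10^6
// words; here we pretend there are only 4). Loop-closure detection then
// compares the current frame's histogram against past keyframes.
double bowScore(const std::vector<double>& a, const std::vector<double>& b) {
    double dot = 0.0, na = 0.0, nb = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        dot += a[i] * b[i];
        na  += a[i] * a[i];
        nb  += b[i] * b[i];
    }
    return dot / (std::sqrt(na) * std::sqrt(nb));  // cosine similarity
}
```

For example, a current frame with word counts {3, 0, 1, 2} scores about 0.93 against a revisited place {2, 0, 1, 3}, but 0 against an unrelated scene {0, 5, 0, 0}; frames scoring above a threshold become loop-closure candidates. ORB_SLAM2 itself uses the DBoW2 library with a hierarchical vocabulary, which scales far better than this flat version.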
With the definition and features above, you should now have a reasonable understanding of ORB_SLAM2. Let's start the installation.
II. Installing Pangolin
Pangolin is a lightweight library that wraps OpenGL for input/output and video display. It can be used for 3D vision and 3D map navigation visualization; it accepts various types of video input and can record video and input data for debugging.
1. Install the download and build tools
sudo apt-get install cmake
sudo apt-get install git
sudo apt-get install g++
2. Install the dependencies
sudo apt-get install libglew-dev
sudo apt-get install libboost-dev libboost-thread-dev libboost-filesystem-dev
sudo apt-get install libpython2.7-dev
3. Install Pangolin
1) Clone the Pangolin repository from GitHub:
git clone https://github.com/stevenlovegrove/Pangolin.git
2) Enter the Pangolin folder and create a build directory:
cd Pangolin
mkdir build
cd build
3) Configure the build:
cmake -DCPP11_NO_BOOST=1 ..
4) Build Pangolin:
sudo make -j8
If the build finishes without errors, Pangolin has been compiled successfully.
III. Installing OpenCV
1) Senior Lin Jun has written a separate post on installing OpenCV; friends can refer to it at the link below:
https://blog.csdn.net/qq_42451251/article/details/105565305
IV. Installing the Eigen library
The magic of Eigen is that it is a pure header-only library: there are no binary files like .so or .a to link against. To use it, you only need to include the Eigen headers; no library linking is required!
1) Install it with the following command:
sudo apt-get install libeigen3-dev
2) Check where the Eigen headers were installed:
sudo updatedb
locate eigen3
You can see that the eigen3 headers live in the /usr/include/eigen3 directory. Remember this path; you will need it later!
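As a quick smoke test of the header-only install (this snippet is just an illustration, not part of ORB_SLAM2), you can compile a tiny program. Note the extra -I flag, because the headers live under /usr/include/eigen3 rather than /usr/include:

```cpp
#include <iostream>
#include <Eigen/Dense>  // found via -I/usr/include/eigen3; nothing to link

int main() {
    // Rotate the unit X vector 90 degrees around the Z axis.
    Eigen::Matrix3d rot;
    rot = Eigen::AngleAxisd(M_PI / 2, Eigen::Vector3d::UnitZ());
    Eigen::Vector3d v = rot * Eigen::Vector3d::UnitX();
    std::cout << v.transpose() << std::endl;  // approximately (0, 1, 0)
    return 0;
}
```

Compile with, for example, g++ -I/usr/include/eigen3 eigen_test.cpp -o eigen_test; no -l library flags are needed, which is exactly the point of a header-only library.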
V. Installing ORB_SLAM2
Before installing ORB_SLAM2, we need to prepare some build tools and download the ORB_SLAM2 source code. The specific steps are as follows:
1. Clone the ORB_SLAM2 package locally on Ubuntu
1) Download the ORB_SLAM2 source package:
cd
git clone https://github.com/raulmur/ORB_SLAM2.git ORB_SLAM2
The download can be slow; please be patient!
2) Senior Lin Jun has also uploaded the downloaded ORB_SLAM2 package to the CSDN resource module, so friends can get it from the following link:
https://download.csdn.net/download/qq_42451251/12332604
3) Move the ORB_SLAM2 source package into the src folder of your ROS workspace.
2. Compile the ORB_SLAM2 package in the ROS workspace
1) Enter the ORB_SLAM2 package:
cd ~/lenovo/ros/src/ORB_SLAM2
2) Make build.sh executable:
chmod +x build.sh
3) Run the build.sh script:
./build.sh
Generally, this process stalls at 59% for quite a while; do not close the terminal. After about five minutes it will continue, because some targets have to wait for others to finish compiling. You can also change make -j in build.sh to make -j8; if your machine has a low configuration, it is safer to use plain make for single-threaded compilation. The build_ros.sh file below can be adjusted the same way!
3. Prepare to run ORB_SLAM2 in the ROS environment
To run ORB_SLAM2 in the ROS environment, it must be compiled with the build_ros.sh script. The specific steps are as follows:
1) In the same terminal, make build_ros.sh executable:
chmod +x build_ros.sh
2) Add the following environment variable at the end of the ~/.bashrc file:
gedit ~/.bashrc
export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:~/lenovo/ros/src/ORB_SLAM2/Examples/ROS
Here, ~/lenovo/ros/ is the path and name of Lin Jun's own ROS workspace; modify it to match your own workspace, then reload the file with source ~/.bashrc!
3) Run the build_ros.sh script:
./build_ros.sh
The compilation also takes some time; just wait patiently. It may fail near the end with an error saying that libboost_system.so and libboost_filesystem.so cannot be found in the link directories.
4) The solution
1. Find the exact locations of boost_system and boost_filesystem:
locate boost_system
locate boost_filesystem
2. In the directory /usr/lib/x86_64-linux-gnu, find the four files libboost_system.so, libboost_system.so.1.58.0, libboost_filesystem.so, and libboost_filesystem.so.1.58.0.
3. Copy these four files into the ORB_SLAM2/lib directory, for example:
cp /usr/lib/x86_64-linux-gnu/libboost_system.so* /usr/lib/x86_64-linux-gnu/libboost_filesystem.so* ~/lenovo/ros/src/ORB_SLAM2/lib/
4. Add the library paths to the CMakeLists.txt under ORB_SLAM2/Examples/ROS/ORB_SLAM2. Locate the set() function and append the following lines at the end of it:
${PROJECT_SOURCE_DIR}/../../../lib/libboost_filesystem.so
${PROJECT_SOURCE_DIR}/../../../lib/libboost_system.so
Save and close the file.
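For reference, after the edit the library list in Examples/ROS/ORB_SLAM2/CMakeLists.txt looks roughly like the following (the exact set of entries depends on your ORB_SLAM2 revision; only the last two boost lines are the addition):

```cmake
set(LIBS
${OpenCV_LIBS}
${EIGEN3_LIBS}
${Pangolin_LIBRARIES}
${PROJECT_SOURCE_DIR}/../../../Thirdparty/DBoW2/lib/libDBoW2.so
${PROJECT_SOURCE_DIR}/../../../Thirdparty/g2o/lib/libg2o.so
${PROJECT_SOURCE_DIR}/../../../lib/libORB_SLAM2.so
${PROJECT_SOURCE_DIR}/../../../lib/libboost_filesystem.so
${PROJECT_SOURCE_DIR}/../../../lib/libboost_system.so
)
```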
5) Run the build again in the same terminal; this time it should complete successfully:
./build_ros.sh
At this point, ORB_SLAM2 is basically installed. Next, we will download and prepare a dataset, and then run the system.
VI. Installing usb_cam
1. Install usb_cam
1) Enter the src folder of the ROS workspace and clone usb_cam:
cd ~/lenovo/ros/src
git clone https://github.com/bosch-ros-pkg/usb_cam.git
2) Compile the ROS workspace:
cd ~/lenovo/ros
catkin_make
3) Source the setup file so that the new package takes effect:
source ./devel/setup.bash
4) Create a build folder for usb_cam:
cd src/usb_cam
mkdir build
cd build
5) Compile:
cmake ..
make -j8
6) Modify the usb_cam launch file permissions. Go to the launch folder under usb_cam and make the launch file executable:
cd ../launch
chmod +x usb_cam-test.launch
7) Connect the laptop camera.
8) Open a new terminal and run the camera node:
roslaunch usb_cam usb_cam-test.launch
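For reference, the stock usb_cam-test.launch shipped with the package looks roughly like this (parameter values such as the device path may differ on your machine; /dev/video0 is the usual built-in laptop camera). Note that the node publishes on /usb_cam/image_raw, which is exactly the topic we point ORB_SLAM2 at below:

```xml
<launch>
  <node name="usb_cam" pkg="usb_cam" type="usb_cam_node" output="screen">
    <param name="video_device" value="/dev/video0" />
    <param name="image_width" value="640" />
    <param name="image_height" value="480" />
    <param name="pixel_format" value="yuyv" />
    <param name="camera_frame_id" value="usb_cam" />
    <param name="io_method" value="mmap" />
  </node>
  <node name="image_view" pkg="image_view" type="image_view" respawn="false" output="screen">
    <remap from="image" to="/usb_cam/image_raw" />
    <param name="autosize" value="true" />
  </node>
</launch>
```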
At this point, the usb_cam camera is downloaded and tested. Shut down the running node, but do not close the terminal; we will need it again later! Next, we need to change the camera topic in the C++ source files under ORB_SLAM2 to usb_cam/image_raw.
2. Modify the camera topic in the source files
1) The files live under ros/src/ORB_SLAM2/Examples/ROS/ORB_SLAM2/src; the one we care about is ros_mono.cc.
2) In these three files, change the subscribed camera topic to usb_cam/image_raw. In the later experiments we will only use ros_mono.cc, so strictly speaking you only need to modify that one, but it is recommended to modify all of them.
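Concretely, in ros_mono.cc the change is a single line in main(): the subscriber originally listens on /camera/image_raw, and we point it at the usb_cam topic instead (only the topic string changes):

```cpp
// Before:
// ros::Subscriber sub = nodeHandler.subscribe("/camera/image_raw", 1, &ImageGrabber::GrabImage, &igb);
// After:
ros::Subscriber sub = nodeHandler.subscribe("/usb_cam/image_raw", 1, &ImageGrabber::GrabImage, &igb);
```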
3) Recompile with build_ros.sh under ORB_SLAM2:
chmod +x build_ros.sh
./build_ros.sh
As you can see, the topic in the modified file has been updated!
With that, the configuration is complete. Let's start the experiment!
VII. Running ORB_SLAM2 in monocular mode
1. Download the TUM dataset
1) Download the TUM dataset from the following link:
http://vision.in.tum.de/data/datasets/rgbd-dataset/download
2. Create a Data folder in the ORB_SLAM2 package under the ROS workspace to store the TUM dataset
1) Create the Data folder:
mkdir Data
2) Transfer the downloaded dataset into this folder and extract it there.
At this point, our dataset is ready. Next, we will run ORB_SLAM2 on it!
3. Monocular run on the TUM dataset in ORB_SLAM2
1) Enter the following command to run in monocular mode:
./Examples/Monocular/mono_tum Vocabulary/ORBvoc.txt Examples/Monocular/TUM1.yaml Data/rgbd_dataset_freiburg1_xyz/
Here, Data/rgbd_dataset_freiburg1_xyz/ is the path to our dataset.
2) The results of the monocular run: the small blue squares in the right window are the ORB features extracted from the image, and the left window shows the sparse map of the environment and the camera's motion trajectory.
3) The TUM collection contains many other sequences that you can also test with. Note that the calibration file must match the camera that recorded the sequence: use TUM1.yaml for freiburg1 sequences, TUM2.yaml for freiburg2, and TUM3.yaml for freiburg3. For example:
./Examples/Monocular/mono_tum Vocabulary/ORBvoc.txt Examples/Monocular/TUM2.yaml Data/rgbd_dataset_freiburg2_xyz/
./Examples/Monocular/mono_tum Vocabulary/ORBvoc.txt Examples/Monocular/TUM3.yaml Data/rgbd_dataset_freiburg3_long_office_household/
There are many sequences in the downloaded collection, so feel free to test more of them!
4. Real-time monocular run with a camera in the ROS ORB-SLAM2 package
To run ORB-SLAM2 online in real time, you need ROS and the laptop camera, so make sure your camera is connected!
1) Run the camera node again in the terminal created earlier:
roslaunch usb_cam usb_cam-test.launch
2) Call the laptop camera to run ORB-SLAM2 in real time:
rosrun ORB_SLAM2 Mono /home/xxx/lenovo/ros/src/ORB_SLAM2/Vocabulary/ORBvoc.txt /home/xxx/lenovo/ros/src/ORB_SLAM2/Examples/ROS/ORB_SLAM2/Asus.yaml
In the command above, xxx stands for your Ubuntu username, and lenovo/ros is Lin Jun's own ROS workspace.
Hold the laptop and walk around slowly; you can see the left window building the map and trajectory while the right window shows the captured RGB frames. Be sure to move slowly!
3) Besides Mono, we can also run the other executables, such as RGBD below. Senior Lin Jun has only tested these two so far; the others are untested:
rosrun ORB_SLAM2 RGBD /home/xxx/lenovo/ros/src/ORB_SLAM2/Vocabulary/ORBvoc.txt /home/xxx/lenovo/ros/src/ORB_SLAM2/Examples/ROS/ORB_SLAM2/Asus.yaml
That is the entire content of this post. I hope reading it helps you understand how to run ORB_SLAM2 on ROS!
If you run into problems, leave them in the comments and Senior Lin Jun will answer them for everyone. This senior is not cold at all!
Another day in Chen Yiyue's programming years ^_^