Translation of the CoreSLAM / tinySLAM paper: a SLAM Algorithm in less than 200 lines of C code

tinySLAM

This paper introduces a new SLAM algorithm called CoreSLAM, developed at Mines ParisTech for robotic systems. The algorithm owes its name to its small code size: less than 200 lines of C. It is so small that we decided to publish its source code within the conference paper itself, so it is not only open source: the source code is actually published!
What encouraged us to do this is the good performance we obtained in our experiments. After a brief review of existing SLAM algorithms, we present the motivations that prompted us to create the CoreSLAM algorithm. The third section is devoted to the algorithm itself. We then introduce the robotic platform used for testing and discuss the results obtained.

Some studies have compared these algorithms in terms of computational speed and performance [6], indicating that the problem is still not solved in all cases.
Among laser-based SLAM algorithms using particle filters, DP-SLAM is one of the most famous [2]. The operating principle of DP-SLAM is to maintain the joint distribution over robot poses and maps with a particle filter. The algorithm associates a map with each particle, and focuses on the problem of sharing parts of the map among particles, in order to minimize memory use and the time spent copying maps. The problem with DP-SLAM is that it is very complex to integrate into a conventional particle-filter-based positioning system such as ours, and [3] is very similar in that respect.
In our review, we have seen that, compared with our algorithm, most algorithms were tested on slow robots (their speed rarely exceeds 1 m/s) and/or use very expensive lasers (typically SICK or IBEO laser scanners, with ranges up to 100 meters). Our robot reaches 3 m/s and the Hokuyo URG-04 has a short range, which raises difficult but interesting questions for the positioning and mapping tasks.
Moreover, most (if not all) SLAM algorithms so far require many lines of code or use complex mathematical methods, so it takes a lot of effort to understand and verify their operation, and the outcome often depends on many different parameters. Our work started with a clear goal: to make a simple, easy-to-understand algorithm that still provides good performance, and above all one that is easy to integrate into an existing particle filter framework. Finally, we added the requirement that our algorithm should eventually be embeddable, which makes it necessary to minimize memory consumption and to compute the critical loops with integer code.

Algorithm

Unlike DP-SLAM, we decided to use only one map. The theoretical advantage of DP-SLAM over CoreSLAM is that its localization capacity does not vanish in long corridors; this is indeed the goal of the per-particle map concept, since without it loop closing cannot be achieved in DP-SLAM. In fact, we believe this advantage is not worth the complexity, especially since we can rely on the good odometry of our platform, and given that the loops we have to close are rather small (we explore a laboratory rather than corridors...).
Since the idea of CoreSLAM is to integrate laser information into a particle-filter-based positioning subsystem, we had to write two main functions:

  • A scan-to-map distance function, used as the likelihood function of the position filter for each state hypothesis (particle). The source code of the ts_distance_scan_to_map function is provided below (Algorithm 2). Note that it contains only a simple sum of the map values at all the scan impact points (relative to the particle's position). This is a very fast computation, using only integer arithmetic, which allows us not to limit the number of particles for real-time operation. However, it implies that the map is purpose-built for this likelihood estimation.
  • A map update function, used to build the map as the robot advances; it is discussed in more detail below.
Building a map compatible with the particle filter is easy: the sharpness of the likelihood function peak is very important for the filter's efficiency, and should therefore be easy to control. We achieve this by having the map update dig grayscale holes into the map, the width of these holes corresponding directly to the width of our likelihood peak. For each detected obstacle, the algorithm does not draw a single point but a whole function, whose lowest point lies at the position of the obstacle (see Figure 1 for a zoom on a freshly built map). As a consequence, the map does not look like a traditional grid map, but like a surface pierced with holes. New profiles are integrated into the map through an αβ filter (Algorithm 4), so that the map converges toward the newer profiles. The map update is called with the last position output by the particle filter (a weighted average of all the particles after the likelihood evaluation), possibly with some latency (see the discussion below).
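As an illustration of the hole-digging idea, here is a one-dimensional sketch (this is not the paper's code: the TS_OBSTACLE/TS_NO_OBSTACLE values, the linear slope and the integer blending are assumptions consistent with the description above):

```c
#include <stdint.h>

#define TS_OBSTACLE     0      /* assumed map value at the hole bottom */
#define TS_NO_OBSTACLE  65500  /* assumed map value for free space */

/* Instead of marking a single obstacle pixel, write a V-shaped profile of
 * the given half-width around the hit point, and blend it into the existing
 * map values with an integer "quality" factor (out of 256), so that the map
 * converges toward newer profiles over time. 1-D version for clarity. */
void ts_hole_profile_1d(uint16_t *row, int len, int hit, int half_width, int quality)
{
    for (int x = 0; x < len; x++) {
        int d = x - hit;
        if (d < 0) d = -d;
        int target;
        if (d >= half_width)
            target = TS_NO_OBSTACLE;    /* outside the hole: free space */
        else                            /* linear slope down to the bottom */
            target = TS_OBSTACLE + (TS_NO_OBSTACLE - TS_OBSTACLE) * d / half_width;
        /* integer blending: map += quality/256 * (target - map) */
        row[x] = (uint16_t)(((256 - quality) * row[x] + quality * target) >> 8);
    }
}
```

The blending factor plays the role of the αβ-style integration mentioned above: repeated updates with the same profile make the map converge to it, while a single noisy scan only dents the map slightly.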
The source code of the ts_map_update function is provided below (Algorithm 3). It uses the function ts_map_laser_ray (Algorithm 4), which is the trickiest part. It uses a Bresenham algorithm to draw the laser rays on the map, and internally uses another, enhanced Bresenham algorithm to compute the correct profile. It uses only integer computations, with no integer division in the critical parts. Even though it is executed only once per step (at 10 Hz in our case), we must make sure this process is fast, because with a high-resolution map (1 cm per pixel) a large portion of the map is touched by each laser scan frame.
CoreSLAM can be easily integrated into a particle filter, but a particle filter is not required. Indeed, the structure of the map, with its remarkable hole slopes, makes it usable with any gradient descent algorithm. A matching algorithm converges to the obstacles more easily, because the hole functions play a guiding role. Our stand-alone version uses a very simple Monte Carlo algorithm to match the current scan to the map and retrieve the updated position of the robot.
Indeed, the purpose of developing a stand-alone version was to tune the CoreSLAM parameters: the speed of integration of the scans into the map (the quality parameter of the ts_map_update function, see Algorithm 3), which is set to 50; the "width of the holes" dug around obstacle points (a fixed value of 600 mm in our implementation); and the scale of the map (1 pixel = 1 cm in our case). In this version, odometry can be ignored or used as a starting point for the Monte Carlo search (in Figures 5 and 6, odometry is ignored, so that odometry and laser positioning can be compared).
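The stand-alone matching step can be illustrated with a self-contained sketch. The synthetic score() below merely stands in for a scan-to-map distance (lower is better); the perturbation magnitudes and the shrink schedule are illustrative assumptions, not the paper's values:

```c
#include <stdlib.h>
#include <math.h>

typedef struct { double x, y, theta; } ts_position_t; /* mm, mm, degrees */

/* Stand-in for a scan-to-map distance: a synthetic cost with its minimum
 * at (500, 300, 10), used only to demonstrate the search. */
static int score(ts_position_t *p)
{
    return (int)(fabs(p->x - 500) + fabs(p->y - 300) + 10 * fabs(p->theta - 10));
}

/* Simple Monte Carlo search: randomly perturb the current best position,
 * keep improvements, and shrink the search radius when progress stalls. */
ts_position_t monte_carlo_search(ts_position_t start, int iterations)
{
    ts_position_t best = start, cur;
    int best_score = score(&best), stall = 0;
    double sigma_xy = 100.0, sigma_theta = 20.0; /* assumed search radii */
    while (iterations--) {
        cur = best;
        cur.x += sigma_xy * (rand() / (double)RAND_MAX - 0.5);
        cur.y += sigma_xy * (rand() / (double)RAND_MAX - 0.5);
        cur.theta += sigma_theta * (rand() / (double)RAND_MAX - 0.5);
        int s = score(&cur);
        if (s < best_score) { best = cur; best_score = s; stall = 0; }
        else if (++stall > 100) { /* stalled: refine the search locally */
            sigma_xy *= 0.5; sigma_theta *= 0.5; stall = 0;
        }
    }
    return best;
}
```

Because the map's hole slopes guide such a search toward the obstacles, even this naive sampler converges quickly in practice.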
The particle-filter-based version is required to manage ambiguous situations, and thus for relocalization (when we start from an existing map instead of an empty one).
It is also the best option for integrating odometry, because we can deal with non-systematic errors (e.g. slippage): by integrating a nonlinear error model with the slip probability set to 10%, we can have 10% of the particles evolve with a high-noise position model while the other 90% evolve with a low-noise odometry model.
The particle filter is also a very good framework for integrating sensors other than odometry and laser. We have also integrated GPS (for outdoor use) and a compass (with no chance of getting good results in our case, as this sensor is very sensitive to electromagnetic noise).
We complete this presentation of CoreSLAM by discussing two problems concerning localization accuracy and the delayed integration of data into the map.
**Sub-pixel accuracy:** even though our map resolution is 1 cm, displacements smaller than 1 cm can be measured, because our ts_distance_scan_to_map function takes many points into account in its computation. Even a one-millimeter displacement can be measured, because some of the laser points will fall on different map pixels.
**Latency:** the concept of latency (the time between a laser scan and its integration into the map) is necessary to measure displacements that are small relative to the map resolution. For example, to correctly measure the displacement of an object moving at 1 cm/s, with a map resolution of 1 cm and a measurement frequency of 10 Hz, one should integrate only one scan out of ten, i.e. use a latency of 10.
The theoretical formula for the latency is:
$$\text{Latency} = \frac{\text{MapResolution} \times \text{MeasurementFrequency}}{\text{RobotSpeed}}$$
With our robot and our map resolution, the formula becomes Latency = 0.1 / RobotSpeed. Our robot operates at rather high speed (2 m/s), and its speed rarely drops below 0.1 m/s; even at that speed, a latency of 1 is enough. Moreover, the sub-pixel accuracy described above makes the task easier.
Therefore, we decided to remove the latency management code from CoreSLAM, but remember that it is necessary when applying CoreSLAM to slowly moving objects. We cannot end this description of the algorithm without discussing loop closing. Our algorithm does not handle loop closing itself, but it can be integrated with any loop-closing algorithm. In fact, we are currently developing a tinyLoopCloser algorithm that should nicely complement CoreSLAM.

Platform Notes

Our experimental platform is the MinesRover (see Figure 2), a homemade robot jointly developed by Mines ParisTech and SAGEM DS. The rover has four driving and steering wheels, plus two odometry wheels (see Figure 3). The mechanical structure of the robot is based on an articulated seesaw linkage, resulting in very good odometry information, because the two freely rotating odometry wheels (at least in theory) always remain in contact with the ground. This design avoids the need for mechanical dampers, which would cause slippage problems.
In the picture we can see the camera dome, the steering servo motors, the Hokuyo URG-04 laser rangefinder, the GPS receiver (gray square) and, at the front of the robot, the emergency ultrasonic sensors.
The robot is powered by a four-cell 4.1 Ah lithium battery (14.8 V). Its four 45 W DC motors can drive it at a maximum speed of 3 m/s.
The positioning accuracy of the GPS receiver can reach 1 meter (using EGNOS satellites). The Hokuyo URG-04 laser sensor provides 10 Hz horizontal scans with a range of 5.6 meters. To retrieve the robot's heading, we combine the compass and GPS readings. The ultrasonic obstacle sensors give a chance of detecting obstacles the laser misses (e.g. stair steps) and are typically used as an emergency stop device. The robot also embeds a 5-axis IMU that can provide information about its yaw rate and inclination (not used in our experiments).
The electronics of the robot are built around the Qwerk module, designed by charmedlabs.com. It is a board based on an ARM9 microprocessor and a Xilinx FPGA.
The robot has four steered driving wheels and two freely rotating odometry wheels fitted with 2000-point encoders. The Qwerk module is located at the center of the robot; its 200 MHz microprocessor is able to reliably manage all the sensors and actuators.
The Qwerk runs the Linux operating system and provides USB support for the GPS and the Hokuyo. The FPGA logic handles the I2C bus, the odometry inputs and the servo motor control. The Hokuyo URG-04LX laser sensor is connected via USB.
Although its price makes it an attractive sensor, it has the following disadvantages:

  • Its maximum range is limited to 5.6 m which, considering the speed of our robot (3 m/s), severely limits its velocity. In our cluttered environment, many measurements lead the sensor to return 0 m (no reflection, or an unexploitable reflection) or are ambiguous (e.g. the lower parts of chairs, or boxes next to walls: the sensor may detect the front of the box or the wall behind it).
  • The measurement frequency (10 Hz) is also limiting. For example, when heading toward a wall at 3 m/s, the wall appears tilted: since the laser sweeps 240°, the last point of the scan is measured 240/360 × 3/10 = 0.2 m, i.e. 20 centimeters, closer than the first (even though the wall is at the same distance). In our robot, we must take this into account, so we apply a correction assuming constant longitudinal and rotational speed during each scan.

Conclusion

The experiments were conducted in the electronics laboratory of the University of Paris. This environment is indeed challenging for the algorithm: it is a cluttered space, with boxes and computers lying on the floor and many tables and shelves (see Figure 9), which are difficult to detect and track with a laser scanner.
Figure 4. Software architecture of the MinesRover robot platform.
On the left is the embedded software; on the right, the software running on a desktop PC. Communication between the rover and the operator is done through an HMI on the desktop PC. The particle filter algorithm runs on the desktop computer, to lighten the load on the robot's ARM9. We use a wireless (MIMO) connection between the robot and the outside, which provides good connection quality.
Figures 5, 6 and 7 show sample results obtained from the same data. On the first two graphs, we see the comparison between the motion estimated by odometry alone and by laser alone, and that they are very similar. Figure 7 is the map constructed by combining laser and odometry information; it exhibits good reconstruction accuracy, with an almost perfect loop closure. Interestingly, a slip observed during this experiment (see Figure 8) was well corrected by the laser information. Note that in our experiments the robot speed reached 2.5 m/s, with yaw rates of 150°/s (measured by laser and confirmed by odometry); such high angular velocities are made possible by the robot's four steering wheels.
Figure 5. Robot speed during the experiment, estimated on one side by odometry only and on the other side by our laser-only CoreSLAM (ignoring odometry). Note the good match between the measurements. We observe a small delay (about 1 scan) in the laser measurements which, considering the nature of the sensor, is predictable. This shows that our odometry is very good (and well calibrated), and that laser-only CoreSLAM is able to cope with the high speed of the robot (reaching 2.5 m/s in this experiment).
In gray, the map of "holes" built by the robot. In red, we overlay all the laser scans performed. In blue, the reconstructed trajectory of the robot. The loop closure here is almost perfect. This map was obtained by combining odometry and laser information (Monte Carlo search).
Figure 8. Map constructed using odometry only.
Note the slippage that occurred when the driver hit the right wall of the laboratory.

Origin blog.csdn.net/chenshiming1995/article/details/104338679