Multi-sensor calibration in the Apollo autonomous driving platform

Sensor calibration is one of the most fundamental and core modules of an autonomous vehicle. As the first service provided by the software layer, its quality and accuracy strongly affect the perception, localization, mapping, PNC (planning and control), and other modules. The Apollo open-source autonomous driving platform provides a rich set of multi-sensor calibration services, covering calibration between sensors such as lidar, inertial navigation, cameras, and Doppler radar. The algorithms cover the sensor configurations and calibration requirements of typical Level 2 to Level 4 autonomous driving.
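The core output of a calibration between two sensors (for example, lidar to inertial navigation) is an extrinsic transform: a rotation plus a translation that maps points from one sensor's frame into the other's. The sketch below is illustrative only and is not part of Apollo's API; the rotation, translation, and point values are made up for demonstration.

```python
import numpy as np

def make_extrinsic(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical extrinsic: a 90-degree yaw plus a lever-arm offset
# (lidar mounted 1.2 m ahead of and 0.8 m above the IMU).
yaw = np.pi / 2
R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0,          0.0,         1.0]])
t = np.array([1.2, 0.0, 0.8])
T_imu_lidar = make_extrinsic(R, t)

# Transform one lidar point (homogeneous coordinates) into the IMU frame.
p_lidar = np.array([10.0, 0.0, 0.0, 1.0])
p_imu = T_imu_lidar @ p_lidar
```

Any error in this transform is applied to every point of every scan, which is why calibration quality propagates directly into perception and localization accuracy.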

In this article, we start with two core services in **L4 sensor calibration (lidar-to-inertial-navigation calibration and camera-to-lidar calibration)** and walk through the Apollo calibration workflow in detail, along with related precautions and analysis of common problems. We hope Apollo developers and partners can learn from it and successfully complete high-quality sensor calibration.
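For the camera-to-lidar case, the calibrated extrinsic is typically combined with the camera intrinsics to project lidar points onto the image plane, which is also how such a calibration is usually verified visually. The sketch below is a simplified, hypothetical pinhole projection, not Apollo code; the intrinsic matrix, extrinsic offset, and point are invented, and the lidar frame is assumed to already be camera-oriented (z forward) for brevity.

```python
import numpy as np

# Made-up pinhole intrinsics: focal length 1000 px, principal point (640, 360).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

# Hypothetical camera-from-lidar extrinsic: frames aligned except a 0.1 m x-offset.
T_cam_lidar = np.eye(4)
T_cam_lidar[0, 3] = 0.1

# A lidar point 5 m in front of the camera, 2 m to the side (homogeneous).
p_lidar = np.array([2.0, 0.0, 5.0, 1.0])

p_cam = (T_cam_lidar @ p_lidar)[:3]  # into the camera frame
uv_h = K @ p_cam                     # project through the intrinsics
uv = uv_h[:2] / uv_h[2]              # perspective divide -> pixel coordinates
```

If the extrinsic is off by even a small rotation, projected lidar points visibly drift off object edges in the image, which is the usual symptom of a poor camera-to-lidar calibration.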

TIPS

For this sharing, we invited Chen Guang, senior software architect on the Baidu Meiyan Apollo perception team, to explain multi-sensor calibration in the Baidu Apollo autonomous driving platform.

Chen Guang, Baidu Meiyan Apollo perception team, senior software architect. He previously joined the Stanford Research Institute in the United States as a computer vision research and development scientist, where, as project co-leader (Co-PI) and technical lead, he participated in and led projects for the **U.S. Defense Advanced Research Projects Agency (DARPA)**.

Original post: blog.csdn.net/qq_25439417/article/details/131467838