Apollo Advanced Course ⑤ | An Introduction to the Apollo Hardware Development Platform

Original | Awa Jun, Apollo Developer Community, 2019-01-07

Last week, Awa Jun walked everyone through the basics of Baidu's Apollo open platform. In this issue, Awa Jun introduces the Apollo hardware development platform.

Without further ado, welcome to the fifth Advanced Course.

Uber Accident Analysis

The accident occurred on the evening of March 18, 2018: **an Uber vehicle undergoing driverless testing in Tempe, Arizona, struck a pedestrian.** The pedestrian was taken to hospital and later pronounced dead.

According to the preliminary findings, the Uber vehicle was in autonomous mode when it struck the pedestrian. This was the first case in history of an autonomous vehicle killing a pedestrian on a public road, and it sounded a safety alarm for driverless technology.

On June 22, 2018, the US National Transportation Safety Board issued its accident report:

Six seconds before the accident, the system's sensors had already detected the pedestrian. One second before the accident, the original car's AEB (automatic emergency braking) was triggered, but the car did not brake: when Uber modified the Volvo XC90, the original braking system was cut off, so that after the modification braking commands had to be transmitted by the onboard computer instead.

  • The system did not form a complete closed loop (the main cause);
  • The driver was looking down at her phone, and the system gave no warning after detecting the pedestrian;
  • The autonomous-driving infrastructure was inadequate: lighting was poor, and photos taken of the road four seconds before the accident show no visible pedestrian.

Uber had run into other traffic problems before this, such as vehicles being sideswiped and even rolling over outright.

The reason is that the sensor Uber retrofitted onto the vehicle (a Velodyne 64-line lidar) is relatively heavy, and an SUV's center of gravity is already high; installing the sensor shifts the center of gravity upward, making the vehicle prone to rollover in sharp turns.


The First Rule of Autonomous Driving: Safety

Viewed as a process, autonomous-driving development can be divided into the following four steps:

  1. Software in the loop
    Software-in-the-loop simulation is based on simulation and modeling software, rather like a racing game: the software simulates a real road environment, including light, weather, and other natural conditions. Once developers finish writing autonomous-driving code, they run it in the simulation system to test whether it achieves its goals (see the toy sketch below).
  2. Hardware in the loop
    Hardware-in-the-loop is based on the necessary hardware platform. After the software simulation of the first step, all the sensors and computing units are brought together and the simulation results are tested in a hardware environment.
  3. Vehicle in the loop
    Vehicle-in-the-loop testing is performed on a real vehicle. After the hardware-environment tests of the second step are complete, the third step is carried out: developers test the developed functions in a closed environment, where no traffic flow interferes.
  4. Driver in the loop
    Driver-in-the-loop testing is based on actual roads. Once the third step passes, testing moves to driver-in-the-loop, which studies the interaction among the four elements of human, vehicle, road, and traffic. It not only tests the autonomous-driving code but also gathers professional judgment from drivers.

The four steps above make up the complete autonomous-driving R&D process; following this process ensures that autonomous driving is adequately safe.
Safety is the first rule of autonomous driving.
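To make step 1 concrete, here is a toy software-in-the-loop sketch in Python. It is purely illustrative and is not Apollo's simulator: the "plant" is a one-dimensional vehicle model, `controller` stands in for the autonomous-driving code under test, and the loop checks whether the car stops before an obstacle.

```python
# Toy software-in-the-loop test: run the control code against a simulated
# 1-D vehicle and check it stops before the obstacle. Illustrative only.
DT = 0.05  # simulation time step, seconds

def controller(gap_m: float, speed_mps: float) -> float:
    """Code under test: brake hard once the gap falls below a 2 s headway."""
    return -6.0 if gap_m < speed_mps * 2.0 else 0.0

def simulate(start_gap_m: float = 60.0, speed_mps: float = 20.0) -> bool:
    gap, v = start_gap_m, speed_mps
    while v > 0.0:
        v = max(0.0, v + controller(gap, v) * DT)  # apply commanded accel
        gap -= v * DT                              # vehicle moves forward
        if gap <= 0.0:
            return False  # collided in simulation
    return True           # stopped with margin to spare

print("SIL test passed:", simulate())
```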

Autonomous Vehicle Hardware Systems

Autonomous driving is divided into three systems: perception, decision, and control; each system has corresponding hardware.
The perception system is divided into three parts: vehicle motion, environment perception, and driver monitoring.

  • Vehicle motion covers inertial navigation, speed sensors, angle sensors, and global positioning systems.
  • Environment perception covers lidar, ultrasonic sensors, cameras, millimeter-wave radar, and V2X.
  • Driver monitoring covers cameras and bioelectric sensors.

The decision system is divided into three parts: the computing unit, the T-BOX, and the black box.

The computing unit runs the autonomous-driving perception, decision, and control algorithms. At present, autonomous driving uses x86-based servers or industrial PCs.

T-BOX stands for Telematics BOX, the vehicle's networked communications gateway: it connects upward to the Internet and downward to the CAN bus. For example, when a phone app sends a door-open command, the T-BOX gateway forwards the operating instruction onto the CAN bus to execute the control.
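To illustrate the gateway idea, here is a hedged sketch using the python-can library to put one frame on a CAN bus. The arbitration ID and payload are invented for illustration; real door-control messages are vendor-specific and proprietary.

```python
import can  # the python-can package

def forward_door_open(bus: can.BusABC) -> None:
    """Forward a (hypothetical) door-open command from the network to CAN."""
    msg = can.Message(
        arbitration_id=0x1A0,  # hypothetical body-control ID
        data=[0x01],           # hypothetical "open" payload
        is_extended_id=False,
    )
    bus.send(msg)

if __name__ == "__main__":
    # Assumes a Linux SocketCAN interface named "can0".
    with can.interface.Bus(channel="can0", bustype="socketcan") as bus:
        forward_door_open(bus)
```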

The black box records all information and status throughout driverless operation.

The control system is divided into vehicle control and warning systems.

Vehicle control is divided into braking, steering, engine, and transmission. The warning system is divided into sound, image, and vibration.

This is the hardware architecture of the entire autonomous-driving system.

Perception Sensors for Autonomous Vehicles

Cameras are mainly used to detect lane markings, traffic signs, traffic lights, vehicles, and pedestrians. Their advantages are comprehensive detection information and low cost; their disadvantage is that performance is affected by the weather.

A camera mainly consists of a lens, lens module, filter, CMOS/CCD sensor, ISP (image signal processor), and data-transfer parts. Cameras are divided into monocular and binocular.

The basic working principle of a camera: light from in front of the camera passes through the lens and filter and is focused onto the CMOS sensor behind them.

The sensor's exposure converts the optical signal into an electrical signal, which the ISP then converts into a standard RGB or YUV image data format and finally transfers to the back-end computer for processing.
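For readers unfamiliar with YUV: it separates brightness (luma) from color (chroma). Here is a minimal sketch of the standard BT.601 conversion for a single pixel, with values normalized to 0..1:

```python
def rgb_to_yuv(r: float, g: float, b: float):
    """BT.601 conversion for one pixel; inputs and outputs in the 0..1 range."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma (brightness)
    u = 0.492 * (b - y)                    # blue-difference chroma
    v = 0.877 * (r - y)                    # red-difference chroma
    return y, u, v

print(rgb_to_yuv(1.0, 0.0, 0.0))  # pure red: low luma, strong V component
```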

The core principle of lidar is TOF (Time of Flight): an emitted light beam hits an obstacle, the light is reflected back and received on an APD (avalanche photodiode), and the distance is calculated from the round-trip travel time of the light.
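Because the pulse travels out and back, the range is simply half the round-trip time multiplied by the speed of light, d = c·Δt/2. A minimal sketch:

```python
C = 3.0e8  # speed of light, m/s

def tof_range_m(round_trip_s: float) -> float:
    """TOF ranging: distance = speed of light * round-trip time / 2."""
    return C * round_trip_s / 2.0

print(tof_range_m(1e-6))  # a 1 microsecond echo corresponds to 150.0 m
```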

By scanning principle, lidar can be divided into coaxial rotating, rotating prism, MEMS, OPA phased-array, and Flash types. Lidar is not only used for perception; surveying and mapping applications also need it.

Millimeter-wave radar is mainly used to detect moving vehicles. It mainly consists of an RF antenna, chips, and algorithms; the basic principle is to emit a beam of electromagnetic waves and observe the difference between the emitted wave and the returning echo to calculate distance and velocity.

Its advantages are accurate, fast speed measurement and immunity to weather interference; its disadvantage is that it cannot detect or recognize lane lines.
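As a hedged aside, the textbook FMCW relations behind this are easy to state: the beat frequency between the transmitted chirp and its echo gives range, and the Doppler shift gives radial velocity. The sketch below uses the standard formulas with an assumed 77 GHz carrier (typical of automotive radar); it is illustrative, not any specific radar's API.

```python
C = 3.0e8  # speed of light, m/s

def fmcw_range_m(beat_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    """Range from beat frequency: R = c * f_b * T / (2 * B)."""
    return C * beat_hz * chirp_s / (2.0 * bandwidth_hz)

def radial_velocity_mps(doppler_hz: float, carrier_hz: float = 77e9) -> float:
    """Radial velocity from Doppler shift: v = f_d * c / (2 * f_c)."""
    return doppler_hz * C / (2.0 * carrier_hz)

# Example: a 200 kHz beat with a 300 MHz chirp swept over 1 ms -> 100.0 m.
print(fmcw_range_m(2e5, 3e8, 1e-3))
```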

Satellite navigation works by having the GNSS board receive the signals of all visible GPS satellites and solve for the spatial position of the target in a geodetic coordinate system.

When the vehicle passes through a tunnel, or the road is blocked by building clusters or tree cover, the blocked GPS signal cannot deliver a good real-time navigation solution, so in these cases inertial-navigation information must be fused in.

Inertial navigation is a completely self-contained system, free from outside influence; it can directly output the vehicle body's position, velocity, and attitude.
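A minimal one-dimensional complementary-filter sketch of this GNSS/INS fusion idea (illustrative only, not Apollo's localization module): the IMU propagates the state every step, and whenever a GNSS fix is available the estimate is nudged toward it; in a tunnel the fix is simply `None` and dead reckoning carries on.

```python
def fuse_step(pos_m: float, vel_mps: float, accel_mps2: float,
              gnss_pos_m, dt: float = 0.01, gain: float = 0.02):
    """One fusion step: inertial propagation plus optional GNSS correction."""
    vel_mps += accel_mps2 * dt   # integrate IMU acceleration
    pos_m += vel_mps * dt        # dead-reckon position
    if gnss_pos_m is not None:   # GNSS visible: pull estimate toward the fix
        pos_m += gain * (gnss_pos_m - pos_m)
    return pos_m, vel_mps
```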

Typical sensor mounting positions on an autonomous vehicle:

  • Lidar rotates 360°, so it is mounted on the roof;
  • Millimeter-wave radar is highly directional, so it is generally mounted on the front and rear bumpers;
  • Considering interference and the pitch attitude of the body on the road, the navigation system is generally installed on the central axis between the two wheels;
  • Cameras are mounted to give the body 360° coverage.

Autonomous Vehicle Sensors

(Figure: a summary of the sensors used for autonomous driving.)

Automatic cruise, emergency braking, and pedestrian detection are L1- and L2-level functions.

European standards made the AEB function mandatory for vehicles in 2017; American standards require all cars to have driver assistance, lane-departure warning, and AEB by 2020;

China's standards make AEB, an L1-level autonomous-driving function, mandatory from 2018. These standards are mandatory for commercial vehicles, trucks, and buses.

The following briefly describes the difference between current mass-production L1/L2 programs and the L3+ programs under development at Baidu and many AI companies.

L1 and L2 levels fear sensor false detection: if, for example, a false detection occurs while driving and the car brakes, the driving experience becomes poor. L1/L2 algorithms are therefore tuned to keep the false-detection rate down.

L3 cares more about sensor missed detection: since the system, not the human, is the primary driver, the system must not allow the sensors to leave anything undetected.

This is the major difference in sensor philosophy between traditional automakers and the AI companies working on autonomous driving.
Current L4 autonomous driving is adapted to certain urban roads and highways. The highway speed limit is 120 km/h; the braking distance at each speed is calculated from the road friction coefficient, plus the overall system response time, using the formula behind the table below.
(Table: braking distances calculated for various speeds.)

Current autonomous-driving system response times are within 500 ms; hydraulic car brakes need 0.3 to 0.5 seconds to apply, and truck air brakes need 0.8 seconds.

Most cars currently on sale meet this technical indicator, which shows that the performance of production cars is very good. As for sensor requirements, being able to measure out to 150 m is sufficient.
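Putting the numbers together as a hedged sketch: stopping distance is the distance covered during the system's response time plus the friction-limited braking distance, d = v·t + v²/(2μg). The friction coefficient of 0.7 (dry asphalt) and the 0.5 s response budget below are illustrative assumptions.

```python
G = 9.8  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_kmh: float, mu: float = 0.7,
                        reaction_s: float = 0.5) -> float:
    """d = v * t_react + v^2 / (2 * mu * g); mu and t_react are assumptions."""
    v = speed_kmh / 3.6
    return v * reaction_s + v ** 2 / (2.0 * mu * G)

# At the 120 km/h highway limit this is about 98 m, comfortably inside 150 m.
print(round(stopping_distance_m(120.0), 1))
```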

The underlying formula is a trigonometric arctangent, but it is divided by 2 to avoid missed detections: when the angle between two adjacent laser beams exactly subtends an object, an object sitting at the edge between beams can be missed, so halving the required angular resolution guarantees a hit at every angle.

At a resolution of 0.4°, we can in fact detect a person, vehicle, or rider 100 meters away.

At a resolution of 0.1°, we can in fact detect a person, vehicle, or rider 400 meters away.
Detectable, however, does not mean the autonomous-driving system can identify it: with only a single scan line on the object, or low-resolution imaging, the system cannot identify it.

On Baidu's current Apollo platform, an obstacle must be covered by 4 to 5 laser lines before the vehicle can classify it well.

Taking the Velodyne 64-line lidar with its 0.4° resolution as an example, its effective perception distance for such objects is 50 meters.

The future trend for autonomous-driving sensors: autonomous driving cannot do without multi-sensor fusion. Lidar and cameras are both optical-class sensors, and their core components and processing circuits are very similar.

In the future it may be possible to fuse lidar and camera at the front end, directly outputting RGB color information plus XYZ point-cloud data, which is then passed to the back end for computation.

The US start-up AEye has developed the iDAR system, which not only captures true color information of the two-dimensional world but can also overlay the point cloud, so that each pixel carries color information as well as spatial coordinates.

The Autonomous Vehicle Computing Unit

(Figure: the computing-unit architecture of an autonomous vehicle.)

In the computing-unit portion of an autonomous vehicle, the design must satisfy the vehicle-wide requirements of the ISO 26262 standard and account for electromagnetic interference and vibration.

Every CPU, GPU, FPGA, MCU, and bus must be made redundant to prevent single points of failure.

Current computing units use a centralized architecture, meaning all the work is packed into one IPC (industrial PC).

The disadvantages of this architecture are bulk and high power consumption, which are unsuited to future mass production; the advantages are convenient, fast code iteration, and the IPC's slot-based hardware design, which makes updates and expansion easy.

Because of the drawbacks of the centralized approach, embedded solutions will be considered in the future: raw sensor data first goes into a Sensor Box, which completes the data fusion, and the fused data is then handed to the back-end computing platform for processing.

The role of the Sensor Box: to judge whether the raw data currently presented by different sensors belongs to the same fused target, the timestamps must be synchronized; the timestamp guarantees that every sensor's detection refers to the same coordinate system at the same moment, and this timestamp synchronization is done inside the Sensor Box.
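The alignment step can be sketched as nearest-timestamp matching within a tolerance, as below. This only illustrates the concept; production Sensor Boxes synchronize with hardware trigger lines and protocols such as PTP rather than in software after the fact.

```python
def align(lidar: list, camera: list, tol_s: float = 0.005) -> list:
    """Pair (timestamp, data) records from two sensors by nearest timestamp."""
    pairs = []
    for t, scan in lidar:
        nearest = min(camera, key=lambda rec: abs(rec[0] - t), default=None)
        if nearest is not None and abs(nearest[0] - t) <= tol_s:
            pairs.append((scan, nearest[1]))  # same instant, same target
    return pairs
```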

Splitting the fusion function out of the original centralized scheme reduces the overall system's power consumption, but it is still not mass-production-ready.

Chip Design Flow
The following describes the chip design flow.

Once the perception algorithms we develop for autonomous driving have hardened, they can be turned into a dedicated chip.

An ASIC is a chip custom-built for specific needs; compared with general-purpose GPUs and FPGAs, its advantages are smaller size, lower power consumption, stable performance, and suitability for mass production.

The semiconductor industry's division of labor is now very mature. An autonomous-driving algorithm company only needs to do the chip's front-end design, e.g. harden the algorithm, select suitable IP cores, and finally complete the circuit design with EDA (electronic design automation) tools; the finished design is then handed to a back-end chip manufacturer such as TSMC for tape-out and production.

The chip-making process divides into three integral parts: chip design, chip fabrication, and chip packaging. The whole semiconductor industry is now moving from deep ultraviolet (DUV) to extreme ultraviolet (EUV) lithography.

Semiconductors are entering the 7-nanometer era, and the new process improves performance a great deal: compared with 16 nm, the 7 nm process delivers about 40% more performance and 60% energy savings.

Drive-by-Wire Systems for Autonomous Driving

An autonomous vehicle's by-wire control system means the car's controls are completed by a few simple electronic commands rather than by physical operation.

The by-wire portion corresponds to a person's hands and feet, executing the commands issued by the upper-level system. It is divided into three parts: braking control, steering control, and speed control.

In traditional cars these controls are completed with the assistance of a vacuum booster pump and a hydraulic system; an autonomous vehicle's controls need to be done by electric by-wire components, such as an electro-hydraulic brake system (EHB).
(Figure: Continental's braking solution.)

The figure shows Continental's braking solution. The MK C1 integrates the hydraulics and the brake module; its compact, lightweight design saves a braking unit, and because the braking signal is transmitted electrically, the braking distance is also shorter.

The MK100 ESC (electronic stability control) and the MK C1 can back each other up: when the MK C1 fails, the MK100 takes over.

As the schematic shows, the power supply, brake lines, and signal lines in Continental's design are all dual-redundant, which greatly improves safety; but this system is only suitable for passenger cars. Commercial vehicles such as trucks and buses brake with air-brake systems.

At present, many autonomous vehicles use EPS (electric power steering). EPS couples an electric motor to the rack directly below the steering column and is operated by electric control.

In the Infiniti Q50's steering system, the steering column is severed by a clutch: when the vehicle starts, the clutch disengages, and all autonomous-driving commands are transmitted by the ECU (electronic control unit) to the two steering motors at the lower end, which perform the steering control.

Throttle-by-wire controls the autonomous vehicle's acceleration: a position sensor detects how far the accelerator pedal is depressed and transmits that to the EMS (engine management system); the wider the intake valve opens, the faster the acceleration.
(Figure: the by-wire system of an autonomous vehicle.)

Most autonomous vehicles today are new-energy vehicles, which control acceleration by controlling the drive motor's torque. Viewed across the whole by-wire landscape, development divides into three stages:

  1. 1.0: Modify the original car's steering wheel and pedals: cut the steering column, install a steering motor, and steer by controlling that motor. The drawback is that the result has not gone through the original car's testing and validation, so there are safety risks.
  2. 2.0: Build on the original car's driver-assistance features by cracking the CAN bus protocol, then control the vehicle's steering and braking through the original vehicle's bus commands.
  3. 3.0: Develop a by-wire chassis from scratch, fully customized to autonomous-driving requirements; the difference from 2.0 is that redundancy and backup needs are designed in.

On July 4, 2018, the Apollo hardware development platform was officially released, adding 15 hardware vendors to the selection list and also releasing the Apollo Sensor Unit.

After adding the underlying abstraction layer (the original hardware reference design upgraded into a hardware development platform), the hardware development platform is much richer.

Developers had pointed out that the hardware in Baidu's reference design was either hard to buy or had very long supply cycles.

Based on these demands, we have enriched the hardware selection: sensors and industrial control units that we have tested and validated are published, making them convenient for developers to buy.
(Figure: the Apollo hardware development platform.)
Within the reference designs Baidu currently provides, we divide certification into Apollo platform certification and Apollo hardware development platform certification.

Apollo platform certification refers to sensors that Baidu itself uses and has certified after release. For example, the Velodyne 64-line lidar is an Apollo-platform-certified product, and we provide data sets collected with the sensors in use.

Apollo hardware development platform certification means the hardware is verified at the code level in Apollo; the extra work, such as collecting and labeling training data for the perception module and training models, must be completed by developers themselves.

Going forward, Apollo will keep enriching this ecosystem and continue to provide support and selection for chips and sensors.

The Sensor Unit (Sensor Box) fuses the information from all sensors, completes whole-system timestamp alignment, and pre-processes the data before transmitting it to the back-end IPC computing unit.

This unit was developed around the sensors Baidu uses, so it does not necessarily suit every developer. Apollo will subsequently launch the AXU expansion unit, which comes with PCI slots and will be more flexible.
(Figure: the AXU expansion unit.)
The Apollo abstraction layer contains the hardware interfaces, such as kernel drivers and the USP Library (user-space library).

The USP Library (user-space library) is mainly used for the CAN bus protocol. Since every automaker, model, and even production batch uses a different CAN bus protocol, the control-instruction details are written into the USP Library, which handles the manipulation.
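The idea can be sketched as a lookup from vehicle model to a message encoder: upper-layer code issues one abstract command, and the library emits the model-specific CAN frame. Every ID and payload below is hypothetical, standing in for the per-vehicle protocols the text describes.

```python
# Hypothetical per-model encoders illustrating a user-space CAN library.
ENCODERS = {
    "model_a": (0x2E4, lambda pct: bytes([int(pct * 255 / 100)])),
    "model_b": (0x1C2, lambda pct: bytes([0x10, int(pct)])),
}

def encode_brake_command(model: str, percent: float):
    """One abstract 'brake percent' command, a different frame per vehicle."""
    can_id, build = ENCODERS[model]
    return can_id, build(percent)

print(encode_brake_command("model_a", 40.0))  # -> (740, b'f'), i.e. ID 0x2E4
```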

The Apollo development platform also has a HAL (hardware abstraction layer), an intermediate layer that prevents a failure in any single piece of hardware from crashing the system kernel. Hardware manufacturers can choose to open all of their source code or to publish compiled code on the Apollo platform.

Once Apollo finishes the work of integrating the code into the kernel, it will be published on GitHub, so developers will not need to write different drivers for different hardware selections.
(Figure: the autonomous-driving industry map.)
Finally, here is the autonomous-driving industry-chain layout released by VSI.

Autonomous driving aggregates multiple trillion-scale industries: new-energy vehicles, IT, transportation, telecommunications, semiconductors, artificial intelligence, and the mobile Internet.

An autonomous vehicle is an aggregation of material flow, energy flow, and information flow; deep integration and cooperation between the software and hardware industries is needed for the autonomous-driving industry to land successfully.

﹏﹏﹏﹏ END ﹏﹏﹏﹏
