Overview
The 2021 Shanghai Auto Show was dazzling. But don't get carried away: although I am an experienced driver, this time I am not talking about the models on the stands but about the cars themselves. From the perspective of autonomous driving, six popular models were the stars of this show, each sharp and eye-catching in its own way. Since I am fond of martial arts novels, allow me an analogy: taken together, the six vehicles form the "Six Meridians Divine Sword", a peerless martial art worthy of everyone's pursuit:
1) Jihu Alpha S Huawei HI – Zhongchong Sword: grand and majestic, with wide, sweeping strokes
2) R Auto ES33 – Shangyang Sword: nimble, flexible, and elusive
3) Zeekr 001 – Guanchong Sword: wins through simplicity and apparent clumsiness
4) Zhiji L7 – Shaochong Sword: light and quick
5) Xiaopeng P5 – Shaoze Sword: darts back and forth with subtle variations
6) NIO ET7 – Shaoshang Sword: broad and powerful, earth-shattering, like gathering wind and rain
Sword manual contents:
Appendix Table 1. Summary of autonomous-driving-related configurations
Appendix Table 2. Lidar performance parameters
1. Zhongchong Sword: BAIC Jihu Alpha S Huawei HI (estimated delivery: Q4 2021)
1. Autonomous driving perception system
Solution: camera + millimeter-wave radar + lidar, fused perception across multiple heterogeneous sensors
Sensor configuration: lidar*3 + millimeter-wave radar*6 + ADS camera*9 + surround-view camera*4 + ultrasonic radar*12 + in-car monitoring camera*1
a. Performance parameters of the mass-produced lidar (Huawei 96-line medium/long-range lidar):
Type: rotating mirror scanning hybrid solid-state lidar
Detection distance: 150 meters @10% reflectivity
FOV(H/V):120°*25°
Angular resolution: 0.25°*0.26°
Laser light source: 905nm
Arrangement location: 1 in the middle of the bumper and 1 on each side
Huawei 96-line medium and long-range lidar
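The "96-line" name is consistent with the quoted optics: dividing the vertical FOV by the vertical angular resolution gives the scan-line count, and multiplying by the horizontal steps gives a rough points-per-frame figure. A quick sanity check (the line-count approximation is mine, not Huawei's published method):

```python
# Rough geometry check on the lidar specs quoted above.
# Assumption: "lines" ~= vertical FOV / vertical angular resolution.
def scan_lines(v_fov_deg: float, v_res_deg: float) -> int:
    """Approximate number of scan lines from vertical FOV and resolution."""
    return round(v_fov_deg / v_res_deg)

def points_per_frame(h_fov_deg: float, h_res_deg: float,
                     v_fov_deg: float, v_res_deg: float) -> int:
    """Approximate number of points in one full frame."""
    return round(h_fov_deg / h_res_deg) * scan_lines(v_fov_deg, v_res_deg)

lines = scan_lines(25, 0.26)                        # ≈96, matching "96-line"
points = points_per_frame(120, 0.25, 25, 0.26)      # ≈46,000 points/frame
print(lines, points)
```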
b . Arrangement of 9 ADS cameras
Front-view camera*4 (1 binocular + 1 telephoto + 1 wide-angle): front windshield
Side view camera*4: Side front view*2 (exterior rearview mirror base) + side rear view*2 (fender)
Rearview camera*1: middle part of rear wing
ADS camera layout diagram
2. Autonomous driving system computing platform: MDC810
Supports two computing-power tiers: 400 TOPS / 800 TOPS
High integration: integrates combined inertial navigation and the ultrasonic radar ECU
Flexible design: supports separate liquid cooling and a pluggable SSD
Overall dimensions: 300 × 171 × 35 mm
Sensor support: up to 16 cameras + 8 lidars + 6 millimeter-wave radars + 12 ultrasonic radars
High reliability: EMC Class 5, IP67 dust and water protection, -40°C to +85°C ambient temperature
3. Redundant design of autonomous driving system
Redundant design : steering, power, communication, braking, perception sensors
If the main system fails, the vehicle switches to the redundant system immediately. For example:
a. Braking redundancy: when the main braking system fails, the backup braking system takes over within milliseconds to provide emergency braking.
b. Power supply redundancy: if the main battery fails and can no longer supply power, the system automatically switches to the backup battery, which can provide emergency power for up to 3 minutes, giving the driver enough time to pull over and significantly reducing the probability of an accident.
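The fail-then-switch behavior described above can be sketched as a simple failover wrapper. This is an illustrative toy, not Huawei's production logic (which is unpublished); the class and function names are mine:

```python
class RedundantActuator:
    """Toy failover wrapper: retry a command on the backup path if the
    primary path raises. In a real ECU this switch happens within a
    few milliseconds on dedicated hardware."""

    def __init__(self, primary, backup):
        self.primary, self.backup = primary, backup
        self.active = primary

    def command(self, demand):
        try:
            return self.active(demand)
        except Exception:
            # Primary path failed: latch over to the backup and retry.
            self.active = self.backup
            return self.active(demand)

def primary_brake(demand):
    # Simulated fault in the main hydraulic unit.
    raise RuntimeError("primary hydraulic unit offline")

def backup_brake(demand):
    return f"backup braking at {demand:.0%}"

brakes = RedundantActuator(primary_brake, backup_brake)
print(brakes.command(0.8))  # → backup braking at 80%
```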
4. Autonomous driving functions and application scenarios
Functions and application scenarios
Schematic diagram of special complex working conditions scenarios
2. Shangyang Sword: SAIC R Auto ES33 (estimated delivery: H2 2022)
1. Autonomous driving perception system
Perception system solution: camera + millimeter-wave radar + lidar, fused perception across multiple heterogeneous sensors
Sensor configuration: lidar*1 + 4D imaging millimeter-wave radar*2 + traditional 3D millimeter-wave radar*6 + ADS camera*7 + surround-view camera*4 + in-car monitoring camera*1 + ultrasonic radar*12
ES33 six-fold integrated sensing system
a . Lidar (Luminar Iris) performance parameters:
Type : Dual-axis rotating mirror scanning hybrid solid-state lidar
Detection distance : 250m@10% reflectivity
FOV(H/V):120°*30°
Laser light source : 1550nm
Point cloud density : equivalent to 300 lines
R Automotive ES33 lidar layout diagram
b. 4D imaging radar (ZF PREMIUM) performance parameters:
Detection distance: 350 m
Operating frequency: 77 GHz band
FOV (horizontal): ±60°
Working mode: FMCW (frequency-modulated continuous wave)
Transmit/receive channels: 192, providing thousands of data points per measurement cycle
Accurately identifies small and static objects: in tests, vehicles using this radar have detected a Coke can at up to 140 meters
Road boundary detection and free space measurement
R Auto ES33-4D Imaging Millimeter Wave Radar
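In FMCW operation, each target's range is recovered from the beat frequency between the transmitted and received chirps. A minimal sketch of that relationship; the sweep bandwidth and chirp duration below are illustrative placeholders, since ZF has not published the PREMIUM's chirp parameters:

```python
# FMCW ranging sketch: R = c * f_beat * T_chirp / (2 * B_sweep).
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_hz: float, sweep_bw_hz: float, chirp_s: float) -> float:
    """Target range from the beat frequency of one FMCW chirp."""
    return C * beat_hz * chirp_s / (2 * sweep_bw_hz)

# Example with assumed parameters: a 1 GHz sweep over 50 us.
# A target at 100 m produces a beat frequency f_b = 2*R*B/(c*T),
# which the formula above inverts back to range.
beat = 2 * 100.0 * 1e9 / (C * 50e-6)
print(round(fmcw_range(beat, 1e9, 50e-6), 1))  # → 100.0
```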
2. Autonomous driving system computing platform
Supercomputing chip: NVIDIA DRIVE AGX Orin
Scalable computing power: 500–1000 TOPS
Algorithm: super environment model, building a complete digital mirror of the surroundings
Efficient data closed loop
ES33 Super Brain
3. Autonomous driving functions and application scenarios
ES33 is the first model based on the R-TECH technology concept. R-TECH is R Auto's new technology brand, covering core technologies in intelligent driving, the intelligent cockpit, and the "three-electric" systems (battery, motor, and electronic control); PP-CEM, the intelligent driving solution, is one part of R-TECH.
PP-CEM - Full-stack self-developed high-end intelligent driving solution: six-fold integrated perception system + super brain
PP-CEM high-end intelligent driving solution
Functions and application scenarios:
The official details of the ES33's autonomous driving functions have not yet been disclosed, but from the SAIC R Brand Co-Creator Ecosystem Conference it is clear that PP-CEM is positioned as an all-scenario, all-weather solution.
Combined with the environmental sensors the ES33 carries (lidar and 4D millimeter-wave radar) and its high-compute platform (NVIDIA DRIVE Orin), it is easy to infer that the ES33's autonomous driving functions will cover the three scenarios the industry currently converges on: highways, urban roads, and low-speed parking.
3. Guanchong Sword: Geely Zeekr 001 (estimated delivery: Q4 2021)
1. Autonomous driving perception system
Perception system solution: vision-centric perception (Super Eagle Eye system, VIDAR)
Sensor configuration: long-range millimeter-wave radar*1 + 8MP ADS HD camera*8 + streaming-media camera*1 + surround-view camera*4 + in-car detection camera*2 + ultrasonic radar*12
8 ADS high-definition cameras (8MP): front view*3 (front windshield: 1 binocular + 1 telephoto monocular) + side rear view*2 (fenders) + side front view*2 (exterior rearview mirror base) + rear view*2 (rear of roof)
ADS HD camera layout diagram
2. Autonomous driving system computing platform: Mobileye EyeQ5H*2
Process: TSMC 7nm FinFET
Computing power: 24 TOPS per chip
Autonomous driving system hardware and system architecture
3. Autonomous driving functions and application scenarios
Functions and application scenarios
"Skeleton recognition" and full scene recognition:
1) Pedestrian gesture intent recognition – “Skeleton Recognition”
Principle: the new "skeleton recognition" technology, built on a cross-camera vision fusion algorithm platform, locates human body keypoints, recognizes posture, and predicts behavior. For example, when a pedestrian suddenly raises a hand, "skeleton recognition" can judge the intent: is the person asking you to stop, greeting an acquaintance, or about to make a phone call?
“Skeleton Recognition” Technology
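As a toy illustration of the keypoint-to-intent idea (not Geely's actual algorithm; the keypoint names and the single rule here are assumptions for illustration), a first-stage rule on 2D pose output might flag a raised hand like this:

```python
# Toy single-frame rule over 2D pose keypoints: a wrist above the
# corresponding shoulder is treated as a raised hand. A real system
# would classify full pose sequences, not one frame with one rule.
def hand_raised(keypoints: dict) -> bool:
    """True if either wrist is above its shoulder.

    keypoints maps names to (x, y) in image coordinates,
    where a smaller y means higher in the image.
    """
    for side in ("left", "right"):
        wrist = keypoints.get(f"{side}_wrist")
        shoulder = keypoints.get(f"{side}_shoulder")
        if wrist and shoulder and wrist[1] < shoulder[1]:
            return True
    return False

pedestrian = {
    "left_shoulder": (100, 200), "left_wrist": (95, 150),   # raised
    "right_shoulder": (140, 200), "right_wrist": (150, 260),
}
print(hand_raised(pedestrian))  # → True
```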
2) The Super Eagle Eye system, fused with high-precision map data, can recognize many environments: irregular vehicles, opening doors, foreign objects, suburban road sections, construction zones and tunnels, trees and bushes, etc.
Full scene recognition, full scene prediction
4. Shaochong Sword: SAIC + Alibaba + Zhangjiang Hi-Tech – Zhiji L7 (estimated delivery: Q1 2022)
1. Autonomous driving perception and decision-making
Zhiji L7 offers two autonomous driving solutions:
1) Standard solution: visual perception based on high-definition cameras
a. Sensor configuration: camera*12 (ADS camera*7 + surround-view camera*4 + in-car monitoring camera*1) + millimeter-wave radar*5 + ultrasonic radar*12
Zhiji Automotive Vision Solutions
7 ADS cameras (5MP): front view*2 (front windshield) + side rear view*2 (fender) + side front view*2 (B-pillar) + rear view*1 (upper rear license plate)
ADS camera layout diagram
b. Autonomous driving computing platform: NVIDIA DRIVE Xavier, with 30+ TOPS of computing power
2) Next-generation upgrade solution: camera + millimeter-wave radar + lidar, fused perception across multiple heterogeneous sensors
a. Sensor configuration:
On top of the standard solution, the vision cameras are upgraded from 5MP to 10MP
Two semi-solid-state, high-line-count lidars are added
Next-generation upgrade solution for smart driving
b. Autonomous driving computing platform: NVIDIA DRIVE Orin, with 500–1000 TOPS of computing power
2. Autonomous driving functions and application scenarios
Function configuration
5. Shaoze Sword: Xiaopeng P5 (estimated delivery: Q4 2021)
1. Autonomous driving perception system
Perception system solution: fusion perception of various heterogeneous sensors of camera + millimeter wave radar + lidar
Sensor configuration: lidar*2 + millimeter wave radar*5 + ADS camera*8 + surround view camera*4 + interior detection camera*1 + ultrasonic radar*12
a. Lidar (DJI Livox HAP) performance parameters:
Type: biprism scanning hybrid solid-state lidar
Detection distance: 150m@10% reflectivity
FOV(H/V): 120°*30°
Angular resolution: 0.16°*0.2°
Laser wavelength: 905nm
Point cloud density: equivalent to a 144-line lidar
Location: both sides of the lower front bumper (below the headlights)
b. 8 ADS cameras: front view*3 (front windshield, 2MP) + side front view*2 (rearview mirror base) + side rear view*2 (fenders) + rear view*1 (above the license plate)
ADS camera layout diagram
2. Autonomous driving system computing platform:
Core chip: NVIDIA DRIVE Xavier
Computing power: 30 TOPS
Operating system: QNX Safety OS
Safety level: ASIL-D
Note: the Xpeng P5 presumably uses the same autonomous driving domain controller as the P7, the IPU03 (Desay SV)
Autonomous driving domain controller IPU03
3. Autonomous driving functions and application scenarios
Functions and application scenarios
Complex scenarios that urban NGP can handle:
1) Urban traffic and congestion
2) Identifying the drivable area
3) Identifying small objects
4) Identifying areas that can be detoured around
Complex scenarios that urban NGP can handle
6. Shaoshang Sword: NIO ET7 (estimated delivery: Q1 2022)
1. Autonomous driving perception system
Perception system solution: camera + millimeter wave radar + lidar fusion perception of multiple heterogeneous sensors
Sensor configuration: lidar*1 + ADS camera*7 + surround view camera*4 + millimeter wave radar*5 + ultrasonic radar*12
NIO Aquila super-sensing system: perception sensor + 2 high-precision positioning units + vehicle-road collaborative sensing V2X
Aquila super sensory system
a. Lidar (Innovusion/Tudatong Falcon) performance parameters:
Type: Dual-axis rotating mirror scanning hybrid solid-state lidar
Detection distance: 250m@10% reflectivity
FOV:120°*30°
Maximum resolution: 0.06°*0.06°
Laser wavelength: 1550nm
Point cloud density: equivalent to 300 lines of lidar
b. 7 ADS cameras (8MP): front view*2 (front windshield) + side front view*2 (both sides of the front of the roof) + side rear view*2 (fenders) + rear view*1 (roof, rear center)
ADS camera layout diagram
c. Ultra-long-distance visual perception : 8MP high-definition camera
Can detect vehicles 680m away + cones 260m away + pedestrians 220m away
Comparison of detection distance between 8MP and 1.2MP cameras
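The 8MP-vs-1.2MP comparison follows pinhole-camera intuition: at the same lens FOV, the distance at which a target still spans a fixed number of pixels scales with the linear pixel count, i.e. roughly with the square root of the megapixel ratio. A back-of-envelope check (the square-sensor assumption is mine):

```python
import math

# Assumption: same lens FOV and a roughly square sensor, so linear
# resolution ~ sqrt(megapixels), and detection distance for a target
# occupying a fixed pixel span scales the same way.
def distance_ratio(mp_new: float, mp_old: float) -> float:
    """Ratio of detection distances for same-FOV cameras."""
    return math.sqrt(mp_new / mp_old)

print(round(distance_ratio(8.0, 1.2), 2))  # → 2.58
```

Roughly a 2.6x gain, which is consistent in order of magnitude with the detection distances NIO quotes.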
The NIO ET7's side front-view cameras are mounted on the roof, a so-called "watchtower layout", with the following advantages:
The sensors' line of sight clears obstructions, reducing blind spots and improving safety, which matters greatly for urban autonomous driving.
In urban scenes the line of sight is easily blocked by green belts and other vehicles; compared with cameras mounted on the B-pillars or rearview mirrors, high-mounted side front cameras on the roof see over these obstacles and reduce blind spots.
Because of their high position and wide field of view, the roof-mounted cameras also add forward-vision redundancy: even if the main front camera fails, the two high-mounted side front cameras can still provide complete forward perception.
Comparison of the field of view between the B-pillar (left) and the roof (right)
2. Autonomous driving system computing platform: NIO supercomputing platform NIO Adam
1 ) NIO Adam hardware architecture:
Equipped with four NVIDIA Drive Orin chips
Adopt NVOS underlying operating system
Has 48 CPU cores
256 matrix operation units
8096 floating point units
68 billion transistors
NIO Adam hardware architecture
2 ) Performance of supercomputing platform NIO Adam
a. Super image processing pipeline: ultra-high bandwidth image interface, ISP can process 6.4 billion pixels per second
b. Ultra-high backbone data network : all sensor and vehicle system signal inputs are losslessly distributed to each computing core in real time
NIO Adam hardware architecture – 4 dedicated chips
c. 4 high-performance dedicated chips: 2 main control chips + 1 redundant backup chip + 1 chip dedicated to group intelligence and personalized training
2 main control chips: run the full NAD algorithm stack, including cross-validated multi-scheme perception, multi-source high-precision positioning, and multi-modal prediction and decision-making; ample computing power lets NAD process more data, faster and more accurately, in complex traffic scenes
1 redundant backup chip: if either main chip fails, NAD can still ensure safety
1 group-intelligence and personalized-training chip: accelerates NAD's evolution and performs local personalized training on each user's driving environment, improving each user's autonomous driving experience
Note : NAD - NIO Autonomous Driving
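The "6.4 billion pixels per second" ISP figure can be sanity-checked against the camera suite: how many high-resolution streams does that budget cover? The 30fps frame rate below is an assumption, since NIO has not published per-camera rates:

```python
# Back-of-envelope check on the ISP throughput figure quoted above.
def streams_supported(isp_pixels_per_s: float,
                      mp_per_frame: float, fps: float) -> int:
    """How many camera streams fit in the ISP's pixel budget."""
    per_stream = mp_per_frame * 1e6 * fps
    return int(isp_pixels_per_s // per_stream)

# 8MP cameras at an assumed 30fps against a 6.4 Gpixel/s budget:
print(streams_supported(6.4e9, 8.0, 30))  # → 26
```

That comfortably exceeds the 11 cameras (7 ADS + 4 surround-view) the ET7 actually carries, leaving headroom.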
3. Redundant design of autonomous driving system
Core controller power supply and communication redundancy
Steering system control redundancy
Parking brake redundancy
Dual motor power redundancy
4. Autonomous driving functions and application scenarios
Functions and application scenarios
Note: All ET7 models come standard with 19 NAD safety and driver-assistance functions. The complete NAD feature set will use a "monthly activation, monthly payment" subscription model, i.e. ADaaS (AD as a Service)
Conclusion:
1. In perception, fusion of multiple heterogeneous sensors built around vision and radar has become the mainstream; high-end perception capability has been comprehensively upgraded, and high-line-count lidar, 8MP HD cameras, and 4D millimeter-wave radar will become standard on future high-end autonomous vehicles.
2. When promoting autonomous driving highlights, car makers no longer emphasize the autonomy level but focus on concrete usage scenarios, such as highway navigation-assisted driving, urban navigation-assisted driving, and automatic parking in parking lots.
3. Although these "high-end" functions keep approaching L3/L4 in user experience, because both the technology and the legal framework are immature, the functions currently promoted in the industry are still legally defined as L2 systems; once an accident occurs, the primary responsibility still lies with the human driver.
4. High-line-count lidar will become one of the essential sensors for high-end autonomous driving, and to extend detection range more and more manufacturers are replacing 905nm lasers with 1550nm lasers; meanwhile, 4D millimeter-wave radar is expected to replace both traditional 3D millimeter-wave radar and low-line-count lidar.
5. To launch popular models quickly, some car companies needing high-end autonomous driving will, in the short term, adopt a supplier's autonomous driving domain controller as a transitional solution; in the long run, most capable companies will build on self-developed E/E architectures and self-developed domain controllers to keep control of the core technology.