A summary of the autonomous driving configurations of the six most popular models at the 2021 Shanghai Auto Show

Overview

The 2021 Shanghai Auto Show was as dazzling as ever. To be clear: although I am an experienced driver, this time I am not talking about the show models, I really am talking about the cars, and it is the cars that dazzled me. From the perspective of autonomous driving, six popular models were the stars of this show, each sharp and eye-catching. Since I am fond of martial-arts novels, here is an analogy: combined, these six vehicles form the "Six Meridians Divine Sword", a "peerless martial art" worth everyone's pursuit:

1) ARCFOX (Jihu) Alpha S Huawei HI - Zhongchong Sword: wide open and grand, with majestic momentum

2) R Auto ES33 - Shangyang Sword: clever, flexible, and elusive

3) Zeekr 001 - Guanchong Sword: winning through simplicity and directness

4) Zhiji L7 - Shaochong Sword: light and quick

5) XPeng P5 - Shaoze Sword: coming and going with subtle changes

6) NIO ET7 - Shaoshang Sword: a powerful stroke, quite earth-shaking, with wind and rain on the way

An introduction to each sword manual follows:

Appendix 1. Summary of autonomous driving related configurations


Appendix Table 2. Lidar performance parameters


1. Zhongchong Sword: BAIC ARCFOX (Jihu) Alpha S Huawei HI (estimated delivery: Q4 2021)

1. Autonomous driving perception system

Solution: camera + millimeter-wave radar + lidar, with fused perception across heterogeneous sensors

Sensor configuration: lidar*3 + millimeter-wave radar*6 + ADS camera*9 + surround-view camera*4 + ultrasonic radar*12 + in-car monitoring camera*1

a. Mass-produced lidar (Huawei 96-line medium- and long-range lidar) performance parameters:

  • Type: rotating-mirror scanning hybrid solid-state lidar

  • Detection distance: 150 m @ 10% reflectivity

  • FOV (H/V): 120° × 25°

  • Angular resolution: 0.25° × 0.26°

  • Laser wavelength: 905 nm


Mounting locations: one in the middle of the front bumper and one on each side

[Image: Huawei 96-line medium- and long-range lidar]
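As a quick sanity check, the FOV and angular-resolution figures above line up with the "96-line" designation. A minimal sketch of that back-of-envelope calculation (the 10 Hz frame rate is an assumed typical value, not a published spec):

```python
# Check whether the published FOV and angular resolution imply ~96 scan
# lines, and what point budget that gives per frame.

def lidar_point_budget(h_fov, v_fov, h_res, v_res, frame_rate_hz=10):
    """Return (lines, points_per_frame, points_per_second)."""
    lines = round(v_fov / v_res)          # vertical channels ("lines")
    pts_per_line = round(h_fov / h_res)   # horizontal samples per sweep
    pts_per_frame = lines * pts_per_line
    return lines, pts_per_frame, pts_per_frame * frame_rate_hz

# Figures from the spec above: 120° x 25° FOV, 0.25° x 0.26° resolution
lines, per_frame, per_sec = lidar_point_budget(120, 25, 0.25, 0.26)
print(lines)      # 96 -> consistent with the "96-line" name
print(per_frame)  # 46080 points per frame
```

So 25° / 0.26° rounds to exactly 96 vertical lines, which is presumably where the product name comes from.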

b. Layout of the 9 ADS cameras

  • Front-view camera*4 (1 binocular pair + 1 telephoto + 1 wide-angle): front windshield

  • Side-view camera*4: side front view*2 (exterior rearview mirror bases) + side rear view*2 (fenders)

  • Rear-view camera*1: middle of the rear wing

[Image: ADS camera layout diagram]

2. Autonomous driving system computing platform : MDC810

  • Supports two computing-power levels: 400 Tops / 800 Tops

  • High integration: integrates the combined inertial navigation unit and the ultrasonic radar ECU

  • Flexible design: supports separate liquid cooling and a pluggable SSD

  • Overall dimensions: 300 × 171 × 35 mm

  • Sensor support: up to 16 cameras + 8 lidars + 6 millimeter-wave radars + 12 ultrasonic radars

  • High reliability: EMC Class 5, IP67 dust and water protection, -40 °C to 85 °C ambient temperature

3. Redundant design of autonomous driving system

Redundant design: steering, power, communication, braking, and perception sensors

If the main system fails, the vehicle immediately switches to the redundant system. For example:

a. Braking redundancy: when the main braking system fails, the system switches to the backup braking system within milliseconds to provide emergency braking.

b. Power supply redundancy: if the main battery fails and can no longer supply power, the system automatically switches to the backup battery, which can provide emergency power for up to 3 minutes, giving the driver enough time to pull over and significantly reducing the probability of an accident.
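The millisecond-level switchover described above follows a standard supervised-failover pattern. The sketch below is purely illustrative: the class names, heartbeat API, and 10 ms deadline are all invented for this example, and the production logic is proprietary.

```python
import time

# Illustrative failover pattern: a supervisor polls the primary system's
# heartbeat and hands control to the backup within a bounded time window.

class RedundantSubsystem:
    def __init__(self, name, heartbeat_timeout_s=0.01):  # 10 ms budget (assumed)
        self.name = name
        self.timeout = heartbeat_timeout_s
        self.active = "primary"
        self.last_heartbeat = time.monotonic()

    def heartbeat(self):
        # Called periodically by the healthy primary system.
        self.last_heartbeat = time.monotonic()

    def supervise(self):
        # Switch to backup if the primary misses its heartbeat deadline.
        if self.active == "primary" and time.monotonic() - self.last_heartbeat > self.timeout:
            self.active = "backup"
        return self.active

brake = RedundantSubsystem("braking")
brake.heartbeat()
assert brake.supervise() == "primary"
time.sleep(0.02)  # simulate a primary fault: no heartbeat within the budget
assert brake.supervise() == "backup"
```

The key design point is that the supervisor, not the failed component, makes the switching decision, so a dead primary cannot block its own replacement.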

4. Autonomous driving functions and application scenarios

[Image: Functions and application scenarios]

[Image: Schematic of special complex working-condition scenarios]

2. Shangyang Sword: SAIC R Auto ES33 (estimated delivery: second half of 2022)

1. Autonomous driving perception system

Perception solution: camera + millimeter-wave radar + lidar, with fused perception across heterogeneous sensors

Sensor configuration: lidar*1 + 4D imaging millimeter-wave radar*2 + traditional 3D millimeter-wave radar*6 + ADS camera*7 + surround-view camera*4 + in-car monitoring camera*1 + ultrasonic radar*12

[Image: ES33 six-fold integrated sensing system]

a. Lidar (Luminar Iris) performance parameters:

  • Type: dual-axis rotating-mirror scanning hybrid solid-state lidar

  • Detection distance: 250 m @ 10% reflectivity

  • FOV (H/V): 120° × 30°

  • Laser wavelength: 1550 nm

  • Point cloud density: equivalent to 300 lines

[Image: R Auto ES33 lidar layout diagram]

b. 4D imaging radar (ZF PREMIUM) performance parameters:

  • Detection distance: 350 m

  • Operating frequency: 77 GHz

  • FOV (horizontal): ±60°

  • Working mode: FMCW (frequency-modulated continuous wave)

  • Transmit/receive channels: 192 channels, providing thousands of data points per measurement cycle

  • Accurate recognition of small and static objects: in testing, vehicles using this radar detected a cola can from as far as 140 meters

  • Road boundary detection and free-space measurement

[Image: R Auto ES33 4D imaging millimeter-wave radar]
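For readers unfamiliar with FMCW, the radar measures range by mixing the transmitted chirp with its echo and reading off the beat frequency. A minimal sketch of that relationship using the textbook formula R = c·T·f_b / (2B); the chirp bandwidth and sweep time below are illustrative assumptions, not ZF's actual parameters.

```python
# How an FMCW radar turns a beat frequency into range. The 77 GHz carrier
# and FMCW mode come from the spec above; B and T below are assumed values.

C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_freq_hz, bandwidth_hz, sweep_time_s):
    """Range from beat frequency: R = c * T * f_b / (2 * B)."""
    return C * sweep_time_s * beat_freq_hz / (2.0 * bandwidth_hz)

B = 300e6    # 300 MHz chirp bandwidth (assumed)
T = 50e-6    # 50 us sweep time (assumed)

# A target at 150 m produces a beat frequency of f_b = 2*R*B/(c*T):
f_b = 2 * 150.0 * B / (C * T)
print(round(fmcw_range(f_b, B, T)))  # -> 150
```

Note also that range resolution in FMCW depends on the chirp bandwidth B, which is one reason automotive radars moved to the wide spectrum available around 77 GHz.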

2. Autonomous driving system computing platform

  • High-performance compute chip: NVIDIA DRIVE AGX Orin

  • Scalable computing power: 500-1000 Tops

  • Algorithm: a "super environment model" that builds a digital mirror of the surroundings

  • Efficient closed-loop data pipeline

[Image: ES33 "Super Brain"]

3. Autonomous driving functions and application scenarios

  • The ES33 is the first model built on the R-TECH technology concept. R-TECH is R Auto's new technology brand, covering core technologies in intelligent driving, the intelligent cockpit, and electric propulsion; PP-CEM, the intelligent driving solution, is one part of R-TECH;

  • PP-CEM is a full-stack, self-developed high-end intelligent driving solution: the six-fold integrated perception system plus the "super brain" computing platform

[Image: PP-CEM high-end intelligent driving solution]

Functions and application scenarios:

  • Official details of the ES33's specific autonomous driving functions have not yet been disclosed, but at the SAIC R brand Co-Creator Ecosystem Conference it was stated that PP-CEM is designed for all scenarios and all weather;

  • Given the ES33's environment sensors (lidar and 4D millimeter-wave radar) and its high-performance computing platform (NVIDIA DRIVE Orin), it is easy to infer that the ES33's autonomous driving functions will cover the three scenarios on which the industry currently agrees: highways, urban roads, and low-speed parking;

3. Guanchong Sword: Geely Zeekr 001 (estimated delivery: Q4 2021)

1. Autonomous driving perception system

  • Perception solution: vision-centric perception (Super Eagle Eye System, VIDAR)

  • Sensor configuration: long-range millimeter-wave radar*1 + 8MP ADS HD camera*8 + streaming-media camera*1 + surround-view camera*4 + in-car detection camera*2 + ultrasonic radar*12

  • The 8 ADS HD cameras (8MP): front view*3 (front windshield: 1 binocular pair + 1 telephoto monocular) + side rear view*2 (fenders) + side front view*2 (exterior rearview mirror bases) + rear view*2 (rear of the roof)

[Image: ADS HD camera layout diagram]

2. Autonomous driving computing platform: Mobileye EyeQ5H*2

  • Process: TSMC 7nm FinFET

  • Computing power: 24 Tops per chip

[Image: Autonomous driving hardware and system architecture]

3. Autonomous driving functions and application scenarios

[Image: Functions and application scenarios]

"Skeleton recognition" and full scene recognition:

1) Pedestrian gesture intent recognition – “Skeleton Recognition”

Principle: the new "skeleton recognition" technology, built on a cross-vision fusion algorithm platform, locates human-body feature points, recognizes postures, and predicts behavior. For example, when a pedestrian suddenly raises a hand, "skeleton recognition" can judge the person's intent: asking the car to stop, greeting an acquaintance, or getting ready to make a phone call;

[Image: "Skeleton recognition" technology]
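To make the idea concrete, here is a deliberately simplified, rule-based sketch of how pose keypoints could feed a gesture decision. The keypoint names and the single raised-hand rule are hypothetical; the actual system uses learned models over many joints, not hand-written thresholds.

```python
# Minimal sketch: classify a gesture from 2D pose keypoints.
# Image coordinates are assumed, with y increasing downward.

def classify_gesture(keypoints):
    """keypoints: dict mapping joint name -> (x, y)."""
    wrist = keypoints["right_wrist"]
    shoulder = keypoints["right_shoulder"]
    head = keypoints["head"]
    if wrist[1] < head[1]:              # wrist above the head
        return "hand_raised"            # e.g. hailing / asking the car to stop
    if wrist[1] < shoulder[1]:          # wrist above the shoulder
        return "hand_partially_raised"  # e.g. waving or holding a phone
    return "neutral"

pose = {"head": (100, 40), "right_shoulder": (110, 80), "right_wrist": (115, 30)}
print(classify_gesture(pose))  # -> hand_raised
```

A real system would feed a sequence of such skeletons into a temporal model so that, say, a wave can be distinguished from a phone call by motion, not just a single frame.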

2) The Super Eagle Eye system also integrates high-precision map data and can recognize many environments: irregular vehicles, opening doors, foreign objects, suburban roads, construction zones and tunnels, trees and bushes, and more;

[Image: Full-scene recognition, full-scene prediction]

4. Shaochong Sword: SAIC + Alibaba + Zhangjiang Hi-Tech, Zhiji L7 (estimated delivery: Q1 2022)

1. Autonomous driving perception and decision-making

The Zhiji L7 offers two autonomous driving solutions:

1) Standard solution: vision-centric perception based on high-definition cameras

a. Sensor configuration: camera*12 (ADS camera*7 + surround-view camera*4 + in-car monitoring camera*1) + millimeter-wave radar*5 + ultrasonic radar*12

[Image: Zhiji vision solution]

The 7 ADS cameras (5MP): front view*2 (front windshield) + side rear view*2 (fenders) + side front view*2 (B-pillars) + rear view*1 (above the rear license plate)

[Image: ADS camera layout diagram]

b. Autonomous driving computing platform: NVIDIA Drive Xavier, with a computing power of 30 Tops+

2) Next-generation upgrade solution: camera + millimeter-wave radar + lidar, with fused perception across heterogeneous sensors

a. Sensor configuration:

  • On top of the standard solution, the vision cameras are upgraded from 5MP to 10MP

  • Two semi-solid-state, high-line-count lidars are added

[Image: Next-generation upgraded smart-driving solution]

b. Autonomous driving computing platform: NVIDIA DRIVE Orin, with computing power of 500-1000 Tops

2. Autonomous driving functions and application scenarios

[Image: Function configuration]

5. Shaoze Sword: XPeng P5 (estimated delivery: Q4 2021)

1. Autonomous driving perception system

Perception solution: camera + millimeter-wave radar + lidar, with fused perception across heterogeneous sensors

Sensor configuration: lidar*2 + millimeter-wave radar*5 + ADS camera*8 + surround-view camera*4 + in-car detection camera*1 + ultrasonic radar*12

a. Lidar (DJI Livox HAP) performance parameters:

  • Type: rotating-prism scanning hybrid solid-state lidar

  • Detection distance: 150 m @ 10% reflectivity

  • FOV (H/V): 120° × 30°

  • Angular resolution: 0.16° × 0.2°

  • Laser wavelength: 905 nm

  • Point cloud density: equivalent to a 144-line lidar

Mounting locations: both sides of the lower front bumper (below the headlights)
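The "equivalent 144-line" claim can be roughly cross-checked from the FOV and angular-resolution figures above. Treating the second figure of the quoted 0.16° × 0.2° resolution as the vertical one is an assumption, and a prism scanner's pattern is not a uniform grid, so this is only an order-of-magnitude check, not the vendor's method.

```python
# Upper-bound estimate of "equivalent line count" for a scanning lidar:
# vertical FOV divided by vertical angular resolution.

def equivalent_lines(v_fov_deg, v_res_deg):
    return v_fov_deg / v_res_deg

# 30° vertical FOV and 0.2° vertical resolution, from the spec above
print(equivalent_lines(30, 0.2))  # ~150, the same order as the quoted 144
```

The small gap between 150 and the quoted 144 is plausibly due to the non-uniform rosette pattern a rotating-prism scanner traces, which does not fill the FOV as evenly as a fixed line array.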

b. The 8 ADS cameras: front view*3 (front windshield, 2MP) + side front view*2 (rearview mirror bases) + side rear view*2 (fenders) + rear view*1 (above the license plate)

[Image: ADS camera layout diagram]

2. Autonomous driving system computing platform:

  • Core chip : NVIDIA Drive Xavier

  • Computing power : 30Tops

  • Operating system : QNX Safety OS

  • Safety level: ASIL-D

Note: the XPeng P5 presumably uses the same autonomous driving domain controller as the P7, the IPU03 (from Desay SV)

[Image: Autonomous driving domain controller IPU03]

3. Autonomous driving functions and application scenarios

[Image: Functions and application scenarios]

Complex scenarios that urban NGP can handle: 1) urban traffic and congestion; 2) identifying the drivable area; 3) identifying small objects; 4) identifying areas that can be detoured around

[Image: Complex scenarios that urban NGP can handle]

6. Shaoshang Sword: NIO ET7 (estimated delivery: Q1 2022)

1. Autonomous driving perception system

Perception solution: camera + millimeter-wave radar + lidar, with fused perception across heterogeneous sensors

Sensor configuration: lidar*1 + ADS camera*7 + surround-view camera*4 + millimeter-wave radar*5 + ultrasonic radar*12

NIO Aquila super-sensing system: the perception sensors + 2 high-precision positioning units + V2X vehicle-road collaborative sensing

[Image: Aquila super-sensing system]

a. Lidar (Innovusion (Tudatong) Falcon) performance parameters:

  • Type: dual-axis rotating-mirror scanning hybrid solid-state lidar

  • Detection distance: 250 m @ 10% reflectivity

  • FOV (H/V): 120° × 30°

  • Maximum resolution: 0.06° × 0.06°

  • Laser wavelength: 1550 nm

  • Point cloud density: equivalent to a 300-line lidar

b. The 7 ADS cameras (8MP): front view*2 (front windshield) + side front view*2 (both sides of the front of the roof) + side rear view*2 (fenders) + rear view*1 (top rear center of the roof)

[Image: ADS camera layout diagram]

c. Ultra-long-range visual perception: 8MP HD cameras

They can detect vehicles 680 m away, traffic cones 260 m away, and pedestrians 220 m away

[Image: Detection-distance comparison between 8MP and 1.2MP cameras]
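The longer detection distances of the 8MP camera follow from simple pinhole geometry: at the same lens FOV, the range at which a target still spans a given number of pixels scales with the sensor's horizontal pixel count. The sensor widths, the 30° FOV, and the 10-pixel detection threshold below are all assumptions for illustration, not NIO's figures.

```python
import math

# Small-angle pinhole model: each pixel subtends (FOV / h_pixels) radians,
# so a target of width W spans W / (d * rad_per_pixel) pixels at distance d.

def max_detection_range(target_width_m, h_pixels, h_fov_deg, min_pixels=10):
    """Distance at which the target still spans `min_pixels` pixels."""
    rad_per_pixel = math.radians(h_fov_deg) / h_pixels
    return target_width_m / (min_pixels * rad_per_pixel)

car_width = 1.8  # metres
r_8mp  = max_detection_range(car_width, 3840, 30)  # ~8MP, 3840 px wide (assumed)
r_12mp = max_detection_range(car_width, 1280, 30)  # ~1.2MP, 1280 px wide (assumed)
print(round(r_8mp / r_12mp, 1))  # -> 3.0: triple the range at equal FOV
```

The absolute distances depend heavily on the detection threshold and the lens, but the ratio only depends on pixel count, which is why the 8MP-vs-1.2MP comparison above is so lopsided.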

The NIO ET7's side front-view cameras are mounted on the roof, in a so-called "watchtower layout". The advantages are:

  • The sensors' line of sight can better overcome occlusion, reducing blind spots and improving safety, which matters greatly for urban autonomous driving.

  • In urban scenes, a sensor's line of sight is easily blocked by green belts and other vehicles. Compared with cameras mounted on the B-pillars or rearview mirrors, side front-view cameras mounted high on the roof suffer fewer blind spots;

  • Because of their high position, the roof-mounted cameras also have a wide field of view and add redundancy to forward vision: even if the main front-facing camera fails, the two high-mounted side front-view cameras can still provide complete forward perception.

[Image: Field-of-view comparison, B-pillar mount (left) vs roof mount (right)]

2. Autonomous driving system computing platform: NIO supercomputing platform NIO Adam

1) NIO Adam hardware architecture:

  • Four NVIDIA DRIVE Orin chips

  • NVOS underlying operating system

  • 48 CPU cores

  • 256 matrix operation units

  • 8,096 floating-point units

  • 68 billion transistors

[Image: NIO Adam hardware architecture]

2 ) Performance of supercomputing platform NIO Adam

a. Super image-processing pipeline: an ultra-high-bandwidth image interface; the ISP can process 6.4 billion pixels per second

b. Ultra-high-bandwidth backbone network: all sensor and vehicle-system signals are distributed losslessly, in real time, to every computing core

[Image: NIO Adam hardware architecture, four dedicated chips]

c. 4 high-performance dedicated chips: 2 main control chips + 1 redundant backup chip + 1 chip for group intelligence and personalized training

  • 2 main control chips: run the full NAD software stack, including cross-validated multi-scheme perception, multi-source high-precision positioning, and multi-modal prediction and decision-making; ample computing power lets NAD process more data, faster and more accurately, in complex traffic scenes;

  • 1 redundant backup chip: if either main chip fails, NAD can still guarantee safety

  • 1 chip for group intelligence and personalized training: accelerates NAD's evolution, and performs personalized local training based on each user's driving environment to improve each user's autonomous driving experience;

Note: NAD = NIO Autonomous Driving
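The quoted 6.4 billion pixels per second of ISP throughput can be put in perspective with a quick calculation. Treating all 11 exterior cameras (7 ADS + 4 surround-view) as 8MP is an assumption of this sketch; the surround-view cameras are likely lower resolution, so the real headroom is larger.

```python
# Sanity check: what frame rate does 6.4e9 pixels/s sustain across the
# camera suite? (All-8MP is an assumed worst case, not a published figure.)

ISP_PIXELS_PER_S = 6.4e9
ads_cams, surround_cams, pixels_per_cam = 7, 4, 8e6

total_pixels_per_frame = (ads_cams + surround_cams) * pixels_per_cam
fps = ISP_PIXELS_PER_S / total_pixels_per_frame
print(round(fps))  # -> 73: comfortable headroom over a typical 30 fps
```

In other words, even under the pessimistic all-8MP assumption, the pipeline could run every camera at more than double a typical 30 fps capture rate.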

3. Redundant design of autonomous driving system

  • Core controller power supply and communication redundancy

  • Steering system control redundancy

  • Parking brake redundancy

  • Dual motor power redundancy

4. Autonomous driving functions and application scenarios

[Image: Functions and application scenarios]

Note: all ET7 trims come standard with 19 NAD safety and driver-assistance functions. The complete NAD feature set will use a "subscribe monthly, pay monthly" service model, i.e. ADaaS (AD as a Service)

Conclusion:

1. In perception, multi-sensor fusion built on vision plus radar has become the mainstream; perception capability for high-end autonomous driving has been comprehensively upgraded, and high-line-count lidar, 8MP HD cameras, and 4D millimeter-wave radar will become standard on future high-end autonomous vehicles;

2. When promoting autonomous driving highlights, car makers no longer emphasize the automation level itself but focus on concrete usage scenarios, such as highway navigation-assisted driving, urban-road navigation-assisted driving, and automated parking in parking lots;

3. Although these so-called high-end functions keep approaching L3/L4 in user experience, immature technology and incomplete legal frameworks mean the "L3/L4" functions currently promoted in the industry are still legally defined as L2; once an accident occurs, primary responsibility still rests with the driver;

4. High-line-count lidar will become one of the essential sensors for high-end autonomous driving. To achieve long detection range, more and more manufacturers are replacing 905 nm lasers with 1550 nm lasers; meanwhile, 4D millimeter-wave radar is expected to replace traditional 3D millimeter-wave radar and low-line-count lidar;

5. To create popular models quickly, some car companies that need high-end autonomous driving functions will, in the short term, adopt suppliers' autonomous driving domain controllers as a transitional solution; in the long run, most capable companies will move to self-developed EE architectures with self-developed domain controllers, to retain control of the core technology.

Note 1: To contact the author or submit articles, add the author's WeChat account sjtusun. Please state your company, position, and real name when adding, or the request will not be approved. Thank you for your understanding.

Note 2: "Nine Chapters Smart Driving" will soon publish several exclusive, first-run news and in-depth analysis articles. If you don't want to miss them, scan the QR code below to follow "Nine Chapters Smart Driving".

Recommended reading (hyperlink):

Our personal destiny depends on 30% of hard work and 70% of industrial dividends - the founding words of "Nine Chapters of Intelligent Driving" 

The guiding ideology for the implementation of "L3 level" autonomous driving: assisting people at high speeds and replacing people at low speeds

Regarding lidar, 12 questions that investors and car companies are interested in

Ten major trends in the car manufacturing 2.0 era

Huawei is not "strong" and Apple may not be better than Xiaomi - 15 essays on autonomous driving at auto shows



Source: blog.csdn.net/jiuzhang_0402/article/details/121358943