Academician Dai Qionghai: In 5 years, large models will become the operating system of AI!


Author: Academician Dai Qionghai | Editor: China Electronics News


On April 20, the 2023 IT Market Annual Conference, hosted by CCID Consulting Co., Ltd., was held in Beijing. At the conference, Dai Qionghai, an academician of the Chinese Academy of Engineering, delivered a speech. Dai Qionghai said that broadening the boundaries of data, promoting algorithm innovation, and breaking the computing-power bottleneck will be the only way to bring about future application changes and lead foundational breakthroughs in artificial intelligence. He pointed out that China's artificial intelligence development is currently strong in applications but relatively weak in fundamentals; strengthening original innovation and avoiding the plight of "water without a source" is the top priority for breaking this situation. To promote the innovation and development of artificial intelligence, data, algorithms, and computing power are the three pillars: data is the textbook of artificial intelligence and determines the scope of intelligence; algorithms are the brain of artificial intelligence and determine the level of intelligence; computing power is the engine of artificial intelligence and determines the efficiency of intelligence.

New data will become the main driving force for the development of a new generation of AI

Since its inception, artificial intelligence has gone through three generations. The first generation operated on hand-crafted rules, the second was based on statistical machine learning, and the third, the artificial intelligence now in wide use, is based on deep neural networks. With the development and application of the deep learning technology at the core of AlphaGo and the large-model technology represented by ChatGPT, deep learning based on big data has spread to biomedicine, software and hardware systems, scientific computing, energy, basic mathematics research, and many other areas, promoting the intelligent development of cities, education, medical care, robotics, and other fields.
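
As a rough sketch of the shift between the first two generations (a hypothetical spam-filter task; all names below are illustrative, not from the speech): a first-generation system hard-codes expert rules, while a second-generation learner estimates its parameters from labeled data.

```python
from collections import Counter

# Generation 1: behaviour fixed by hand-written expert rules.
def rule_based_is_spam(text: str) -> bool:
    return any(kw in text.lower() for kw in ("free money", "winner", "click now"))

# Generation 2: behaviour estimated from labeled examples (a naive word-score model).
def fit_word_scores(examples: list[tuple[str, bool]]) -> dict[str, int]:
    spam, ham = Counter(), Counter()
    for text, is_spam in examples:
        (spam if is_spam else ham).update(text.lower().split())
    # A word scores positive if it appears more often in spam than in normal mail.
    return {w: spam[w] - ham[w] for w in set(spam) | set(ham)}

def learned_is_spam(text: str, scores: dict[str, int]) -> bool:
    return sum(scores.get(w, 0) for w in text.lower().split()) > 0
```

The third generation replaces such hand-chosen word features with representations learned end to end by deep neural networks.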

Looking back at the history of artificial intelligence, it is not hard to see that its development has always gone hand in hand with the development of data. Over the past decades, the spread of informatization brought data expansion, and the advent of the Internet era made a data explosion inevitable. As the boundary of processable data, the capability of intelligent algorithms, and the scope of environments they can affect continue to expand, the gap between the data boundary and the physical environment keeps narrowing, and data has become a key factor supporting the development of artificial intelligence. In today's era of big data, large models, and deep learning algorithms, the size and completeness of data directly determine the capabilities of artificial intelligence models, which means that new data will become one of the main driving forces for the development of a new generation of artificial intelligence.
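
A minimal toy illustration of that last point, under an assumed setting that is not from the article: even for the simplest possible "model", estimating a biased coin's heads probability, capability improves steadily as the data grows.

```python
import random

random.seed(0)
true_p = 0.7  # the hidden quantity the "model" must learn from data

# The estimate converges toward the truth as the dataset grows.
for n in (10, 100, 1_000, 10_000):
    heads = sum(random.random() < true_p for _ in range(n))
    estimate = heads / n
    print(f"n={n:>6}  estimate={estimate:.3f}  error={abs(estimate - true_p):.3f}")
```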

Dai Qionghai pointed out that China has already taken the lead in processing billion-scale data and producing gigapixel-level image output. Going forward, the key for China to seize the initiative in artificial intelligence and data development is to move from limited scenarios to open scenarios and to enable artificial intelligence to understand complex environments.

In 5 years, large models will become the key basic platform of AI

"Large models will mature faster and bring about application changes, and biologically inspired intelligence will lead to basic breakthroughs in artificial intelligence." Dai Qionghai said that the road to algorithm innovation is to achieve "stronger" through the development of "bigger" and "more refined" directions. the path to the goal. On the one hand, the algorithm must continue to tap the potential of deep learning, broaden the depth and breadth, and increase the neural network, so that the large-scale model can achieve a qualitative change in performance from a quantitative change in scale; A new algorithm, a subversive innovation that forms a new mechanism with a new quality method. Through multi-path collaborative integration, it drives the algorithm breakthrough of artificial intelligence.

Training large models is the main way to expand model scale. At present, the training data of large models is getting larger, the models more complex, the tasks more diverse, and the costs higher. Dai Qionghai predicted that in about five years, large models are expected to become a key basic platform for artificial intelligence applications, similar to the operating system in the PC era.

It is worth mentioning that, as Dai Qionghai noted, cognitive computing, a new class of artificial intelligence algorithms and models integrating brain science and cognition, is at the forefront of artificial intelligence research worldwide. Cognitive computing uses advanced neural technology to reveal the multi-level correlations and multi-modal mapping mechanisms among brain structure, brain function, and intelligence, and to establish cognitive models and brain-like intelligent systems. Dai Qionghai pointed out that cognitive computing is the bridge between brain science and artificial intelligence, and estimated that in about ten years, artificial intelligence with cognitive intelligence at its core will begin to enter the application stage.

Computing power optimization and innovation are urgent

Computing power is an important support for the development of artificial intelligence. With continuous technological iteration and ongoing breakthroughs in data and algorithms, the development of computing power now lags far behind that of algorithms, and computing power has gradually become a constraint on the development of artificial intelligence. Dai Qionghai pointed out that under Moore's law, computing power doubles roughly every two years, whereas meeting current development needs would theoretically require computing power to double every three to four months. To make up for the increasingly serious shortage of computing power and break the bottleneck of artificial intelligence development, the optimization and innovation of computing power is urgent.
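
The arithmetic behind that gap can be made concrete with a small sketch (the 3.5-month doubling period is an assumed midpoint of the "three to four months" figure above):

```python
# Supply: computing power doubling every 24 months (Moore's law, as cited above).
# Demand: doubling every 3.5 months (assumed midpoint of "three to four months").
def growth_factor(months: float, doubling_period: float) -> float:
    """Multiplicative growth after `months`, given a fixed doubling period."""
    return 2 ** (months / doubling_period)

horizon = 60  # five years, in months
supply = growth_factor(horizon, 24.0)  # ~5.7x
demand = growth_factor(horizon, 3.5)   # ~145,000x

print(f"supply grows ~{supply:.1f}x, demand grows ~{demand:,.0f}x")
print(f"shortfall after five years: ~{demand / supply:,.0f}x")
```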

Dai Qionghai pointed out that China's mainstream computing-power development path focuses on optimizing general-purpose chips and designing special-purpose chips. At the same time, China is also actively opening up new tracks and developing new computing architectures.

At present, there are several routes to computing-power innovation: quantum computing, optoelectronic computing, artificial intelligence chips, and general-purpose chips, all of which aim to revolutionize the speed, energy efficiency, and data throughput of contemporary computing.



Source: blog.csdn.net/Datawhale/article/details/130716739