Regarding "computing power", this article is worth reading


Author | Xiao Zaojun

Source | Fresh Date Classroom

In today’s article, let’s talk about computing power.

In the past two years, computing power can be said to be a hot concept in the ICT industry. It always appears in news reports and speeches by famous people.

So, what exactly is computing power? What categories does computing power include, and what are their uses? What is the current state of development of global computing power?

Next, Xiao Zaojun will give you a detailed science introduction.


What is computing power

Everyone knows the literal meaning: computing power is, quite simply, the ability to compute.

More specifically, computing power is the ability to process data and produce a target output.


We humans actually have this ability. In the course of our lives, calculations are carried out every moment. Our brain is a powerful computing engine.

Most of the time, we calculate without tools, through mental and oral arithmetic. But this kind of computing power is limited, so when we face complex problems, we turn to computing tools.

In ancient times, our primitive computing tools were knotted ropes and stones. Later, as civilization advanced, more practical computing tools appeared, such as counting rods (small sticks used for arithmetic) and the abacus, and the level of computing power steadily improved.

By the 1940s, a revolution in computing had arrived.

In February 1946, the world's first digital electronic computer ENIAC was born, marking the official entry of human computing power into the digital electronic era.


ENIAC, 1946

Later, with the emergence and development of semiconductor technology, we entered the chip era. Chips have become the main carrier of computing power.


The world's first integrated circuit (chip), 1958

Time continues to pass.

By the 1970s and 1980s, chip technology, driven by Moore's Law, had made great progress. Chip performance kept improving while chip size kept shrinking. Computers were finally miniaturized, and the PC (personal computer) was born.


The world's first PC (IBM 5150), 1981

The birth of the PC had far-reaching significance. It marked the point at which IT computing power no longer served only a few large enterprises (via mainframes) but reached ordinary households and small and medium-sized businesses. It opened the door to the information age for everyone and drove the informatization of society as a whole.

With PCs, people could directly experience the quality-of-life and productivity gains that IT computing power brings. The PC also laid the groundwork for the later boom of the Internet.

After entering the 21st century, computing power has once again undergone tremendous changes.

The symbol of this great change is the emergence of cloud computing technology .


Cloud Computing

Before cloud computing, humanity had long suffered from the insufficient computing power of single-point computing (one mainframe or one PC completing all computing tasks on its own), and had already experimented with distributed architectures such as grid computing (decomposing one huge computing task into many small tasks and handing them to different computers).

Cloud computing is a new attempt at distributed computing. Its essence is to package and aggregate a large number of scattered computing resources to achieve higher reliability, higher performance, and lower cost computing power.

Specifically, in cloud computing, computing resources such as CPUs, memory, hard disks, and graphics cards (GPUs) are pooled together by software into a virtual, elastically scalable "computing resource pool".

When users need computing power, the pool dynamically allocates resources to them, and they pay only for what they use.

Compared with purchasing one's own equipment, building one's own server rooms, and handling one's own operations and maintenance, cloud computing has clear cost advantages.
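The pooling and pay-as-you-go model described above can be sketched in a few lines. This is an illustrative toy, not any real cloud provider's API; the class and field names are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class ResourcePool:
    """Toy 'computing resource pool': CPU and memory aggregated from many machines."""
    vcpus: int
    ram_gb: int

    def allocate(self, vcpus: int, ram_gb: int) -> bool:
        """Dynamically grant resources if the pool can cover the request."""
        if vcpus <= self.vcpus and ram_gb <= self.ram_gb:
            self.vcpus -= vcpus
            self.ram_gb -= ram_gb
            return True
        return False

    def release(self, vcpus: int, ram_gb: int) -> None:
        """Return resources to the pool when the user is done (pay only for usage time)."""
        self.vcpus += vcpus
        self.ram_gb += ram_gb

pool = ResourcePool(vcpus=128, ram_gb=512)  # pooled from several physical servers
print(pool.allocate(16, 64))   # True: request fits, resources are carved out
print(pool.allocate(200, 64))  # False: exceeds what the pool currently holds
```

A real cloud adds scheduling across thousands of machines, virtualization, and metered billing on top, but the economics rest on this same idea: many users sharing one elastic pool instead of each sizing their own hardware for peak load.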


Cloud computing data center

Once computing power moved to the cloud, data centers became its main carrier, and the scale of human computing power began a new leap.


Classification of computing power

The emergence of cloud computing and data centers is due to the continuous deepening of informatization and digitization, which has triggered a strong demand for computing power in the entire society.

These demands come from the consumer sector (mobile Internet, video streaming, online shopping, ride hailing, O2O, etc.), from industry (industrial manufacturing, transportation and logistics, financial securities, education and healthcare, etc.), and from urban governance (smart cities, one-card services, city brains, etc.).

Different computing power applications and requirements have different algorithms. Different algorithms also have different requirements for computing power characteristics.

Usually, we divide computing power into two major categories, namely general computing power and dedicated computing power .


Everyone should have heard that the chips responsible for outputting computing power are divided into general-purpose chips and special-purpose chips.

CPU chips such as x86 processors are general-purpose chips. The computing tasks they can complete are diverse and flexible, but their power consumption is higher.

Special-purpose chips mainly refer to FPGA and ASIC .

An FPGA is a programmable integrated circuit. Its internal logic structure can be changed through hardware programming, and its software is deeply customized, so it can execute specialized tasks.

An ASIC is an application-specific integrated circuit. As the name suggests, it is a chip custom-built for a particular use, with most of its algorithms solidified in silicon.

An ASIC performs a specific computing function; its capabilities are relatively narrow, but its energy consumption is very low. The FPGA sits between general-purpose chips and ASICs.


Let’s take Bitcoin mining as an example.

In the early days, people mined with PCs (x86 general-purpose chips). As difficulty rose and computing power fell short, miners switched to graphics cards (GPUs). Later, GPU energy consumption grew so high that the value of the mined coins could not cover the electricity bill, so mining moved to FPGA and ASIC cluster arrays.
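The escalation from CPUs to ASICs comes down to hash throughput per watt. Below is a minimal, simplified sketch of the proof-of-work loop miners race to solve; real Bitcoin mining applies double SHA-256 to an 80-byte block header at a vastly higher difficulty:

```python
import hashlib

def mine(header: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 digest starts with `difficulty` hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{header}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each additional required zero multiplies the expected number of hashes by 16,
# which is why mining migrated from CPUs to GPUs to FPGA/ASIC arrays.
nonce = mine("block-header", 4)
assert hashlib.sha256(f"block-header{nonce}".encode()).hexdigest().startswith("0000")
```

The algorithm is trivially parallel and fixed, exactly the profile where a hard-wired ASIC beats a flexible CPU by orders of magnitude in hashes per joule.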

In data centers, computing tasks likewise divide into basic general-purpose computing and HPC (high-performance computing).

HPC computing is further subdivided into three categories:

Scientific computing: physical chemistry, meteorology and environmental protection, life sciences, oil exploration, astronomical exploration, etc.

Engineering calculations: computer-aided engineering, computer-aided manufacturing, electronic design automation, electromagnetic simulation, etc.

Intelligent computing: that is, artificial intelligence (AI) computing, including machine learning, deep learning, data analysis, etc.

Everyone should have heard of scientific computing and engineering computing. These professional scientific research fields generate a large amount of data and require extremely high computing power.

Take oil and gas exploration as an example. Simply put, it is like performing a CT scan of the Earth's subsurface. A single project often produces more than 100 TB of raw data, sometimes more than 1 PB. Such enormous volumes of data require massive computing power to process.

We need to focus on intelligent computing .

AI is the development direction the whole society is now focused on; every field is studying how to apply and deploy it.

The three core elements of artificial intelligence are computing power, algorithms and data.


As everyone knows, AI is a heavy consumer of computing power; it especially "eats" compute. AI computation involves large amounts of matrix and vector multiply-add operations and is highly specialized, so the CPU is a poor fit for it.

In real-life applications, people mainly use GPUs and the aforementioned dedicated chips for calculations. GPUs, in particular, are currently the main force in AI computing power.

Although the GPU is a graphics processor, its number of cores (logical arithmetic units) far exceeds a CPU's. It is well suited to dispatching the same instruction stream to many cores in parallel, each executing on different input data, which is exactly the kind of large volume of simple operations found in graphics and big-data processing.

Therefore, GPU is more suitable for processing computationally intensive and highly parallelized computing tasks (such as AI computing).
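To see why AI "eats" computing power, count the operations in a single matrix multiplication, the core primitive of deep learning. A quick sketch with NumPy (2·n³ is the standard operation count for a dense n×n product):

```python
import numpy as np

# Deep learning is dominated by matrix multiply-accumulate operations.
# One (n x n) @ (n x n) product costs about 2*n^3 floating-point operations:
# one multiply and one add per term of each output element.
n = 1024
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

flops = 2 * n ** 3   # ~2.1 billion operations for a single product
c = a @ b            # dispatched as one data-parallel kernel: same instruction
                     # stream, many data elements -- exactly what GPUs are built for

print(f"{flops / 1e9:.1f} GFLOP per product")
```

A GPU rated at, say, tens of FP32 TFLOPS can sustain thousands of such products per second, while a single CPU core grinding through the same multiply-adds serially would be orders of magnitude slower.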

In recent years, due to the strong demand for artificial intelligence computing, the country has also built many intelligent computing centers , which are data centers dedicated to intelligent computing.

In addition to intelligent computing centers, there are also many supercomputing centers. These house supercomputers such as "Tianhe-1" and specialize in large-scale scientific and engineering computing tasks.

The data centers we usually see are basically cloud computing data centers .


Their computing tasks are relatively mixed: basic general-purpose computing, high-performance computing, and a large amount of heterogeneous computing (computing that mixes different instruction-set architectures at the same time). As demand for high-performance computing grows, the share of dedicated computing chips keeps rising.

The TPU, NPU, and DPU that have gained popularity in recent years are all dedicated chips.


The "computing power offloading" you often hear about does not mean deleting computing power. It means moving many computing tasks (virtualization, data forwarding, compression and storage, encryption and decryption, etc.) off the CPU and onto chips such as NPUs and DPUs, reducing the CPU's computational burden.

In recent years, beyond basic general-purpose computing power, intelligent computing power, and supercomputing power, the scientific community has also raised the concept of cutting-edge computing power, chiefly quantum computing and photonic computing, which deserves attention.


Measurement of computing power

Since computing power is an "ability", there are naturally metrics and benchmark units to measure its strength. The units most people know are FLOPS (floating-point operations per second), TFLOPS, and the like.

In fact, there are many indicators to measure computing power, such as MIPS, DMIPS, OPS, etc.


MFLOPS, GFLOPS, TFLOPS, PFLOPS, and so on are all magnitudes of FLOPS, each a factor of 1,000 apart:

1 KFLOPS = 10^3 FLOPS (thousand); 1 MFLOPS = 10^6 FLOPS (million); 1 GFLOPS = 10^9 FLOPS (billion); 1 TFLOPS = 10^12 FLOPS (trillion); 1 PFLOPS = 10^15 FLOPS; 1 EFLOPS = 10^18 FLOPS; 1 ZFLOPS = 10^21 FLOPS.
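The unit ladder can be encoded directly. A small conversion helper, with names invented for the example:

```python
# Each FLOPS magnitude is a factor of 1,000 larger than the previous one.
UNITS = {
    "FLOPS": 1e0,
    "KFLOPS": 1e3,
    "MFLOPS": 1e6,
    "GFLOPS": 1e9,
    "TFLOPS": 1e12,
    "PFLOPS": 1e15,
    "EFLOPS": 1e18,
    "ZFLOPS": 1e21,
}

def convert(value: float, src: str, dst: str) -> float:
    """Convert a computing power figure between FLOPS magnitudes."""
    return value * UNITS[src] / UNITS[dst]

# e.g. a total of 135 EFLOPS expressed in PFLOPS:
print(convert(135, "EFLOPS", "PFLOPS"))  # 135000.0
```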

Floating-point numbers also come in different precisions: FP16 (half), FP32 (single), and FP64 (double). A computing power figure is only meaningful once you know which precision it refers to.

The gap in computing power between different carriers is enormous. To help everyone appreciate the difference, Xiao Zaojun has compiled a computing power comparison table:


Earlier we mentioned general-purpose computing, intelligent computing, and supercomputing. Judging from the trend, computing power for intelligent computing and supercomputing is growing much faster than general-purpose computing power.

According to GIV (Global Industry Vision) forecasts, by 2030 general-purpose computing power (FP32) will grow 10-fold to 3.3 ZFLOPS, while AI computing power (FP16) will grow 500-fold to 105 ZFLOPS.


The current situation and future of computing power

As early as 1961, John McCarthy, the "father of artificial intelligence", proposed the goal of utility computing. He believed: "Computing may one day be organized as a public utility, just as the telephone system is a public utility."

Now, his vision has become a reality. Under the digital wave, computing power has become a public basic resource like water and electricity, and data centers and communication networks have also become important public infrastructure.

This is the result of more than half a century of hard work in the IT and communications industries.    

For the entire human society, computing power is no longer a concept with a technical dimension. It has risen to the dimensions of economics and philosophy, becoming the core productivity in the digital economy era and the cornerstone of the digital and intelligent transformation of the entire society .

Each of our lives, as well as the operation of factories and enterprises, and the operation of government departments, are inseparable from computing power. In key areas such as national security, national defense construction, and basic subject research, we also need massive computing power.

Computing power determines the speed of digital economic development and the height of social intelligence development.

According to data jointly released by IDC, Inspur Information, and Tsinghua University's Global Industry Research Institute, every 1-point increase in the computing power index corresponds, on average, to a 3.5‰ rise in the digital economy and a 1.8‰ rise in GDP.


The computing power scale and economic development level of countries around the world have shown a significant positive correlation. The larger the computing power of a country, the higher the level of economic development.


Ranking of computing power and GDP of countries around the world

(Source: Chi Jiuhong, speech at Huawei Computing Era Summit)

In the field of computing power, competition among countries is becoming increasingly fierce.

In 2020, China's total computing power reached 135 EFLOPS, up 55% year-on-year, about 16 percentage points above the global growth rate. China's absolute computing power currently ranks second in the world.

However, in per capita terms China has no advantage and sits only at the level of mid-tier computing power countries.


Comparison of per capita computing power across the world

(Source: Tang Xiongyan, speech at Huawei Computing Era Summit)

Especially in core computing technologies such as chips, a large gap remains between China and the developed countries. Many "chokepoint" technologies are still unsolved, which seriously affects the security of China's computing power and, in turn, national security.


Therefore, there is still a long way to go, and we still need to continue to work hard.

In the future society, informatization, digitalization and intelligence will further accelerate. The arrival of the era of intelligent interconnection of all things, the introduction of a large number of intelligent IoT terminals, and the implementation of AI intelligent scenarios will generate unimaginable massive amounts of data.

These data will further stimulate the demand for computing power.

According to Roland Berger's forecasts, from 2018 to 2030 the demand for computing power will grow 390-fold for autonomous driving and 110-fold for smart factories. Per capita computing power demand in major countries will grow 20-fold, from under 500 GFLOPS today to 10,000 GFLOPS in 2035.

According to the forecast of Inspur Artificial Intelligence Research Institute, by 2025, the global computing power will reach 6.8 ZFLOPS, an increase of 30 times compared with 2020.

A new round of computing power revolution is accelerating.


Conclusion

Computing power is such an important resource, but in fact, there are still many problems in our utilization of computing power.

For example, there are the problems of computing power utilization and of balanced distribution. According to IDC data, the utilization rate of enterprises' scattered, small-scale computing resources is currently only 10%-15%, which is a huge waste.

Moore's Law began to slow around 2015, and growth in computing power per unit of energy consumption has gradually been outpaced by growth in data volume. While we continue to push the potential of chip computing power, we must also address how computing power is scheduled and allocated.

So, how do we schedule computing power? Can the existing communication network technology meet the scheduling needs of computing power?

References:

1. "White Paper on China's Computing Power Development Index", Academy of Information and Communications Technology;

2. "Computing Power Network Technology White Paper", China Mobile;

3. "What's going on with computing power networks (CAN, CFN, CPN) and the 'Eastern Data, Western Computing' project", QianLing, Zhihu;

4. "China Unicom Computing Power Network White Paper", China Unicom;

5. "Introduction and Prospects for Computing Network Development", Cao Chang;

6. "What is a computing power network", Wu Zhuoran;

7. "Thoughts on the underlying technology of "computing power network"", Yan Guihai;

8. "The demand for AI computing power is growing rapidly, and platform infrastructure has become the focus", GF Securities, Liu Xuefeng, Li Aoyuan, and Wu Zupeng.



Origin blog.csdn.net/FL63Zv9Zou86950w/article/details/126476416#comments_27907366