Concurrency and Parallelism, Synchronous and Asynchronous

Concurrency and parallelism

We all know that programmers are very logical creatures. They may not be great with words, but they can communicate with machines in a mysterious language and make machines obey them. Machines think linearly, and in order to communicate with them more efficiently, programmers gradually, whether actively, passively, or imperceptibly, adopt the same linear mode of thinking. The most visible symptom is that a programmer prefers to do only one thing at a time; doing several things at once makes them uneasy and unsure of themselves. We jokingly call ourselves single-threaded creatures. To some degree, this mirrors the microcosm of the machine.

In the early days, a machine's hardware resources were scarce: memory was small and CPUs were slow, so only one program could run at a time. While one program occupied the CPU, everyone else simply waited. Once the CPU was released, it went to whoever grabbed it first, though of course some systems also queued programs up in order.

Later, people found this approach was not fair enough. Some programs run for a long time, up to several minutes, while others finish in a few milliseconds; a millisecond-long program could end up waiting minutes for a long one to complete. So concurrency appeared. Since every program wants the CPU, each is limited to a fixed amount of time, called a time slice. Whoever grabs the CPU runs, but only for the specified time; when the slice expires, the program must yield the CPU. This way, every program gets a fair turn.

Later, with the development of technology, machines were no longer limited to a single CPU, so multiple programs could truly execute at the same time. It is like a road with multiple lanes, each lane carrying its own traffic. That is parallelism.
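The multi-lane picture can be sketched in a few lines of Java. This is a minimal illustration, not production code: two threads run the same workload, and on a multi-core machine the OS can schedule them on different cores at the same time.

```java
// Minimal sketch of parallelism: two threads doing independent work.
// On a multi-core machine the OS may run them on different CPUs at once.
public class ParallelDemo {
    public static void main(String[] args) throws InterruptedException {
        System.out.println("CPU cores: " + Runtime.getRuntime().availableProcessors());

        Runnable work = () -> {
            long sum = 0;
            for (int i = 0; i < 1_000_000; i++) sum += i;
            System.out.println(Thread.currentThread().getName() + " done, sum=" + sum);
        };

        Thread lane1 = new Thread(work, "lane-1");
        Thread lane2 = new Thread(work, "lane-2");
        lane1.start();   // both lanes are now open
        lane2.start();
        lane1.join();    // wait for both to finish before exiting
        lane2.join();
    }
}
```

Note that with a single core this same code would still be *concurrent* (the two threads interleave on time slices) but not *parallel*.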

As more and more programs executed at the same time, contention and waiting for resources grew. A program might occupy the CPU while waiting for some other resource, doing nothing until that resource arrived. This is a serious waste of CPU, because a program holding the CPU is rarely working at 100%. So a supervisor, the scheduler, stepped in: if you have to wait, step aside and wait elsewhere. Take a break, and when your next time slice comes around, check whether the resource is ready. If it is, run; don't just hold the CPU and keep waiting.

But even this is not ideal, because repeatedly checking for the resource also wastes time. So the program simply goes to sleep; when the resource becomes available, the system wakes it up, and it competes for the CPU again.
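This "sleep until woken" pattern is exactly what blocking primitives in the JDK provide. A minimal sketch using `BlockingQueue`: the consumer thread parks inside `take()` rather than polling in a loop, and the runtime wakes it only when a resource arrives.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch of blocking wait: the consumer sleeps inside take() instead of
// busy-polling, and is woken only when the producer delivers the resource.
public class BlockingWaitDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> resources = new ArrayBlockingQueue<>(1);

        Thread consumer = new Thread(() -> {
            try {
                // take() parks this thread; no CPU is burned while waiting
                String r = resources.take();
                System.out.println("woke up with resource: " + r);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        Thread.sleep(100);               // consumer is now asleep, not spinning
        resources.put("db-connection");  // the "wake you up" moment
        consumer.join();
    }
}
```

The same idea underlies `Object.wait()`/`notify()` and `LockSupport.park()`; `BlockingQueue` is just the most convenient face of it.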

Synchronous and asynchronous

The operating rules of the machine world are the rules written in the programmer's code.

Take order processing as an example. We need to generate an order, lock inventory, and create a payment record; after payment completes, we need to update the order status, deduct inventory, update the payment record, write off points, generate a fulfillment order, and so on. This chain involves many business steps and many services (microservices). If each step blindly waits for the next, a lot of time is wasted. For example, locking inventory while generating the order means searching among tens of thousands or even millions of products for the specified one and locking its inventory record, which is slow and expensive.

The simpler way is to generate the order, then send a message telling the inventory service: lock this product's stock as soon as you can. The order service is not delayed, and user response time and satisfaction both improve.
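The hand-off can be sketched with an in-memory queue standing in for a real message broker (Kafka, RocketMQ, etc.). All names here (`messageQueue`, "inventory service", the order ID) are illustrative, not any real framework's API: the point is that the order path returns immediately after enqueueing the message.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical sketch of business-level asynchrony: the order service
// enqueues a "lock stock" message and responds to the user at once;
// the inventory service consumes the message on its own time.
public class AsyncOrderDemo {
    static final BlockingQueue<String> messageQueue = new LinkedBlockingQueue<>();

    public static void main(String[] args) throws InterruptedException {
        Thread inventoryService = new Thread(() -> {
            try {
                String orderId = messageQueue.take();
                System.out.println("inventory service: locking stock for " + orderId);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        inventoryService.start();

        // Order service path: create order, fire the message, return immediately
        System.out.println("order created: order-1001");
        messageQueue.put("order-1001");
        System.out.println("response returned to user (no waiting for inventory)");

        inventoryService.join();
    }
}
```

In a real system the queue would be a durable broker, so the inventory lock survives an order-service crash; the shape of the code is the same.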

This is one kind of asynchrony: business-level asynchrony.

There is also program-level asynchrony.

The most typical example is asynchronous network programming in Netty. When a network request arrives, Netty does not sit and process it step by step while blocking. Instead, the network request handler passes the request packet down the pipeline, telling the worker handlers to deal with it, and immediately goes back to receiving other requests. When the workers finish processing, they notify the network layer, which sends the response back to the requester.

Many asynchronous programming features were also added in JDK 8, such as CompletableFuture.
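A minimal `CompletableFuture` sketch: `supplyAsync` runs the work on another thread (the common `ForkJoinPool` by default), `thenApply` chains a transformation that fires when the result is ready, and the main thread stays free in the meantime.

```java
import java.util.concurrent.CompletableFuture;

// Sketch of JDK 8 CompletableFuture: work runs off the main thread,
// and a transformation is chained to run when the result is ready.
public class CompletableFutureDemo {
    public static void main(String[] args) {
        CompletableFuture<String> future = CompletableFuture
                .supplyAsync(() -> "raw-result")   // runs in ForkJoinPool.commonPool()
                .thenApply(String::toUpperCase);   // applied when the value arrives

        System.out.println("main thread is free to do other work");
        // join() blocks only at the point we actually need the value
        System.out.println("result: " + future.join());
    }
}
```

The key shift from a plain `Future` is that you attach continuations (`thenApply`, `thenAccept`, `thenCompose`) instead of blocking on `get()`.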

Synchronous and asynchronous programming are like a leader assigning work. Sometimes the leader stands over you and watches until the job is done: that is synchronous. Other times the leader just explains the task, you carry it out on your own, and you report back when you finish: that is asynchronous.
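The leader analogy maps directly onto code. In this sketch, the synchronous call blocks the caller until the work returns ("staring until the end"), while the asynchronous version hands the work off and attaches a callback (the "report back"); `doWork` is a stand-in for any task.

```java
import java.util.concurrent.CompletableFuture;

// Sketch of the leader analogy: a blocking call vs. a callback.
public class SyncVsAsyncDemo {
    static String doWork() {
        return "report";  // stand-in for some real piece of work
    }

    public static void main(String[] args) {
        // Synchronous: the caller waits here until doWork() returns
        String syncResult = doWork();
        System.out.println("sync got: " + syncResult);

        // Asynchronous: hand off the work, attach a callback, move on
        CompletableFuture<Void> done = CompletableFuture
                .supplyAsync(SyncVsAsyncDemo::doWork)
                .thenAccept(r -> System.out.println("async callback got: " + r));
        System.out.println("caller continues immediately");
        done.join();  // wait at the end only so the demo exits cleanly
    }
}
```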

Asynchronous programming is both a way to use resources rationally and a programming skill in its own right. More and more programmers and companies are paying attention to it.

Recommended reading: "Java Asynchronous Programming Practice", a comprehensive treatment of Java asynchronous programming. For common asynchronous scenarios, it explains the principles and methods of asynchronous programming in depth from the perspective of the language, development frameworks, and more. The author is a senior Java engineer from Taobao with deep experience in Java asynchronous and concurrent programming.


Origin blog.csdn.net/conansix/article/details/103775482