[Java Advanced] In-depth understanding of concurrency

Foreword:

        Operating-system multitasking can be understood simply as the ability to run several programs at the same time. Just as in everyday life, we can use one computer to chat on QQ while listening to music.

        Here we must first distinguish the concepts of parallelism and concurrency. Unlike parallelism, concurrent execution is not limited by the number of CPUs: the operating system hands out CPU time slices to each process in turn, which gives people the impression of parallel execution.

        So multithreading is essentially the process-level idea of multitasking brought inside a single program: one program executes multiple tasks at the same time. We can think of threads as "lightweight processes", and "lightweight" suggests, correctly, that the overhead of creating or destroying a thread is much smaller than that of a process.
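To make the "lightweight process" idea concrete, here is a minimal sketch of creating and running a thread in Java; the class name `ThreadDemo` and the worker-thread name are my own choices for illustration:

```java
public class ThreadDemo {
    // Runs a small task on a separate thread, waits for it to finish,
    // and returns the name of the thread that actually executed the task.
    static String runOnWorker() throws InterruptedException {
        final String[] seen = new String[1];
        Thread worker = new Thread(() -> seen[0] = Thread.currentThread().getName());
        worker.setName("worker-1");
        worker.start();   // begin concurrent execution
        worker.join();    // main thread blocks until the worker finishes
        return seen[0];
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("task ran on: " + runOnWorker());
    }
}
```

Starting a thread like this costs far less than forking a whole new process, because the new thread shares the program's address space rather than getting its own.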

        Now to the real difference between the two: data sharing. Each process has its own independent variables, while the threads of a process share data. Sharing data sounds dangerous, and it is. So how do threads share data safely — in other words, how should threads communicate, and how do we resolve competition for resources? This involves synchronization, locking mechanisms, and related topics. These areas are also favorite interview territory, so I won't expand on them here for the time being.
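To illustrate why shared data is dangerous and how a lock resolves the competition, here is a minimal sketch using Java's built-in `synchronized` keyword; the class name `SharedCounter` is my own for illustration:

```java
public class SharedCounter {
    private int count = 0;

    // count++ is really read-modify-write: without synchronization, two threads
    // can read the same old value and one increment is lost (a data race).
    // synchronized makes the increment atomic and its result visible to other threads.
    public synchronized void increment() { count++; }

    public synchronized int get() { return count; }

    public static void main(String[] args) throws InterruptedException {
        SharedCounter counter = new SharedCounter();
        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) counter.increment();
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        // With synchronized this is always 20000; without it, it could be less.
        System.out.println("count = " + counter.get());
    }
}
```

Removing `synchronized` from `increment()` is an easy way to see lost updates for yourself: the final count will usually come out below 20000.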

        Multithreading is also very useful in practice. For example, QQ lets us chat with several different people at the same time, and a web server can handle many concurrent requests at once.

Concurrency

        Before concurrency, most of us wrote sequential programs: at any moment, the program could execute only one step of one task. So why do we need concurrent programming? Beyond handling multiple tasks, as described above, there is another very important reason: to improve performance on a single processor.

        At first glance this claim seems to contradict itself. Compared with a single thread, concurrent execution across multiple threads actually adds the overhead of context switching. Shouldn't concurrency therefore reduce performance on a single processor rather than improve it?

        The answer lies in blocking. When a task blocks in a single-threaded program, all subsequent tasks stall behind it, and in the worst case the entire program appears to hang. With concurrency, a task that blocks does so in its own thread while the other threads keep executing, unaffected. Indeed, from a pure performance standpoint, if no task ever blocks, there is no point in using concurrency on a single-processor machine.
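The point about blocking can be sketched with two threads: one simulates a task stuck on a long wait (standing in for blocking I/O), while the other makes progress independently. The class name `BlockingDemo` and the timeout values are my own illustrative choices:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class BlockingDemo {
    // Returns true if the worker's task completes even while another
    // thread is blocked — which, with separate threads, it always does.
    static boolean otherWorkCompletes() throws InterruptedException {
        CountDownLatch done = new CountDownLatch(1);

        // Simulates a task blocked on a slow operation (e.g. waiting on I/O).
        Thread blocked = new Thread(() -> {
            try { Thread.sleep(60_000); } catch (InterruptedException ignored) {}
        });
        // An independent task that can finish immediately.
        Thread worker = new Thread(done::countDown);

        blocked.start();
        worker.start();

        // The worker finishes long before the timeout, even though
        // "blocked" is still sleeping in its own thread.
        boolean finished = done.await(5, TimeUnit.SECONDS);
        blocked.interrupt();  // wake and clean up the sleeping thread
        return finished;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("other work completed: " + otherWorkCompletes());
    }
}
```

In a single-threaded version of this program, the 60-second sleep would have to finish before the other task could even start.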

        The most direct way to achieve concurrency is to use processes at the operating-system level, but as mentioned above, each process has its own independent address space, and processes do not interfere with one another. By contrast, a concurrent system like Java shares resources such as memory and I/O within a single process, so the difficulty of writing multithreaded code lies in coordinating resource use among the different thread-driven tasks, so that no resource is accessed by multiple tasks at the same time.

        The thread mechanism in Java is preemptive, which means the scheduler periodically interrupts the running thread and switches context to another. It thus provides each thread with a time slice, so that every thread is allocated a reasonable amount of CPU time to drive its task.

Origin blog.csdn.net/weixin_43918614/article/details/123629494