Is Java Concurrent Programming Necessarily Fast?

Preface

When doing multithreaded programming, we hope the program will run faster, but there are also many challenges, such as context switching, deadlock, and resource constraints imposed by hardware and software.
This article introduces these challenges:

  • The concept of context switching in a multithreaded environment
  • Whether multithreading is always fast, given that thread creation and context switching take time and can make it slower than serial execution
  • How to measure context switching time and the number of context switches
  • How to reduce context switching
  • How to solve the deadlock challenge
  • How to solve the challenge of resource constraints

What is context switching

The CPU executes tasks in turn using a time-slice allocation algorithm: after the current task runs for one time slice, the CPU switches to the next task. Before switching, the state of the current task is saved, so that the next time the CPU switches back to it, its state can be loaded again. This process, from saving a task's state to reloading it, is one context switch.

In a multithreaded environment, when a thread's state changes from Runnable to a non-Runnable state (Blocked, Waiting, Timed_Waiting), the thread's context information (including the contents of the CPU registers and the program counter at that point in time) must be saved, so that when the thread becomes Runnable again later it can continue from its previous execution progress. Conversely, moving a thread from a non-Runnable state back to Runnable may involve restoring the previously saved context information. This process of saving and restoring a thread's context is called context switching.

Context switching brings additional overhead, including the cost of saving and restoring thread context information, the CPU time spent on thread scheduling, and the cost of CPU cache invalidation.

Figure: thread states and context switching

Is multithreading fast?

Because of the overhead of thread creation and context switching (threads being switched in and out), using multiple threads is not necessarily faster than executing the same work serially.

  • Lmbench3 can be used to measure the duration of a context switch
  • vmstat can be used to measure the number of context switches
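
As a quick illustration that concurrency has a cost, here is a minimal benchmark sketch (the class name, loop bodies, and loop count are arbitrary): it runs two independent accumulation loops once concurrently on two threads and once serially on one thread. With a small COUNT, thread creation and switching overhead tends to make the concurrent version slower; where the crossover happens depends on the machine.

```java
public class ConcurrencyTest {
    private static final long COUNT = 100_000_000L; // lower this to see the concurrent version lose

    public static void main(String[] args) throws InterruptedException {
        concurrent();
        serial();
    }

    // The two loops run at the same time: one on a new thread, one on the main thread.
    private static void concurrent() throws InterruptedException {
        long start = System.currentTimeMillis();
        Thread thread = new Thread(() -> {
            long a = 0;
            for (long i = 0; i < COUNT; i++) {
                a += 5;
            }
        });
        thread.start();
        long b = 0;
        for (long i = 0; i < COUNT; i++) {
            b--;
        }
        thread.join(); // wait for the other loop to finish
        System.out.println("concurrent: " + (System.currentTimeMillis() - start) + " ms");
    }

    // The same two loops run one after the other on a single thread.
    private static void serial() {
        long start = System.currentTimeMillis();
        long a = 0;
        for (long i = 0; i < COUNT; i++) {
            a += 5;
        }
        long b = 0;
        for (long i = 0; i < COUNT; i++) {
            b--;
        }
        System.out.println("serial: " + (System.currentTimeMillis() - start) + " ms");
    }
}
```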

How to reduce context switching

  1. Lock-free concurrent programming
    When multiple threads compete for a lock, context switching occurs, so when processing data with multiple threads you can use techniques that avoid locks altogether. For example, segment the data by the hash of its ID so that each thread processes a different segment (see the first sketch after this list).
  2. The CAS algorithm
    Modern processors provide efficient machine-level atomic instructions that perform read-modify-write operations on memory atomically. Java's java.util.concurrent.atomic package updates data based on CAS, without locking (see the second sketch after this list).
  3. Use as few threads as possible
    Avoid creating unnecessary threads.
  4. Coroutines
    Implement multi-task scheduling within a single thread, and switch between multiple tasks while staying on that single thread.
  5. Where volatile suffices, do not use synchronized
    Used properly, volatile is cheaper to use and execute than synchronized, because it does not cause thread context switching and scheduling.
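
For the segmentation idea in item 1, here is a minimal, hypothetical sketch (the SegmentedProcessor class and its methods are illustrative, not from any library): each worker thread handles only the IDs whose hash falls into its own segment, so no two threads touch the same data and no lock is needed.

```java
import java.util.List;

// Hypothetical sketch: partition work by ID hash so threads never share data.
public class SegmentedProcessor {
    private static final int THREADS = 4;

    public void process(List<String> ids) throws InterruptedException {
        Thread[] workers = new Thread[THREADS];
        for (int t = 0; t < THREADS; t++) {
            final int segment = t;
            workers[t] = new Thread(() -> {
                for (String id : ids) {
                    // Each thread handles only the IDs that hash into its segment,
                    // so no two threads process the same ID and no lock is needed.
                    if (Math.floorMod(id.hashCode(), THREADS) == segment) {
                        handle(id);
                    }
                }
            });
            workers[t].start();
        }
        for (Thread w : workers) {
            w.join();
        }
    }

    private void handle(String id) {
        System.out.println(Thread.currentThread().getName() + " processed " + id);
    }
}
```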
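
For item 2, the classes in java.util.concurrent.atomic update values with CAS instead of locking. A minimal sketch of a lock-free counter (the CasCounter class itself is illustrative):

```java
import java.util.concurrent.atomic.AtomicInteger;

// Minimal sketch of a lock-free counter backed by CAS.
public class CasCounter {
    private final AtomicInteger value = new AtomicInteger(0);

    // incrementAndGet() retries a CAS loop internally; there is no synchronized block,
    // so a losing thread simply retries instead of being parked and switched out.
    public int increment() {
        return value.incrementAndGet();
    }

    // An explicit CAS loop, equivalent to what the atomic classes do internally.
    public void add(int delta) {
        for (;;) {
            int current = value.get();
            int next = current + delta;
            if (value.compareAndSet(current, next)) {
                return;
            }
        }
    }

    public int get() {
        return value.get();
    }
}
```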

Deadlock

Deadlock refers to the phenomenon of multiple threads blocking forever while waiting for each other because they are contending for resources.
When two or more units of execution each wait for the other to release a system resource before they can continue, and none of them gives up first, the result is a deadlock: all the threads involved stay in the waiting state.
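
A minimal sketch of the classic case (the thread and lock names are illustrative): two threads each hold one lock and wait forever for the lock held by the other.

```java
// Classic deadlock sketch: two threads acquire the same two locks in opposite order.
public class DeadLockDemo {
    private static final Object LOCK_A = new Object();
    private static final Object LOCK_B = new Object();

    public static void main(String[] args) {
        new Thread(() -> {
            synchronized (LOCK_A) {
                sleep(100); // give the other thread time to grab LOCK_B
                synchronized (LOCK_B) { // blocks forever: LOCK_B is held by thread-2
                    System.out.println("thread-1 done");
                }
            }
        }, "thread-1").start();

        new Thread(() -> {
            synchronized (LOCK_B) {
                sleep(100);
                synchronized (LOCK_A) { // blocks forever: LOCK_A is held by thread-1
                    System.out.println("thread-2 done");
                }
            }
        }, "thread-2").start();
    }

    private static void sleep(long millis) {
        try {
            Thread.sleep(millis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```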

  • Common ways to avoid:
    1. Avoid one thread acquiring multiple locks at the same time
    2. Avoid one thread holding multiple resources inside a lock at the same time; try to ensure that each lock guards only one resource
    3. Prefer timed locks: use lock.tryLock(timeout) instead of the intrinsic lock mechanism (see the sketch after this list)
    4. For database locks, locking and unlocking must happen on the same database connection, otherwise the unlock will fail
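
For point 3, java.util.concurrent.locks.ReentrantLock provides tryLock(long, TimeUnit), which gives up after a timeout instead of waiting forever, so a thread that cannot get the second lock can release the first and retry rather than deadlock. A minimal sketch (the TimedLockTransfer class and its locks are illustrative):

```java
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

// Sketch: acquire two locks with a timeout; back off and retry instead of deadlocking.
public class TimedLockTransfer {
    private final ReentrantLock from = new ReentrantLock();
    private final ReentrantLock to = new ReentrantLock();

    public boolean transfer(Runnable action) throws InterruptedException {
        while (true) {
            if (from.tryLock(1, TimeUnit.SECONDS)) {
                try {
                    if (to.tryLock(1, TimeUnit.SECONDS)) {
                        try {
                            action.run(); // both locks held: do the work
                            return true;
                        } finally {
                            to.unlock();
                        }
                    }
                } finally {
                    from.unlock();
                }
            }
            // Could not get both locks; back off briefly before retrying.
            Thread.sleep(10);
        }
    }
}
```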

The challenge of resource constraints

  • What are resource constraints?
    Resource constraints mean that, in concurrent programming, the execution speed of the program is limited by the computer's hardware or software resources. These limits have to be taken into account when writing concurrent code.

  • What resources are limited?
    Hardware resource limits include: upload/download bandwidth, hard disk read/write speed, the number of CPU cores, CPU processing speed, memory size, and so on.
    Software resource limits include: the number of database connections, the number of socket connections, and so on.

  • Problems caused by resource constraints
    In concurrent programming, the principle behind speeding up code is to turn serially executed parts of the code into concurrently executed ones. However, if a piece of code is still forced to run serially because of resource constraints, making it concurrent does not speed it up; it only adds context switching and resource scheduling time, and the program ends up slower.

  • How to solve the problem of resource constraints
    For hardware resource constraints, consider using a cluster to execute the program in parallel across machines.
    For software resource constraints, consider using resource pools to reuse resources (see the sketch at the end of this section).

  • Concurrent programming under resource constraints
    Adjust the program's degree of concurrency according to the different resource constraints, so that the program executes as fast as possible.
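
As a minimal sketch of the last two points (reusing resources through a pool and adjusting concurrency to the resource limit), the example below sizes a fixed thread pool from an assumed database connection limit and the number of CPU cores; the class name and numbers are illustrative.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Sketch: bound concurrency to the limiting resource instead of spawning unbounded threads.
public class BoundedConcurrencyDemo {
    public static void main(String[] args) {
        // Assume the database allows at most 10 connections; never run more
        // concurrent tasks than that, or than the CPU can usefully serve.
        int maxDbConnections = 10;
        int poolSize = Math.min(maxDbConnections, Runtime.getRuntime().availableProcessors());

        ExecutorService pool = Executors.newFixedThreadPool(poolSize);
        for (int i = 0; i < 100; i++) {
            final int taskId = i;
            pool.submit(() ->
                System.out.println("task " + taskId + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown(); // queued tasks still run; no new tasks are accepted
    }
}
```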

Origin blog.csdn.net/u014099894/article/details/102792159