Multithreading study notes (1) -- multithreading-related concepts

  • Foreword

When should we use parallelism? Can every scenario achieve the high performance we expect? At the end of 2014, in the "Avoiding ping pong" forum thread, Linus Torvalds put forward a completely different point of view: the whole "parallel computing is the future" idea is, in his words, "a bunch of crock". Designing and implementing parallel programs is extremely complex, not only in decomposing the program's work, but also because the coordination and nondeterministic interleaving of multiple threads easily become obstacles to correct execution. Yet sometimes parallelism is the natural tool. For example, to process a photo of 1024*768 pixels, traversing every pixel in a single thread takes a long time; splitting the traversal across multiple threads solves the problem easily.
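The pixel-traversal idea can be sketched in Java by splitting the image's rows across a thread pool. This is a minimal illustration, not a full image-processing pipeline; the class name `PixelSum` and the "sum every pixel" workload are made up for the example.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PixelSum {
    // Sum every pixel of an image, splitting the rows across nThreads workers.
    static long parallelSum(int[][] image, int nThreads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(nThreads);
        List<Future<Long>> parts = new ArrayList<>();
        int chunk = (image.length + nThreads - 1) / nThreads;
        for (int t = 0; t < nThreads; t++) {
            final int from = t * chunk;
            final int to = Math.min(from + chunk, image.length);
            parts.add(pool.submit(() -> {
                long s = 0;
                for (int r = from; r < to; r++)
                    for (int v : image[r]) s += v;
                return s;
            }));
        }
        long total = 0;
        for (Future<Long> f : parts) total += f.get(); // combine partial results
        pool.shutdown();
        return total;
    }

    public static void main(String[] args) throws Exception {
        int[][] image = new int[768][1024];
        for (int[] row : image) java.util.Arrays.fill(row, 1);
        System.out.println(parallelSum(image, 4)); // 786432 = 768 * 1024
    }
}
```

Each worker handles a disjoint block of rows, so the threads never touch the same data and need no synchronization beyond joining the futures.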

  • Synchronous and asynchronous

 Once a synchronous method call starts, the caller must wait until the call returns before continuing with subsequent work. An asynchronous method call is more like passing a message: once started, the call returns immediately and the caller can continue with subsequent operations; the whole process does not hold up the caller. If the asynchronous call needs to return a result, the caller is notified when the call actually completes.
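The contrast can be sketched in Java with `CompletableFuture` (the class name `SyncVsAsync` and the `slowSquare` workload are invented for this example):

```java
import java.util.concurrent.CompletableFuture;

public class SyncVsAsync {
    // Stand-in for any slow operation.
    static int slowSquare(int x) {
        try { Thread.sleep(50); } catch (InterruptedException e) { }
        return x * x;
    }

    public static void main(String[] args) {
        // Synchronous: the caller blocks here until slowSquare returns.
        int sync = slowSquare(6);

        // Asynchronous: supplyAsync returns immediately with a future;
        // the caller is free to do other work, and collects (or is
        // notified of) the result only when the call really completes.
        CompletableFuture<Integer> async =
                CompletableFuture.supplyAsync(() -> slowSquare(7));
        // ... the caller can do other work here ...
        int result = async.join(); // collect the result when needed
        System.out.println(sync + " " + result); // 36 49
    }
}
```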


  • Parallelism and Concurrency

 Concurrency means that multiple tasks make progress by executing alternately (interleaved on the same processor), while parallelism means that multiple tasks truly execute at the same time (on different processors or cores).


  • Critical section
 A critical section guards a common or shared resource that can be used by multiple threads, but by only one thread at a time. Once the critical-section resource is occupied, other threads must wait before they can use it.
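In Java, a critical section is commonly formed with `synchronized`; here is a minimal sketch (the `Counter` class is made up for illustration):

```java
public class Counter {
    private int count = 0;

    // The bodies of these methods are critical sections: the intrinsic
    // lock on `this` guarantees only one thread executes them at a time.
    public synchronized void increment() { count++; }
    public synchronized int get() { return count; }

    public static void main(String[] args) throws InterruptedException {
        Counter c = new Counter();
        Thread[] ts = new Thread[4];
        for (int i = 0; i < ts.length; i++) {
            ts[i] = new Thread(() -> {
                for (int j = 0; j < 10_000; j++) c.increment();
            });
            ts[i].start();
        }
        for (Thread t : ts) t.join();
        System.out.println(c.get()); // always 40000
    }
}
```

Without `synchronized`, the interleaved read-modify-write of `count++` would lose updates and the final total would usually fall short of 40000.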
  • Blocking and non-blocking
 These terms usually describe how threads interact. If one thread occupies a critical-section resource, all other threads that need the resource must wait outside the critical section, and a waiting thread is suspended: this is blocking. Until the resource is released, none of the threads waiting on that critical section can make progress. Non-blocking means that threads do not hold each other up in this way: every thread can keep attempting to move forward.
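The difference shows up directly in Java's `ReentrantLock` API: `lock()` blocks (suspends the thread) until the lock is free, while `tryLock()` is the non-blocking variant that returns `false` immediately. A small sketch (the class and method names are invented):

```java
import java.util.concurrent.locks.ReentrantLock;

public class TryLockDemo {
    // Holds the lock on the current thread, then has a worker attempt a
    // non-blocking acquisition; returns what tryLock reported.
    static boolean tryWhileHeld() throws InterruptedException {
        ReentrantLock lock = new ReentrantLock();
        lock.lock(); // this thread occupies the "critical section"
        final boolean[] acquired = new boolean[1];
        Thread worker = new Thread(() -> {
            // lock.lock() here would BLOCK until the owner released it.
            // tryLock() instead fails fast, so the worker can go do
            // something else and keep making progress.
            acquired[0] = lock.tryLock();
        });
        worker.start();
        worker.join();
        lock.unlock();
        return acquired[0];
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(tryWhileHeld()); // false: resource was busy
    }
}
```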
  • Deadlock, starvation, and livelock

 Deadlock: the most serious situation. Several threads each hold some resources while waiting for resources held by the others; since no thread releases what it holds, none of them can proceed.

Starvation: a thread is unable to obtain the resources it needs for a long time (for example, because its priority is too low), so it cannot continue execution.

Livelock: where deadlocked threads refuse to release resources to each other, livelocked threads keep releasing them to each other, so the resources continuously bounce between the two threads and neither ever holds everything it needs to execute.
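A classic way deadlock arises is two threads acquiring the same two locks in opposite orders; the standard fix is to impose one global acquisition order. A minimal sketch of the fixed version (class and lock names are invented):

```java
import java.util.concurrent.locks.ReentrantLock;

public class LockOrdering {
    static final ReentrantLock A = new ReentrantLock();
    static final ReentrantLock B = new ReentrantLock();

    // If one thread took A-then-B while another took B-then-A, each could
    // end up holding one lock while waiting forever for the other: deadlock.
    // Acquiring the locks in the SAME order in every thread removes the
    // circular wait, so this method always completes.
    static void transfer() {
        A.lock();
        try {
            B.lock();
            try {
                // ... work with both resources ...
            } finally { B.unlock(); }
        } finally { A.unlock(); }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread t1 = new Thread(LockOrdering::transfer);
        Thread t2 = new Thread(LockOrdering::transfer);
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println("done, no deadlock");
    }
}
```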

  • Concurrency levels
 Because critical sections exist, concurrency between multiple threads must be controlled. According to the strategy used to control it, concurrency is commonly classified into levels: blocking, starvation-free, obstruction-free, lock-free, and wait-free.
  • Starvation-free
 If the lock is unfair, a high-priority thread can jump the queue to access the critical section, and a low-priority thread may never get its turn, i.e. it starves. A fair lock, which serves threads in arrival order, is starvation-free.
  • Obstruction-free
 Obstruction-free means multiple threads can enter the critical section without any thread being suspended because of it. But if several threads modify shared data and a conflict is detected, a thread immediately rolls back its own modifications; this can lead to a thread continually rolling back its current operation. A feasible obstruction-free implementation can rely on a "consistency mark": a thread reads and saves the mark before operating; after the operation completes, it reads the mark again and compares the two values. If they are the same, there was no conflict; if they differ, the data was modified concurrently and the operation must be retried.
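Java's `StampedLock` offers exactly this read-mark-then-validate pattern via optimistic reads. A minimal sketch of the idea (the `OptimisticRead` class and its `x`/`y` fields are made up; note this particular class still falls back to a real read lock on conflict rather than looping):

```java
import java.util.concurrent.locks.StampedLock;

public class OptimisticRead {
    private final StampedLock lock = new StampedLock();
    private int x, y;

    void move(int dx, int dy) {
        long stamp = lock.writeLock();
        try { x += dx; y += dy; } finally { lock.unlockWrite(stamp); }
    }

    // Read the "consistency mark" (the stamp) before reading the data,
    // then re-check it afterwards; if a writer intervened, the values
    // read may be inconsistent, so retry under a real read lock.
    int sum() {
        long stamp = lock.tryOptimisticRead(); // save the mark
        int curX = x, curY = y;
        if (!lock.validate(stamp)) {           // mark changed: conflict
            stamp = lock.readLock();
            try { curX = x; curY = y; } finally { lock.unlockRead(stamp); }
        }
        return curX + curY;
    }
}
```

On the happy path (no concurrent writer), `sum()` completes without acquiring any lock at all.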
  • Wait-free
 Wait-free requires that every thread completes its operation within a finite number of steps.
  • Lock-free
  Lock-free concurrency is obstruction-free: all threads may attempt to access the critical section. In addition, lock-free guarantees that some thread always completes its operation and leaves the critical section within a finite number of steps, so the system as a whole makes progress.
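The canonical lock-free building block is compare-and-swap (CAS), available in Java through the atomic classes. A minimal sketch (the `CasCounter` class is invented; `AtomicInteger` already provides `incrementAndGet`, the loop is written out to show the mechanism):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class CasCounter {
    private final AtomicInteger value = new AtomicInteger(0);

    // Lock-free increment: every thread may attempt the CAS. A failed
    // attempt means some OTHER thread's CAS succeeded, so the system as
    // a whole makes progress even though this thread must retry.
    int increment() {
        int old;
        do {
            old = value.get();
        } while (!value.compareAndSet(old, old + 1));
        return old + 1;
    }

    int get() { return value.get(); }

    public static void main(String[] args) throws InterruptedException {
        CasCounter c = new CasCounter();
        Thread[] ts = new Thread[4];
        for (int i = 0; i < ts.length; i++) {
            ts[i] = new Thread(() -> {
                for (int j = 0; j < 10_000; j++) c.increment();
            });
            ts[i].start();
        }
        for (Thread t : ts) t.join();
        System.out.println(c.get()); // always 40000, with no locks
    }
}
```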
  • Three properties (atomicity, visibility, ordering)

 Atomicity: an operation is uninterruptible; even when multiple threads execute concurrently, once the operation starts it cannot be interfered with by other threads.

 Visibility: when one thread modifies the value of a shared variable, whether other threads can immediately see the modification.

 Ordering: under concurrency, the compiler and processor may reorder instructions, which can cause instructions to execute in a different order than the program text suggests.
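The visibility property can be illustrated with Java's `volatile` keyword, which guarantees that a write by one thread is seen by subsequent reads in other threads. A minimal sketch (the `VisibleFlag` class is made up):

```java
public class VisibleFlag {
    // Without `volatile`, the worker might keep reading a stale cached
    // copy of `running` and spin forever; `volatile` makes the write in
    // runAndStop() visible to the worker.
    private static volatile boolean running = true;

    // Starts a spinning worker, then flips the flag; returns true if
    // the worker observed the write and terminated.
    static boolean runAndStop() throws InterruptedException {
        running = true;
        Thread worker = new Thread(() -> {
            while (running) { /* spin until the write becomes visible */ }
        });
        worker.start();
        Thread.sleep(50);   // let the worker enter its loop
        running = false;    // volatile write: visible to the worker
        worker.join(5000);  // the worker should stop almost immediately
        return !worker.isAlive();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runAndStop()); // true
    }
}
```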

  • Thread safety
When multiple threads access a class (object or method) in any interleaving, and the class always exhibits correct behavior without extra coordination by the callers, the class is thread-safe.
