The difference between pessimistic locking and optimistic locking in Java

Table of contents

1. Pessimistic lock

2. Optimistic lock

3. The difference between pessimistic locking and optimistic locking


1. Pessimistic lock

A pessimistic lock is a traditional thread-synchronization approach. It assumes that other threads may modify the data at any point while it is being processed, so the shared resource is locked before it is accessed. This keeps other threads from modifying the data concurrently and ensures data safety.

There are many ways to implement pessimistic locking; in Java the most common are the synchronized keyword and the ReentrantLock class. Both lock an entire code block or method so that only one thread can access the shared resource at a time, and other threads must wait for the lock to be released before they can proceed.

The disadvantage of pessimistic locking is lower efficiency, because every thread must acquire the lock before accessing the shared resource. Under heavy lock contention, large numbers of threads end up blocked, which reduces the program's concurrency. For this reason, non-blocking approaches such as optimistic locking are preferred where possible to improve the efficiency and scalability of the program.

The following is sample code using a pessimistic lock in Java (synchronized):

public class Counter {
    private int count = 0;

    public synchronized void increment() {
        count++;
    }

    public synchronized int getCount() {
        return count;
    }
}

In this example, we define a Counter class that maintains a count. Both increment() and getCount() are declared synchronized, which locks the entire method and ensures that only one thread at a time can access the count variable.

Because synchronized locks the entire method, this implementation is not very efficient. If multiple threads call increment() or getCount() at the same time, some of them will block, reducing the program's throughput. In practice it is therefore worth preferring non-blocking alternatives such as optimistic locking where they fit, to improve efficiency and scalability.
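
For comparison, here is the same counter guarded by ReentrantLock, the other pessimistic mechanism mentioned above. This is a minimal sketch (the LockCounter class name is ours); unlock() is placed in a finally block so the lock is released even if the guarded code throws:

import java.util.concurrent.locks.ReentrantLock;

public class LockCounter {
    private final ReentrantLock lock = new ReentrantLock();
    private int count = 0;

    public void increment() {
        lock.lock();            // block until the lock is acquired
        try {
            count++;
        } finally {
            lock.unlock();      // always release the lock
        }
    }

    public int getCount() {
        lock.lock();
        try {
            return count;
        } finally {
            lock.unlock();
        }
    }
}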

2. Optimistic lock

Optimistic locking is another form of concurrency control. It assumes that the probability of a data conflict is relatively small, so it does not lock when accessing the shared resource. Instead, when updating the data it checks whether the data has been modified by another thread: if not, the update succeeds; otherwise the operation must be retried.

Optimistic locking is usually implemented with a version number or a timestamp. When a thread reads the data, it records the current version number or timestamp; whenever another thread modifies the same data, it increments the version number or refreshes the timestamp. When the thread goes to update the data, it compares the version number or timestamp it recorded with the one currently in the database. If they match, no other thread has modified the data and the update succeeds; otherwise the thread must re-read the data and try the update again.
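
To make the version-number idea concrete, here is a minimal in-memory sketch (the VersionedAccount class and its fields are invented for illustration; in a real application the check-and-update step is usually performed atomically by the database, for example an UPDATE statement with a WHERE version = ? condition, and the short synchronized methods below only stand in for that atomicity):

public class VersionedAccount {
    private long balance = 0;
    private long version = 0;

    // Read a snapshot of the value together with its version.
    public synchronized long[] read() {
        return new long[] { balance, version };
    }

    // Apply the update only if the version has not changed since the read.
    public synchronized boolean tryUpdate(long expectedVersion, long newBalance) {
        if (version != expectedVersion) {
            return false;          // another thread modified the data; the caller must retry
        }
        balance = newBalance;
        version++;                 // bump the version so later updates detect this change
        return true;
    }

    // Optimistic retry loop: re-read and re-apply until the update goes through.
    public void deposit(long amount) {
        while (true) {
            long[] snapshot = read();   // snapshot[0] = balance, snapshot[1] = version
            if (tryUpdate(snapshot[1], snapshot[0] + amount)) {
                return;
            }
        }
    }
}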

Compared with pessimistic locking, optimistic locking can improve the concurrency and throughput of a system, but it also carries some risk. If the probability of data conflicts is high, it can lead to a large number of retries and failed operations. When using optimistic locking, you therefore need to weigh the likelihood of conflicts against the cost of retrying, and choose an appropriate version-number or timestamp mechanism to guarantee correctness.

Optimistic locking is a non-blocking technique. It assumes that no other thread will modify the data while the operation is in progress, so no lock is taken, which improves the program's concurrency and efficiency. The optimistic-locking implementation most commonly used in Java is based on the CAS (Compare and Swap) mechanism. The following is a simple Java code sample:

import java.util.concurrent.atomic.AtomicInteger;

public class OptimisticLockExample {

    private AtomicInteger count = new AtomicInteger(0);

    public void increment() {
        int expectedValue, newValue;
        do {
            expectedValue = count.get();        // read the current value
            newValue = expectedValue + 1;       // compute the intended new value
        } while (!count.compareAndSet(expectedValue, newValue)); // retry if another thread changed it first
    }

    public static void main(String[] args) throws InterruptedException {
        OptimisticLockExample example = new OptimisticLockExample();
        Thread thread1 = new Thread(() -> {
            for (int i = 0; i < 1000; i++) {
                example.increment();
            }
        });

        Thread thread2 = new Thread(() -> {
            for (int i = 0; i < 1000; i++) {
                example.increment();
            }
        });

        thread1.start();
        thread2.start();

        thread1.join();
        thread2.join();

        System.out.println("Count: " + example.count);
    }
}

In this example, we use the AtomicInteger class as the counter and the CAS mechanism to implement optimistic locking. The increment() method reads the current value, computes the new value, and then calls compareAndSet(): if the counter still holds the expected value the update is applied, otherwise the loop re-reads and retries until it succeeds. In main(), we start two threads that each call increment() 1000 times. The final output is Count: 2000, which shows that no updates were lost and the optimistic lock works correctly.

3. The difference between pessimistic locking and optimistic locking

Pessimistic locking and optimistic locking in Java are two different concurrency control methods.

  • pessimistic lock

Pessimistic locking means that when a shared resource is accessed, a lock is taken so that only one thread can access the resource at a time. Both the synchronized keyword and the ReentrantLock class in Java are implementations of pessimistic locking.

The advantage of pessimistic locking is that it is easy to use and avoids data conflicts. However, acquiring and releasing locks costs performance, and when many threads frequently compete for the same lock it can easily lead to performance bottlenecks and even deadlock.

  • optimistic lock

Optimistic locking means that when accessing the shared resource it is assumed the data will not conflict, so no lock is taken; instead, when the data is updated, a check is made to see whether it has been modified by another thread. The CAS (Compare and Swap) mechanism in Java is an implementation of optimistic locking.

The advantage of optimistic locking is that there is no locking overhead, which can improve performance. However, an appropriate version-number or timestamp mechanism must be chosen for the specific business scenario to guarantee data correctness, and if the probability of data conflicts is high, it can cause many retries and failed operations.
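
When a version number is combined with CAS in Java itself, the JDK offers AtomicStampedReference, which pairs a value with an integer stamp. The sketch below (the StampedCounter class is our own illustration, not part of the original article) retries the update until both the value and the stamp are unchanged, which also protects against the ABA problem:

import java.util.concurrent.atomic.AtomicStampedReference;

public class StampedCounter {
    // The value and a stamp that serves as a version number.
    private final AtomicStampedReference<Integer> count =
            new AtomicStampedReference<>(0, 0);

    public void increment() {
        while (true) {
            int[] stampHolder = new int[1];
            Integer current = count.get(stampHolder);   // read value and version together
            int stamp = stampHolder[0];
            // The update succeeds only if both the value and the version are unchanged.
            if (count.compareAndSet(current, current + 1, stamp, stamp + 1)) {
                return;
            }
            // Another thread got in first: loop and retry.
        }
    }

    public int get() {
        return count.getReference();
    }
}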

In short, both pessimistic locking and optimistic locking are important means of concurrency control in Java, and the right mechanism depends on the situation. If the shared resource is accessed frequently or the probability of data conflicts is high, pessimistic locking is a reasonable choice; if access is relatively infrequent or conflicts are rare, optimistic locking is usually the better fit.

Origin blog.csdn.net/2301_77899321/article/details/131145380