Java SE (19): Threads




1. Multithreading

What is a process

A process is a task running in the operating system (an application program runs as a process). A process is a memory area containing certain resources; the operating system uses processes to divide its work into functional units. The execution units contained in a process are called threads (a process contains one or more of them). Each process also has a private virtual address space, which can only be accessed by the threads it contains.

Thread use cases

Threads are typically used when a program needs to perform several tasks at the same time: each task is defined as a thread so that they can run side by side.
Threads are also useful when a job could be completed in a single thread but finishes faster with several, such as downloading a file in multiple parts.

Fundamentals of threads

One of the advantages of the Java language is that it makes thread handling relatively simple.
Operating systems generally support running multiple tasks at the same time. A task is usually a program, and each running program is a process. While a program runs it may contain multiple sequential flows of execution, and each such flow is a thread.
Program: a sequence of instructions plus data, for example qq.exe.
Process: a running program, i.e. the dynamic execution of a program (loaded into memory and running).
Thread: a sequential flow of execution running concurrently inside a process (a running method in Java can be viewed as a thread).
Concurrency: processes run concurrently. The OS divides CPU time into many time slices and distributes them as evenly as possible among the running programs. Microscopically each process keeps stopping and resuming, while macroscopically all of them appear to be running. This is concurrency, which is not "simultaneous" in the absolute sense.

Introduction to the Thread class

The Thread class represents a thread.
Any thread object is an instance of Thread (or of a subclass).
The Thread class acts as a thread template: it encapsulates the complex work of starting a thread and hides operating-system differences. A concrete thread is obtained by overriding the run method.

Creating threads with the Thread class

To create a concrete thread, extend the Thread class.
Override the run method to implement the task the thread should perform.
Create an instance of the subclass (one instance is one thread).
Call the instance's start() method to start the thread; once started, the thread executes run() concurrently as soon as it is scheduled, as sketched below.
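
A minimal sketch of this approach (the class name and printed text are illustrative, not from the original article):

class MyThread extends Thread {
    @Override
    public void run() {
        // this code runs in the new thread
        for (int i = 0; i < 3; i++) {
            System.out.println(getName() + " running, i=" + i);
        }
    }
}

public class ThreadDemo {
    public static void main(String[] args) {
        MyThread t = new MyThread();      // create a thread instance
        t.start();                        // start it; the JVM calls run() concurrently
        System.out.println("main continues while MyThread runs");
    }
}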

Implementing threads with Runnable

Create a class that implements the Runnable interface and override its run method,
then pass an instance of that class to the Thread constructor when creating the Thread object.
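
A brief sketch, with illustrative class names:

class DownloadTask implements Runnable {
    @Override
    public void run() {
        System.out.println(Thread.currentThread().getName() + " is running the task");
    }
}

public class RunnableDemo {
    public static void main(String[] args) {
        // the Runnable instance is passed as the Thread constructor's argument
        Thread worker = new Thread(new DownloadTask());
        worker.start();

        // Runnable is a functional interface, so a lambda works as well
        new Thread(() -> System.out.println("lambda task")).start();
    }
}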

Thread.currentThread method

Thread's static method currentThread returns the thread that is executing the current code fragment:
Thread current = Thread.currentThread();

Get thread information

Thread provides methods for obtaining information about a thread:
long getId(): returns the thread's identifier
String getName(): returns the thread's name
int getPriority(): returns the thread's priority
Thread.State getState(): returns the thread's state
boolean isAlive(): tests whether the thread is alive
boolean isDaemon(): tests whether the thread is a daemon thread
boolean isInterrupted(): tests whether the thread has been interrupted
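
A small sketch that prints this information for the current (main) thread:

public class ThreadInfoDemo {
    public static void main(String[] args) {
        Thread current = Thread.currentThread();   // the thread running main()
        System.out.println("id: " + current.getId());
        System.out.println("name: " + current.getName());
        System.out.println("priority: " + current.getPriority());
        System.out.println("state: " + current.getState());
        System.out.println("alive: " + current.isAlive());
        System.out.println("daemon: " + current.isDaemon());
        System.out.println("interrupted: " + current.isInterrupted());
    }
}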

Thread priority

Thread switching is controlled by the thread scheduler, and our code cannot control it directly; the most we can do is raise a thread's priority to improve its chances of being scheduled. Thread priority has 10 levels, with values 1 to 10, where 1 is the lowest and 10 is the highest. Thread provides 3 constants for the lowest, highest, and default priorities:

Thread.MIN_PRIORITY
Thread.MAX_PRIORITY
Thread.NORM_PRIORITY
void setPriority(int priority): Set the priority of the thread
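
A short sketch of setting priorities; the actual scheduling effect depends on the platform, so the output order is not guaranteed:

public class PriorityDemo {
    public static void main(String[] args) {
        Thread high = new Thread(() -> System.out.println("high-priority task"));
        Thread low = new Thread(() -> System.out.println("low-priority task"));
        high.setPriority(Thread.MAX_PRIORITY);   // 10
        low.setPriority(Thread.MIN_PRIORITY);    // 1
        low.start();
        high.start();
    }
}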

Sleep method

Thread.sleep(millis) makes the current thread give up the processor, leave the Running state and enter the Blocked state, sleep for the given number of milliseconds, and then return to Runnable.
The method declares that it throws InterruptedException, so that exception must be caught (or declared) when the method is used.
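
A minimal sketch of a countdown that sleeps one second per step:

public class SleepDemo {
    public static void main(String[] args) {
        for (int i = 3; i > 0; i--) {
            System.out.println(i);
            try {
                Thread.sleep(1000);          // block the current thread for about 1 second
            } catch (InterruptedException e) {
                e.printStackTrace();         // sleep declares InterruptedException
            }
        }
        System.out.println("done");
    }
}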

Yield method

Thread's static method yield:
static void yield()
This method makes the current thread voluntarily give up its current CPU time slice and return to the Runnable state, where it waits for a time slice to be allocated again.

Interrupt method

One thread can wake up another thread that is blocked in sleep ahead of time by calling its interrupt() method; the sleeping thread's sleep is cut short with an InterruptedException.
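
A brief sketch of waking a sleeping thread early (the timings are illustrative):

public class InterruptDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread sleeper = new Thread(() -> {
            try {
                System.out.println("sleeping for 10 seconds...");
                Thread.sleep(10_000);
            } catch (InterruptedException e) {
                System.out.println("woken up early by interrupt()");
            }
        });
        sleeper.start();
        Thread.sleep(1000);    // give the sleeper time to fall asleep
        sleeper.interrupt();   // ends the sleeper's sleep with an InterruptedException
    }
}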

Join method

Thread's join method: void join()
Calling t.join() makes the current thread wait until thread t has finished.
The method declares that it throws InterruptedException.
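
A small sketch in which main waits for a worker to finish:

public class JoinDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> System.out.println("worker finished its task"));
        worker.start();
        worker.join();   // main blocks here until the worker has ended
        System.out.println("main continues after the worker");
    }
}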

Daemon thread

void setDaemon(boolean on)
Marks the thread as either a daemon thread or a user thread. When all of the threads still running are daemon threads, the Java virtual machine exits.
This method must be called before the thread is started.
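
A brief sketch; because only a daemon thread remains when main returns, the JVM may exit before the daemon prints anything, which is exactly the point:

public class DaemonDemo {
    public static void main(String[] args) {
        Thread daemon = new Thread(() -> {
            while (true) {
                System.out.println("daemon working in the background...");
                try { Thread.sleep(500); } catch (InterruptedException e) { return; }
            }
        });
        daemon.setDaemon(true);   // must be called before start()
        daemon.start();
        System.out.println("main ends; once only daemon threads remain, the JVM exits");
    }
}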

The concept of thread synchronization

Thread synchronization can be understood as cooperation between threads A and B: A runs until it needs some result from B, so it pauses and lets B run; B runs, produces the result and hands it to A; A then continues. "Synchronous" means that once a call is issued it does not return until the result is available, and other threads cannot call that method in the meantime.
1) Asynchronous: concurrent, each doing its own thing. Example: a group of people climbing onto a truck from all sides.
2) Synchronous: processed in a fixed order. Example: a group of people boarding a bus one at a time through the door.

The synchronized keyword

When multiple threads read and write the same critical resource concurrently, thread-safety ("concurrent access") problems occur.
Common critical resources:

  • instance variables shared by multiple threads
  • public static variables

A synchronized block solves such concurrency problems:
synchronized (monitor object) { ... }
The synchronization monitor can be any object instance; it acts as a mutual-exclusion lock between threads, and multiple threads must use the same monitor object to achieve mutual exclusion.
A common form is synchronized(this) { ... }.
If the whole body of a method needs to be synchronized, the method itself can simply be declared synchronized, which is equivalent to wrapping the entire method body in synchronized(this).
Keep the synchronized scope as small as possible to improve concurrency.
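
A minimal sketch of protecting a shared counter (the class and field names are illustrative):

public class Counter {
    private int count = 0;          // critical resource shared by threads

    public void increment() {
        synchronized (this) {       // every thread must lock the same monitor object
            count++;                // the read-modify-write is now mutually exclusive
        }
    }

    // equivalent form: the synchronized modifier locks "this" for the whole method
    public synchronized int get() {
        return count;
    }
}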

Thread-safe API vs non-thread-safe API

StringBuffer's append() is synchronized, so multiple threads cannot enter the method at the same time.
StringBuilder's append() is not synchronized.
Vector and Hashtable are synchronized.
ArrayList and HashMap are not synchronized.
Ways to obtain a thread-safe collection:
Collections.synchronizedList() returns a thread-safe List.
Collections.synchronizedMap() returns a thread-safe Map.
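
A short sketch of wrapping ordinary collections in their synchronized views:

import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SafeCollections {
    public static void main(String[] args) {
        // wrap non-thread-safe collections so that their methods become synchronized
        List<String> safeList = Collections.synchronizedList(new ArrayList<>());
        Map<String, Integer> safeMap = Collections.synchronizedMap(new HashMap<>());
        safeList.add("hello");
        safeMap.put("count", 1);
        System.out.println(safeList + " " + safeMap);
    }
}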

Introduction to wait and notify

Multiple threads sometimes need to coordinate with one another.
For example, when a browser displays a picture, the display thread displayThread must wait until the download thread downloadThread has finished downloading it. If the picture has not been downloaded yet, displayThread can pause; once downloadThread has completed its task, it notifies displayThread that "the picture is ready and can be displayed".
A thread pauses by waiting on a condition, and when the condition is met, the thread waiting on it is woken up. In Java this mechanism is implemented with wait/notify, and the waiting mechanism is closely tied to the locking mechanism: wait and notify must be called while holding the lock of the monitor object.
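
A condensed sketch of the picture example above (the class names and shared flag are illustrative):

public class PictureDemo {
    private static final Object lock = new Object();
    private static boolean downloaded = false;

    public static void main(String[] args) {
        Thread displayThread = new Thread(() -> {
            synchronized (lock) {
                while (!downloaded) {
                    try {
                        lock.wait();      // release the lock and wait to be notified
                    } catch (InterruptedException e) {
                        return;
                    }
                }
                System.out.println("picture is ready, displaying it");
            }
        });

        Thread downloadThread = new Thread(() -> {
            System.out.println("downloading picture...");
            synchronized (lock) {
                downloaded = true;
                lock.notify();            // wake the thread waiting on this monitor
            }
        });

        displayThread.start();
        downloadThread.start();
    }
}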

Socket principle

Introduction to the client/server (C/S) model

In the C/S model, the client sends a service request to the server, and the server provides the corresponding service after receiving the request.
For example, in a restaurant a customer asks a waiter to take an order, the waiter passes the order to the chef, and the chef prepares the dishes accordingly; when the dishes are ready, the waiter serves them to the customer. If the restaurant is viewed as a system, the waiter is the client and the chef is the server. This way of dividing a system into cooperating parts is the C/S way of working.
Client part: dedicated to each user, responsible for the front-end functions.
Server part: information and functions shared by many users, responsible for the back-end services.

Socket data access

The client's Socket corresponds to a server-side Socket, and each has an input stream and an output stream.
The client's socket.getInputStream() is connected to the server's socket.getOutputStream().
The client's socket.getOutputStream() is connected to the server's socket.getInputStream().
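
A minimal client-side sketch of these streams (host name and port are placeholders):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

public class EchoClient {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("localhost", 8888)) {   // placeholder host and port
            PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
            BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
            out.println("hello server");         // arrives at the server's getInputStream()
            System.out.println(in.readLine());   // reads what the server wrote to its getOutputStream()
        }
    }
}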

Multi-threaded server

Server-side multithreading framework

The server accepts client connections in an infinite loop.
Each accepted connection produces a new Socket instance.
An independent thread is created for each client connection to handle that client's requests.
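
A condensed sketch of this framework, matching the client above (port 8888 is a placeholder):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class EchoServer {
    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(8888);
        while (true) {                           // accept clients in an infinite loop
            Socket client = server.accept();     // one new Socket per connection
            new Thread(() -> {                   // one independent thread per client
                try {
                    BufferedReader in = new BufferedReader(
                            new InputStreamReader(client.getInputStream()));
                    PrintWriter out = new PrintWriter(client.getOutputStream(), true);
                    out.println("echo: " + in.readLine());
                    client.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }).start();
        }
    }
}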

Thread Pool

In the TCP server programming model above, each client connection is served by a separate thread; when the session with the client ends, the thread ends as well. In other words, for every client connection the server has to create a new thread.
If many clients access the server, the server keeps creating and destroying threads, which seriously degrades server performance.

Principle of thread pool technology

The idea of a thread pool: create a number of threads in advance; their collection is called the thread pool. When the server receives a client request, it takes an idle thread from the pool to serve it, and when the service is finished the thread is not closed but returned to the pool.
In the thread-pool programming model, a task is submitted to the pool as a whole rather than directly to a particular thread; the pool then hands the task to one of its internal idle threads.
A thread can execute only one task at a time, but several tasks can be submitted to the pool at the same time.

Introduction to ExecutorService

ExecutorService provides methods for managing termination, as well as methods that can track the progress of one or more asynchronous tasks by producing a Future.
An ExecutorService can be shut down, which causes it to reject new tasks; two methods are provided for shutting it down (shutdown() and shutdownNow()).
Its direct implementation class is ThreadPoolExecutor, and its indirect implementation class is ScheduledThreadPoolExecutor.

Using ExecutorService to implement a thread pool

ExecutorService is the interface Java provides for managing thread pools.
A thread pool has two main jobs: controlling the number of threads and reusing threads.
If a program creates a large number of threads and destroys each one as soon as its task ends, it consumes excessive system resources, and excessive thread switching may even crash the system, so the thread-pool classes should be used instead.

The thread pool has the following implementation strategies:

Executors.newCachedThreadPool(): creates a thread pool that creates new threads as needed but reuses previously constructed threads when they are available.
Executors.newFixedThreadPool(int nThreads): creates a thread pool that reuses a fixed number of threads operating off a shared unbounded queue to run them.
Executors.newScheduledThreadPool(int corePoolSize): creates a thread pool that can schedule commands to run after a given delay or periodically.
Executors.newSingleThreadExecutor(): creates an Executor that uses a single worker thread operating off an unbounded queue.
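
A small sketch of submitting tasks to a fixed-size pool (pool size and task count are illustrative):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class PoolDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(2);   // at most 2 worker threads
        for (int i = 1; i <= 5; i++) {
            final int taskId = i;
            // the task is submitted to the pool as a whole, not to a particular thread
            pool.execute(() ->
                    System.out.println(Thread.currentThread().getName() + " runs task " + taskId));
        }
        pool.shutdown();   // reject new tasks; already-submitted tasks still finish
    }
}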

Buffer queue

A common practice in server development is to separate logic-processing threads from I/O-processing threads.
Logic-processing threads: perform the logical processing of received packets.
I/O-processing threads: send and receive network data, and establish and maintain connections.
The logic threads and the I/O threads usually exchange data through a data queue; this is the producer-consumer model.
Because this data queue is shared by multiple threads, every access must be locked, so reducing the cost of this mutual exclusion/synchronization becomes important.

Principle of buffer queue technology

Double buffering uses two queues: one is being written to while the other is being read from. After the logic thread has finished reading its data, it swaps its own queue with the I/O thread's queue. Locking is needed in only two places: when data is written into a queue and when the two queues are swapped. With a single buffer, reads and writes are not separated, so double buffering at least saves the mutual-exclusion/synchronization cost of the read side of a single buffer.
The two buffers are guarded by two mutexes, locka and lockb; to use a buffer, the producer or consumer must first acquire the corresponding lock.

Buffer queue thread safety issues

Most of the time the producer controls one queue for writing while the consumer controls the other queue for reading, so the logic thread and the I/O thread each operate on their own queue. This greatly reduces the mutual-exclusion/synchronization overhead.
When the consumer has finished reading its own queue (guarded by locka), it immediately releases locka and waits for lockb. As soon as the producer releases lockb, the consumer acquires it and starts reading the queue guarded by lockb, while the producer acquires locka and starts writing into the other queue. This completes the exchange of the queues.

BlockingQueue programming technology

ArrayBlockingQueue: a BlockingQueue of fixed size; its constructor must take an int parameter indicating its capacity, and the objects it holds are ordered FIFO (first in, first out).
LinkedBlockingQueue: a BlockingQueue of variable size; if its constructor is given a size parameter, the resulting BlockingQueue has a capacity limit, and without one its capacity is Integer.MAX_VALUE. The objects it holds are ordered FIFO (first in, first out).
PriorityBlockingQueue: similar to LinkedBlockingQueue, but the objects it holds are not ordered FIFO; they are ordered by their natural ordering or by the Comparator supplied to the constructor.

Using BlockingQueue

BlockingQueue can serve as the double-buffered queue described above.
Under concurrent multi-threading a plain Queue could be used, but synchronization then has to be handled by hand, and that synchronization lowers the efficiency of concurrent queue operations. A BlockingQueue implementation such as LinkedBlockingQueue uses separate internal locks for the two ends of the queue, so one thread can be putting elements in while another takes elements out at the same time, improving queue throughput while keeping concurrent access safe.
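
A minimal producer-consumer sketch using ArrayBlockingQueue (capacity and messages are illustrative):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class QueueDemo {
    public static void main(String[] args) {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(5);   // fixed capacity of 5

        Thread producer = new Thread(() -> {
            try {
                for (int i = 1; i <= 3; i++) {
                    queue.put("packet " + i);    // blocks if the queue is full
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 1; i <= 3; i++) {
                    System.out.println("consumed " + queue.take());   // blocks if the queue is empty
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
    }
}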



Origin blog.csdn.net/qq_45138120/article/details/125357004