Java Concurrency: Background

In an operating system, concurrency means that within a period of time several programs have been started and are somewhere between starting and finishing, all running on the same processor; at any single point in time, however, only one program is actually executing on that processor.

The life cycle of a process is closely tied to the operating system. The appearance of processes made it possible to save a program's running state, which in turn made switching between processes feasible and allowed the operating system to run programs concurrently, greatly improving resource utilization. Although processes solved the operating system's concurrency problem, people demanded better responsiveness. Since a process often consists of several sub-tasks, threads were invented so that each thread could be responsible for one sub-task, improving the responsiveness of the program. Note that although a process may contain multiple threads, those threads share the process's resources and address space. Multithreading therefore improves resource utilization and responsiveness, but it also introduces safety, liveness, and performance problems. In short, processes make concurrency possible at the operating-system level, and threads make concurrency possible inside a process.

I. The origin of processes and threads

 (1) Why do processes exist in the operating system?

  To understand the origin of processes, we need to look at the history of operating systems.
  It may be hard to imagine today what computers were like many years ago. We now use computers for office work, entertainment, and the Internet, but when computers were new they existed mainly to solve mathematical problems, because large computations were extremely time-consuming and labor-intensive to do by hand. In the beginning a computer could only accept individual commands: the user typed a command, and the computer performed one operation. While the user was thinking or entering data, the computer simply waited. This was obviously inefficient, because much of the time the computer sat idle, waiting for user input.

  So, could the series of instructions to be executed be written down in advance as a list and handed to the computer all at once, so that the computer could keep reading instructions and carrying them out one after another? This is how the batch operating system was born. The programs of multiple users could be written onto a tape; the computer would read and execute these programs one by one and write the results onto another tape.

  Although the batch operating system greatly improved the convenience of handling tasks, a big problem remained:

  Suppose there are two tasks, A and B. Halfway through its execution, task A needs to read a large amount of input data (an I/O operation). During this time the CPU can only wait quietly for task A's read to finish, wasting CPU resources. So people asked: while task A is reading its data, could task B be allowed to execute, and when task A's read completes, could task B be paused so that task A can continue?

  But this raises new problems. Originally, whenever the computer ran a program, only that one program's data was in memory. If task B is to run while task A performs its I/O, multiple programs must be loaded into memory at the same time. How should that be handled? How can the data belonging to different programs be kept apart? And after a running program is paused, how can it be restored to the state it was in before it stopped?

  At this point people invented the process. A process corresponds to a program; each process is given its own memory address space and may use only that space, so processes are isolated from one another. In addition, a process saves the running state of its program at every moment, which is what makes switching between processes possible: when a process is suspended, its current state (the resources it is using, its process ID, and so on) is saved, and the next time it is switched back in, that saved state is restored and execution continues.

  This is concurrency: from a macroscopic point of view, the operating system appears to be running multiple tasks at the same time. In other words, processes make operating-system-level concurrency possible. Note that although multiple tasks are executing concurrently from the macroscopic view, at any specific moment only one task occupies the CPU (on a single-core CPU, of course).

(2) Why do threads exist?

  After processes appeared, operating-system performance improved greatly. But although processes solved the concurrency problem, people were still not satisfied; they gradually began to demand responsiveness. A process can only do one thing during any given period of time: if a process has several sub-tasks, it can only execute them one after another. Consider a monitoring system, for example. It must display image data on the screen, communicate with a server to obtain that image data, and also handle user interaction. If at some moment the system is communicating with the server to fetch image data and the user clicks a button, the system will not handle the user's action until the transfer completes; if fetching the image data takes 10 seconds, the user can only wait. Clearly such a system is unacceptable.

  So, can these sub-tasks be executed separately? That is, while the system is fetching image data, if the user clicks a button, the fetch is paused and the system responds to the user action first (handling a user action usually takes very little time), and then resumes fetching the image data. To make this possible, people invented the thread: a thread handles one sub-task, so a process can contain several threads, each responsible for an independent sub-task. When the user clicks a button, the thread that is fetching image data can be paused so that the UI thread can respond to the user action; once the response is complete, the image-fetching thread gets the CPU again. In this way the user feels that the system is doing several things at once, which satisfies the demand for responsiveness.
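  The idea can be shown in a minimal Java sketch. The method names fetchImageData() and handleUserClick() below are hypothetical stand-ins for the monitoring system's sub-tasks; the point is only that the slow sub-task runs on its own thread while the main thread stays free to react to the user:

public class MonitorSketch {

    public static void main(String[] args) throws InterruptedException {
        // A worker thread handles the slow sub-task (fetching image data).
        Thread fetcher = new Thread(MonitorSketch::fetchImageData, "image-fetcher");
        fetcher.start();

        // Meanwhile the main thread can respond to the user immediately.
        handleUserClick();

        fetcher.join();   // wait for the worker before exiting
    }

    private static void fetchImageData() {
        try {
            Thread.sleep(2000);   // simulate a slow network transfer
            System.out.println("image data received");
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    private static void handleUserClick() {
        System.out.println("button click handled without waiting for the image");
    }
}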

  In other words, processes make concurrency possible for the operating system, and threads make concurrency possible inside a process. Note again that although a process may contain multiple threads, those threads share the process's resources and address space. The process is the operating system's basic unit of resource allocation, while the thread is its basic unit of scheduling.

 (3) Multithreading

  Because multiple threads share the resources and address space of the process they belong to, a question arises: what happens if several threads access the same resource at the same time? This is the thread-safety problem of concurrency.

  In addition, some readers may ask: since multithreaded programming is used so widely today, is multithreading necessarily faster than a single thread? Not necessarily; it depends on the specific tasks and the machine. On a single-core CPU, a CPU-bound task such as decompressing a file will not run faster with multiple threads than with one, because decompression needs the CPU the whole time and the overhead of switching between threads actually degrades performance. For interactive tasks, however, multiple threads are certainly needed. On a multi-core CPU, decompressing a file with multiple threads is certainly faster than with a single thread, because multiple threads can take fuller advantage of each core's resources.
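  Here is a minimal sketch of the multi-core case, assuming a CPU-bound job that can be split into independent chunks (the heavyComputation() method is a dummy stand-in for real work such as decompression, not an actual decompressor):

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelCpuBoundSketch {
    public static void main(String[] args) throws Exception {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        // Split the CPU-bound job into one chunk per core.
        List<Future<Long>> results = new ArrayList<>();
        for (int i = 0; i < cores; i++) {
            final int chunk = i;
            results.add(pool.submit(() -> heavyComputation(chunk)));
        }

        long total = 0;
        for (Future<Long> f : results) {
            total += f.get();          // wait for each chunk to finish
        }
        System.out.println("total = " + total);
        pool.shutdown();
    }

    // Dummy CPU-bound work standing in for something like decompression.
    private static long heavyComputation(int chunk) {
        long sum = 0;
        for (long i = 0; i < 50_000_000L; i++) {
            sum += (i ^ chunk);
        }
        return sum;
    }
}

  On a single core the same code would gain nothing, since the chunks would simply take turns on the one CPU while paying for thread switches.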

  Although multithreading can improve application performance, multithreaded programs are much more complex to write than single-threaded ones because thread safety must be considered. In practice, the choice should therefore be made according to the specific situation.

II. A brief history of concurrency

Early computers had no operating system: they executed a single program from start to finish, and that program could access all of the computer's resources. For machines that were expensive and scarce, this was a waste of resources;

The appearance of operating systems allowed a computer to run multiple programs at the same time. Different programs run in separate processes, and the operating system allocates resources to each process (for example, sharing resources such as the CPU among programs through coarse-grained time slices). This undoubtedly improved the utilization of computer resources;

In early time-sharing systems, each process executed sequentially. The advantage of the serial programming model is its simplicity and intuitiveness: the program does one thing at a time and starts the next only when the previous one is finished. But the serial model still leaves computer resources underutilized;

The same factors that drove the appearance of processes also drove the appearance of threads. Threads allow multiple flows of control to exist simultaneously within the same process. Threads share the process's resources, but each thread has its own program counter, stack, local variables, and so on;

Threads are also known as lightweight processes. Most modern operating systems use the thread, not the process, as the basic unit of scheduling. Without explicit coordination, threads execute independently of one another. Because all threads in a process share the process's memory address space, they can access the same variables, which provides a finer-grained way of sharing data than sharing data between processes. But without an explicit mechanism to synchronize access to that shared data, the results are unpredictable.
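A minimal sketch of that last point: two threads incrementing the same shared counter without any synchronization will usually lose updates, so the final value is typically less than the expected 200000. This is exactly the kind of unpredictable result described above.

public class SharedCounterSketch {
    private static int counter = 0;   // shared by both threads, no synchronization

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter++;            // not atomic: read, add 1, write back
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        // Expected 200000, but interleaved updates are frequently lost.
        System.out.println("counter = " + counter);
    }
}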

III. Advantages of threads

  • Decoupling: simplifying program development

  If we assign a specific type of task to each type of thread within a process, we can create the illusion of serial execution, separating the execution logic from the details of scheduling, from the interleaving of operations, from asynchronous I/O, and from waiting on resources. With threads, a complicated asynchronous workflow can be decomposed into a set of simple, synchronous workflows, each running in its own thread and interacting with the others only at specific synchronization points.
  
  The Servlet framework is a good example. The framework takes care of the details: request management, thread creation, load balancing, and dispatching each request at the right time to the right application component (a particular Servlet). A Servlet author does not need to know how many other requests are being processed at the same time, nor whether the socket input or output streams are blocking. When a Servlet's service method is called to respond to a web request, it can process the request synchronously, as if it were a single-threaded program. This simplifies component development and greatly flattens the framework's learning curve.
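  A minimal sketch of what that looks like from the Servlet author's point of view, assuming the classic javax.servlet API (newer containers use the jakarta.servlet package instead). The container calls doGet concurrently on many threads, yet the method body reads as straight-line, single-threaded code:

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// The container (e.g. Tomcat) manages the threads; each request simply
// arrives here and is handled as if the program were single-threaded.
public class HelloServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        resp.setContentType("text/plain");
        resp.getWriter().println("Handled on thread " + Thread.currentThread().getName());
    }
}

  Note that because many container threads may call doGet on the same servlet instance at once, any mutable instance fields added to such a class would themselves need to be thread-safe.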

  Multithreading also helps keep user interfaces responsive. In Android development, for example, network requests, I/O, and other time-consuming operations are usually moved to a separate thread so that the UI remains responsive.

  • Improving resource utilization

  The appearance of multiprocessor systems means that multiple threads of the same program can be scheduled simultaneously on multiple CPUs. A multithreaded program can therefore raise system throughput by making better use of processor resources. In fact, a multithreaded program can also achieve higher throughput on a single-processor system: while one thread is blocked waiting for an I/O operation to complete, another thread can keep running.

IV. Risks brought by threads

  • Safety issues

  At the heart of the definition of thread safety is the notion of correctness. A class is thread-safe if it behaves correctly when accessed from multiple threads, regardless of how those threads are scheduled or interleaved by the runtime environment, and with no additional synchronization or coordination required on the part of the calling code.

Example of a class that is not thread-safe:

import net.jcip.annotations.NotThreadSafe;   // JCIP annotation; documents intent only

@NotThreadSafe
public class UnsafeSequence {
    private int value;

    /** Returns a unique value. */
    public int getNext() {
        return value++;   // not atomic: read value, add 1, write it back
    }
}

Although the increment operation value++ looks like a single operation, it actually consists of three separate operations: read the value, add 1, and write the result back. Because the order in which threads execute is not deterministic, this code may return the same value to different calling threads.
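One straightforward way to make the class thread-safe (not the only one) is to make the increment atomic, for example with java.util.concurrent.atomic.AtomicInteger:

import java.util.concurrent.atomic.AtomicInteger;

public class SafeSequence {
    private final AtomicInteger value = new AtomicInteger();

    /** Returns a unique value; getAndIncrement() is an atomic read-modify-write. */
    public int getNext() {
        return value.getAndIncrement();
    }
}

Declaring getNext() as synchronized would also work; the atomic variable simply achieves the same effect without acquiring a lock.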

  • Liveness issues

  Liveness is concerned with whether the right thing eventually happens. Problems that threaten liveness include deadlock, starvation, and so on.
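  A minimal sketch of the classic deadlock: two threads acquire the same two locks in opposite orders, so each ends up waiting forever for the lock the other holds, and the program usually hangs.

public class DeadlockSketch {
    private static final Object LOCK_A = new Object();
    private static final Object LOCK_B = new Object();

    public static void main(String[] args) {
        new Thread(() -> {
            synchronized (LOCK_A) {
                sleep(100);                 // give the other thread time to grab LOCK_B
                synchronized (LOCK_B) {     // waits forever: LOCK_B is held by thread 2
                    System.out.println("thread 1 got both locks");
                }
            }
        }).start();

        new Thread(() -> {
            synchronized (LOCK_B) {
                sleep(100);
                synchronized (LOCK_A) {     // waits forever: LOCK_A is held by thread 1
                    System.out.println("thread 2 got both locks");
                }
            }
        }).start();
    }

    private static void sleep(long millis) {
        try {
            Thread.sleep(millis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}

  Acquiring the two locks in the same order in both threads removes the deadlock.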

  • Performance issues

  Performance is concerned with whether the right thing happens as quickly as possible. Performance problems have many aspects, such as poor responsiveness, low throughput, and excessive resource consumption. In a multithreaded program, whenever the scheduler temporarily suspends the active thread and switches to another one, a context switch occurs; if context switches happen frequently, the CPU spends more time scheduling threads than actually running them.

V. Threads are everywhere

  In Java, an application corresponds to a JVM instance (a JVM process). Java uses a single-threaded programming model: if our program does not actively create any threads, it runs in a single thread, usually called the main thread. Note, however, that "only one thread runs our task" does not mean "only one thread exists in the JVM": when a JVM instance is created, it also creates many other threads (garbage-collection threads, for example). Because the programming model is single-threaded, time-consuming operations in a UI program should be placed on a child thread to avoid blocking the main thread (in a UI program, the main thread is the UI thread that handles user-interaction events).

public class Test {
    public static void main(String[] args) {
        // Get the name of the thread that is currently running this code.
        String curThreadName = Thread.currentThread().getName();
        System.out.println(curThreadName);
    }
}
/* Output:
        main
 */
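To see that the JVM really starts more threads than just main, a small sketch like the following dumps every live thread (the exact set of names varies by JVM version and vendor):

public class ListJvmThreads {
    public static void main(String[] args) {
        // Thread.getAllStackTraces() returns every live thread the JVM knows about.
        for (Thread t : Thread.getAllStackTraces().keySet()) {
            System.out.println(t.getName() + " (daemon=" + t.isDaemon() + ")");
        }
        // Typical output includes main plus JVM housekeeping threads such as
        // "Reference Handler", "Finalizer" and "Signal Dispatcher".
    }
}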

To sum up:

  • A process is an encapsulation of a running program; it saves the program's running state and makes operating-system concurrency possible;

  • A thread handles a sub-task of a process; threads give the program its responsiveness;

  • The process is the operating system's unit of resource allocation; the thread is the basic unit of CPU scheduling;

  • Processes make concurrency possible for the operating system, and threads make concurrency possible inside a process.

 


Source: www.cnblogs.com/Young111/p/11429989.html