During the recent "golden March, silver April" peak hiring season, all kinds of interview articles and videos have appeared on the Internet. The following are some of my humble opinions.

C programmers never die. They are just cast into void.

1. What is the difference between a daemon thread and a user (non-daemon) thread in Java?

There are two types of threads in Java: daemon threads and user threads. Any thread can be made a daemon thread or a user thread via Thread.setDaemon(boolean): passing true makes the thread a daemon thread, otherwise it is a user thread. Thread.setDaemon() must be called before Thread.start(), otherwise an exception is thrown at runtime.

The difference between the two: the only real difference is in determining when the virtual machine (JVM) exits. Daemon threads provide services to other threads; once all user threads have exited, the daemons have nothing left to serve, so the JVM exits as well. Another way to look at it: a daemon thread is typically (though not necessarily) a thread created automatically by the JVM, while a user thread is a thread created by the program. For example, the JVM's garbage collection thread is a daemon thread: when all user threads have exited, no more garbage is generated and the GC thread has nothing to do, so when the garbage collection thread is the only thread left, the JVM exits automatically.
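Here is a minimal sketch (the class and thread names are my own, purely for illustration) of that behaviour: the daemon loops forever, yet the JVM exits as soon as main, the last user thread, finishes.

```java
public class DaemonDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread daemon = new Thread(() -> {
            while (true) {
                System.out.println("daemon serving...");
                try {
                    Thread.sleep(500);
                } catch (InterruptedException e) {
                    return;
                }
            }
        });
        daemon.setDaemon(true); // must be set before start(), or IllegalThreadStateException is thrown
        daemon.start();

        Thread.sleep(1500);
        System.out.println("main (the last user thread) is done; the JVM exits and the daemon dies with it");
    }
}
```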

Extension: in the thread information printed by a thread dump, the threads containing the word "daemon" are daemon threads. Typical examples include the service daemon, the compiler daemon, the Windows Ctrl+Break listener daemon, the Finalizer daemon, the reference-handling daemon, and the GC daemon.

2. What is the difference between a thread and a process?

A process is the smallest unit for the operating system to allocate resources, and a thread is the smallest unit for the operating system to schedule. A program has at least one process, and a process has at least one thread.

3. What is context switching in multithreading?

Multiple threads share the CPUs of a machine. When the number of threads is greater than the number of CPUs allocated to the program, the CPUs must be used in turn so that every thread gets a chance to execute. When the CPU switches from one thread to another, the state of the current thread (registers, program counter, and so on) must be saved and the state of the next thread restored; this saving and restoring of thread state is context switching.

4. What is the difference between deadlock and livelock, and between deadlock and starvation?

Deadlock: a situation in which two or more processes (or threads) wait for each other while competing for resources during execution; without outside intervention, none of them can make progress.
The necessary conditions for a deadlock:
1. Mutual exclusion: a resource is held exclusively by one process at a time.
2. Hold and wait: a process that is blocked while requesting a resource keeps holding the resources it has already acquired.
3. No preemption: resources a process has obtained cannot be forcibly taken away before it is finished with them.
4. Circular wait: the processes form a circular chain in which each one is waiting for a resource held by the next.
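As a concrete illustration of these four conditions, here is a minimal sketch (class and lock names are illustrative) of the classic deadlock: two threads acquire the same two locks in opposite order, so each ends up holding one lock while waiting forever for the other.

```java
public class DeadlockDemo {
    private static final Object lockA = new Object();
    private static final Object lockB = new Object();

    public static void main(String[] args) {
        new Thread(() -> {
            synchronized (lockA) {                 // hold A ...
                sleepQuietly(100);
                synchronized (lockB) {             // ... and wait for B
                    System.out.println("thread-1 got both locks");
                }
            }
        }, "thread-1").start();

        new Thread(() -> {
            synchronized (lockB) {                 // hold B ...
                sleepQuietly(100);
                synchronized (lockA) {             // ... and wait for A -> circular wait
                    System.out.println("thread-2 got both locks");
                }
            }
        }, "thread-2").start();
    }

    private static void sleepQuietly(long ms) {
        try { Thread.sleep(ms); } catch (InterruptedException ignored) { }
    }
}
```

Acquiring the locks in the same fixed order in every thread breaks the circular-wait condition and prevents this deadlock.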

Livelock: the task or executor is not blocked, but because some condition is never satisfied it keeps retrying and failing, over and over. The difference between a livelock and a deadlock is that the entities in a livelock keep changing state (hence "live"), whereas the entities in a deadlock simply wait; a livelock may resolve itself, but a deadlock cannot.
Starvation: one or more threads cannot obtain the resources they need, for various reasons, and therefore never get to execute.

Reasons for starvation in Java:
1. High-priority threads swallow all the CPU time of low-priority threads.
2. A thread is permanently blocked waiting to enter a synchronized block, because other threads always manage to enter the block before it.
3. A thread is waiting on an object (for example, by calling the object's wait method) and never returns, because other threads waiting on the same object are always the ones that get woken up.

5. What thread scheduling algorithm is used in Java?

Time-slice round-robin scheduling is used. A thread's priority can be set and is mapped to the priority of the underlying operating system. Unless there is a special need, avoid relying on priorities, to prevent thread starvation.
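A minimal sketch (illustrative only) of setting a priority; since priorities are only hints mapped onto the OS scheduler, they should not be relied on for correctness.

```java
public class PriorityDemo {
    public static void main(String[] args) {
        // Priorities range from 1 to 10; the default is Thread.NORM_PRIORITY (5).
        // They are hints to the underlying OS scheduler, not guarantees.
        Thread worker = new Thread(() -> System.out.println("low-priority worker running"));
        worker.setPriority(Thread.MIN_PRIORITY);
        worker.start();
    }
}
```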

6. What is a thread group, and why is its use not recommended in Java?

The ThreadGroup class lets you assign threads to a thread group. A thread group can contain both thread objects and other thread groups, so the structure is somewhat like a tree. Why is it not recommended? Because it has many safety pitfalls in practice (the details are not examined here). If you need this kind of grouping, it is recommended to use a thread pool instead.
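For illustration only, a minimal sketch (names are my own) of putting threads into a ThreadGroup and inspecting the group; in real code a thread pool is the recommended alternative.

```java
public class ThreadGroupDemo {
    public static void main(String[] args) {
        ThreadGroup group = new ThreadGroup("workers");
        // Groups can nest, forming a tree under the "main" and "system" groups.
        ThreadGroup subGroup = new ThreadGroup(group, "io-workers");

        new Thread(group, () -> sleep(1000), "worker-1").start();
        new Thread(subGroup, () -> sleep(1000), "worker-2").start();

        System.out.println("active threads in group: " + group.activeCount());
        group.list(); // prints the group/thread tree to standard output
    }

    private static void sleep(long ms) {
        try { Thread.sleep(ms); } catch (InterruptedException ignored) { }
    }
}
```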

7. Why use the Executor framework?

Creating a new Thread() for every task is expensive: thread creation takes time and resources. Threads created directly by calling new Thread() are unmanaged ("wild threads") and can be created without limit; the competition between too many threads over-consumes system resources and can bring the system down, and frequent switching between threads also wastes a lot of resources. Threads started with new Thread() are also hard to extend: features such as scheduled execution, periodic execution, and thread interruption are all inconvenient to implement.

8. What is the difference between Executor and Executors in Java?

The various methods of the Executors utility class create different kinds of thread pools according to our needs.
An Executor interface object can execute our thread tasks. The ExecutorService interface extends the Executor interface with more methods, letting us query the status of task execution and obtain a task's return value. ThreadPoolExecutor can be used to create a custom thread pool. Future represents the result of an asynchronous computation: it provides methods to check whether the computation is finished, to wait for it to finish, and to retrieve the result with get().
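A minimal sketch (the class name and numbers are my own) tying these pieces together: Executors builds the pool, the ExecutorService runs the task, and the Future carries the result.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ExecutorDemo {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        // Executors is the factory/utility class that builds the pool ...
        ExecutorService pool = Executors.newFixedThreadPool(4);

        // ... the ExecutorService (an Executor) runs the task asynchronously ...
        Callable<Integer> task = () -> 21 + 21;
        Future<Integer> future = pool.submit(task);

        // ... and the Future lets us check on and wait for the result.
        System.out.println("done? " + future.isDone());
        System.out.println("result = " + future.get()); // blocks until the task completes

        pool.shutdown();
    }
}
```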

9. What is an atomic operation? What are the atomic classes in the Java Concurrency API?

An atomic operation is "an operation, or a sequence of operations, that cannot be interrupted". At the hardware level, processors implement atomic operations across multiple processors by locking the cache or the bus. In Java, atomic operations can be implemented with locks or with a CAS loop. CAS stands for Compare-And-Set, or Compare-And-Swap; nearly all modern CPU instruction sets support an atomic CAS operation.

An atomic operation is a unit of work that is not affected by other operations, and atomicity is a necessary means of avoiding data inconsistency in a multithreaded environment. int++ is not an atomic operation, so while one thread reads the value and adds 1, another thread may still read the old value, which leads to errors. To solve this, the increment must be made atomic. Before JDK 1.5 this could only be done with synchronization; since JDK 1.5 the java.util.concurrent.atomic package provides atomic wrapper classes for int and long (among others) that guarantee their operations are atomic without explicit synchronization. The basic property of these classes is that when multiple threads execute methods of the same instance at the same time, execution is exclusive: once a thread enters a method and starts executing its instructions, it is not interrupted by other threads; the other threads effectively spin-wait until the method finishes, and only then does the JVM let another thread from the waiting queue proceed. (This is only a logical model of the behaviour.)

Atomic class: AtomicBoolean, AtomicInteger, AtomicLong, AtomicReference
Atomic array: AtomicIntegerArray, AtomicLongArray, AtomicReferenceArray
Atomic field updaters: AtomicLongFieldUpdater, AtomicIntegerFieldUpdater, AtomicReferenceFieldUpdater
Atomic classes that solve the ABA problem by introducing an extra variable: AtomicMarkableReference, AtomicStampedReference (which introduces an int stamp that is incremented, so that intermediate changes can be detected)
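A minimal sketch (the class name and counts are illustrative) contrasting a plain int++, which can lose updates, with AtomicInteger.incrementAndGet(), which is implemented with CAS.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicInteger;

public class AtomicDemo {
    static int plainCounter = 0;                             // int++ is read-modify-write, not atomic
    static final AtomicInteger atomicCounter = new AtomicInteger();

    public static void main(String[] args) throws InterruptedException {
        int threads = 8, increments = 10_000;
        CountDownLatch done = new CountDownLatch(threads);

        for (int i = 0; i < threads; i++) {
            new Thread(() -> {
                for (int j = 0; j < increments; j++) {
                    plainCounter++;                          // updates may be lost under contention
                    atomicCounter.incrementAndGet();         // CAS loop, never loses an update
                }
                done.countDown();
            }).start();
        }
        done.await();

        System.out.println("plain  = " + plainCounter);        // usually less than 80000
        System.out.println("atomic = " + atomicCounter.get()); // always 80000
    }
}
```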

10. What is the Lock interface in the Java Concurrency API? What are its advantages over synchronized?

The Lock interface provides more extensive locking operations than synchronized methods and synchronized blocks. It allows more flexible structuring, can have quite different properties, and can support multiple associated Condition objects.
Its advantages are:
it can make lock acquisition fair;
it lets a thread respond to interruption while waiting for the lock;
it lets a thread attempt to acquire the lock and either return immediately if it cannot, or wait for a bounded amount of time;
it allows locks to be acquired and released in different scopes and in different orders.
Overall, Lock is an extended version of synchronized. Lock provides unconditional, polled (the tryLock method), timed (tryLock with a timeout), interruptible (lockInterruptibly), and multi-condition-queue (the newCondition method) lock operations. In addition, the implementation classes of Lock generally support both unfair locks (the default) and fair locks, whereas synchronized only supports unfair locks. Of course, in most cases an unfair lock is the more efficient choice.
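A minimal sketch (class and method names are my own) of the extra operations listed above: a fair ReentrantLock, a timed tryLock, an interruptible lock acquisition, and a Condition.

```java
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantLock;

public class LockDemo {
    private final Lock lock = new ReentrantLock(true);      // true = fair lock; the default is unfair
    private final Condition notEmpty = lock.newCondition(); // one of possibly many condition queues
    private int items = 0;

    public boolean tryProduce() throws InterruptedException {
        // Timed, interruptible attempt: give up after 1 second instead of blocking forever.
        if (!lock.tryLock(1, TimeUnit.SECONDS)) {
            return false;
        }
        try {
            items++;
            notEmpty.signal();    // wake one thread waiting on this condition
            return true;
        } finally {
            lock.unlock();        // unlike synchronized, unlocking must be explicit
        }
    }

    public void consume() throws InterruptedException {
        lock.lockInterruptibly(); // responds to interruption while waiting for the lock
        try {
            while (items == 0) {
                notEmpty.await(); // releases the lock while waiting
            }
            items--;
        } finally {
            lock.unlock();
        }
    }
}
```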

11. What is the Executor framework?

The Executor framework is a framework for invoking, scheduling, executing, and controlling asynchronous tasks according to a set of execution policies. Creating threads without limit can cause the application to run out of memory, so a thread pool is a better solution: the number of threads is bounded and the threads can be recycled and reused. The Executors class makes it very convenient to create such thread pools.

12. What is a blocking queue? What is the implementation principle of a blocking queue? How can blocking queues be used to implement the producer-consumer model?

A blocking queue (BlockingQueue) is a queue that supports two additional operations: when the queue is empty, a thread trying to take an element waits until the queue becomes non-empty; when the queue is full, a thread trying to store an element waits until space becomes available. Blocking queues are often used in producer-consumer scenarios: producers are the threads that add elements to the queue, and consumers are the threads that take elements from it. The blocking queue is the container in which the producers store elements and from which the consumers take them.

JDK7 provides 7 blocking queues. They are:
ArrayBlockingQueue: A bounded blocking queue composed of an array structure.
LinkedBlockingQueue: A bounded blocking queue backed by a linked list (the default capacity is Integer.MAX_VALUE).
PriorityBlockingQueue: An unbounded blocking queue that supports priority sorting.
DelayQueue: An unbounded blocking queue implemented using priority queues.
SynchronousQueue: A blocking queue that does not store elements.
LinkedTransferQueue: An unbounded blocking queue composed of a linked list structure.
LinkedBlockingDeque: A two-way blocking queue composed of a linked list structure.
Before Java 5, synchronized access had to be built from an ordinary collection plus thread collaboration and synchronization to implement the producer-consumer model; the key was to use wait, notify, notifyAll, and synchronized correctly. Since Java 5, a blocking queue can be used instead, which greatly reduces the amount of code, makes multithreaded programming easier, and guarantees safety. The BlockingQueue interface is a sub-interface of Queue, but its main purpose is not to serve as a container: it is a tool for thread synchronization. Its distinctive feature is that when a producer thread tries to put an element into a BlockingQueue that is already full, the thread is blocked, and when a consumer thread tries to take an element from an empty queue, it is blocked as well. Precisely because of this feature, multiple threads in a program can alternately put elements into and take elements out of the BlockingQueue, and the queue controls the communication between them. The most classic scenario for a blocking queue is reading and parsing data from a socket client: the reading thread keeps putting data into the queue, and the parsing thread keeps taking data from the queue to parse it.
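A minimal producer-consumer sketch (the class name and sizes are illustrative) using ArrayBlockingQueue: put() blocks while the queue is full and take() blocks while it is empty, so no explicit wait/notify is needed.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ProducerConsumerDemo {
    public static void main(String[] args) {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(10); // bounded, capacity 10

        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 100; i++) {
                    queue.put(i);                 // blocks while the queue is full
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 100; i++) {
                    Integer value = queue.take(); // blocks while the queue is empty
                    System.out.println("consumed " + value);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
    }
}
```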

These are the interview questions and answers I have compiled.

This article is shared with friends who need to brush up on interview questions; I wish everyone gets the offer they want. The material mainly covers Java basics, data structures, the JVM, multithreading, and so on. Due to limited space, only a small part is shown here; friends who need the full version can get it via the link: CSDN


Origin blog.csdn.net/dabaoad1122/article/details/114991148