JVM Memory and JVM Tuning (Part 4) - How to Identify Garbage

The reference counting approach decides whether an object is garbage by keeping a count of the references to it: the count goes up when a reference is created and down when a reference is removed, and the collector reclaims any object whose count reaches zero. However, this approach cannot handle circular references. Practical garbage-identification algorithms therefore work differently: they start from a set of root nodes and traverse the whole graph of object references to find the objects that are still alive. With this approach, where should garbage collection start? In other words, where do we begin in order to find out which objects the system is currently using? Recall the difference between the heap and the stack discussed earlier: the stack is where program execution actually takes place, so to discover which objects are in use we have to start from the Java stack. Each stack belongs to one thread, so if there are multiple threads, the stacks of all of those threads must be examined.
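For example (a minimal sketch with invented class and field names, not code from the original post): two objects that reference each other but are no longer reachable from any stack or static variable never reach a reference count of zero, yet a reachability-based collector can still reclaim them.

    // Minimal illustration of a circular reference.
    public class CircularReferenceDemo {

        static class Node {
            Node other;                              // reference to the partner object
            byte[] payload = new byte[1024 * 1024];  // make the objects noticeable in the heap
        }

        public static void main(String[] args) {
            Node a = new Node();
            Node b = new Node();
            a.other = b;    // a -> b
            b.other = a;    // b -> a : a cycle, so the counts never drop to zero

            a = null;       // drop the only stack references;
            b = null;       // the cycle is now unreachable from any root

            // A reachability-based collector may reclaim both objects from here on.
            // System.gc() is only a hint to the JVM, not a guarantee.
            System.gc();
        }
    }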

In addition to the stacks, the registers used while the system is running also hold data that the program is operating on. Taking the references in the stacks and registers as starting points, we can locate objects in the heap, and from those objects follow references to other heap objects, expanding the set of references step by step until a null reference or a primitive value is reached. In this way, each reference on a Java stack becomes the root of an object tree; if there are multiple stack references, the result is a forest of such trees. The objects in these trees are needed by the currently running system and must not be garbage collected. All remaining objects are reachable through no reference at all and can therefore be treated as garbage and collected.
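As a rough illustration of the kinds of references that act as roots (names invented for this sketch): a static variable and a local variable on the stack of a running method both keep objects alive, while an object reachable from neither becomes garbage.

    // Rough sketch of which references act as roots during collection.
    public class RootsDemo {

        // A static variable is part of the root set for as long as the class is loaded.
        static Object staticRoot = new Object();

        public static void main(String[] args) {
            // A local variable on the main thread's stack is also a root
            // while this frame is still on the stack.
            Object stackRoot = new Object();

            // This object is only referenced by 'temp'; once the reference is
            // overwritten, nothing in the root set reaches it any more.
            Object temp = new Object();
            temp = null;    // the object just created is now eligible for collection

            System.out.println(staticRoot + " and " + stackRoot + " are still reachable");
        }
    }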

Therefore, garbage collection starts from a set of root objects (the Java stacks, static variables, registers, and so on). The simplest case of a Java stack is the one on which the main method of a Java program executes. This style of collection is the "mark-sweep" mode mentioned above.
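The toy sketch below (invented classes, not JVM internals) imitates the mark step of mark-sweep: starting from the roots, it walks the reference graph and records every object it can reach; anything left unmarked would be swept as garbage.

    import java.util.ArrayList;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    // Toy model of the mark phase: objects and references are simulated here;
    // this is not how HotSpot is actually implemented.
    public class MarkPhaseSketch {

        static class Obj {
            final String name;
            final List<Obj> refs = new ArrayList<>();
            Obj(String name) { this.name = name; }
        }

        // Depth-first traversal from a root, collecting every reachable object.
        static void mark(Obj obj, Set<Obj> marked) {
            if (obj == null || !marked.add(obj)) {
                return;                      // null reference or already marked: stop
            }
            for (Obj ref : obj.refs) {
                mark(ref, marked);
            }
        }

        public static void main(String[] args) {
            Obj a = new Obj("a"), b = new Obj("b"), c = new Obj("c"), d = new Obj("d");
            a.refs.add(b);                   // a -> b
            b.refs.add(c);                   // b -> c
            d.refs.add(c);                   // d -> c, but nothing references d itself

            List<Obj> roots = List.of(a);    // pretend 'a' is referenced from a Java stack
            Set<Obj> marked = new HashSet<>();
            for (Obj root : roots) {
                mark(root, marked);
            }

            for (Obj o : List.of(a, b, c, d)) {
                System.out.println(o.name + (marked.contains(o) ? ": live" : ": garbage"));
            }
        }
    }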


How to Deal with Fragmentation

Because Java objects do not all have the same lifetime, the heap becomes fragmented after the program has run for a while if the memory is never consolidated. The most direct consequences of fragmentation are that large contiguous blocks of memory can no longer be allocated and that allocation efficiency drops. Among the garbage collection algorithms mentioned earlier, the "copying" mode and the "mark-compact" mode both solve the fragmentation problem.
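The toy sketch below (invented for this post, not JVM code) shows the idea behind compaction: after some slots have been freed, the surviving entries are slid toward the front of the simulated heap so that the free space becomes one contiguous block again.

    import java.util.Arrays;

    // Toy model of mark-compact's compaction step on a simulated "heap" of slots.
    public class CompactionSketch {

        public static void main(String[] args) {
            // null means a freed slot; non-null means a live object.
            String[] heap = {"A", null, "B", null, null, "C", null, "D"};
            System.out.println("before: " + Arrays.toString(heap));

            // Slide live objects toward the front, keeping their order.
            int next = 0;                       // next free position at the front
            for (int i = 0; i < heap.length; i++) {
                if (heap[i] != null) {
                    heap[next] = heap[i];
                    if (next != i) {
                        heap[i] = null;         // the old slot becomes free
                    }
                    next++;
                }
            }
            System.out.println("after:  " + Arrays.toString(heap));
            // All free slots are now contiguous at the end, so a "large object"
            // spanning several slots can be allocated again.
        }
    }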


How to Handle Object Creation and Object Reclamation Happening at the Same Time

The garbage collection thread reclaims memory, while the threads running the program consume (allocate) memory: one frees memory and the other takes it, so from this point of view the two are in conflict. Therefore, in the traditional garbage collection modes, the entire application is suspended before collection starts (that is, memory allocation is paused), the garbage is collected, and only after collection finishes does the application continue. This is the most direct and most effective way to resolve the conflict.
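As a rough way to see this cost in practice (a sketch built on the standard java.lang.management API; the allocation loop and block sizes are made up), the program below allocates steadily and then prints how much time the collectors have spent, time during which a stop-the-world collector keeps the application threads paused. Running it with -verbose:gc (or -Xlog:gc on JDK 9 and later) also prints each pause as it happens.

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;
    import java.util.ArrayList;
    import java.util.List;

    // Prints how many collections ran and how long they took in total.
    // With a stop-the-world collector, that time is spent with the application paused.
    public class GcTimeDemo {

        public static void main(String[] args) {
            List<byte[]> survivors = new ArrayList<>();
            for (int i = 0; i < 10_000; i++) {
                byte[] block = new byte[64 * 1024];   // mostly short-lived garbage
                if (i % 100 == 0) {
                    survivors.add(block);             // keep a small fraction alive
                }
            }

            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                System.out.println(gc.getName()
                        + ": collections=" + gc.getCollectionCount()
                        + ", total time=" + gc.getCollectionTime() + " ms");
            }
        }
    }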

But this approach has an obvious drawback: as the heap keeps growing, garbage collection time grows with it, and so does the pause time experienced by the application. For applications with strict response-time requirements, say a maximum allowed pause of a few hundred milliseconds, that limit is very likely to be exceeded once the heap exceeds a few gigabytes, and garbage collection then becomes a bottleneck for the whole system. To resolve this conflict, concurrent garbage collection algorithms were introduced: with these algorithms, the garbage collector threads run at the same time as the application threads. This removes the pause problem, but because new objects are created while old ones are being reclaimed, the algorithm becomes much more complex, the system's throughput drops, and the "fragmentation" problem becomes harder to solve.
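As a concrete but hedged illustration: on current HotSpot JVMs, collectors such as G1 do much of their marking concurrently with the application and accept a pause-time goal via standard flags (exact availability and defaults depend on the JDK version). A minimal way to try this on a program like the GcTimeDemo sketch above:

    // Example JVM options for a low-pause, largely concurrent collector (HotSpot).
    // Flag availability depends on the JDK version; G1 is the default since JDK 9.
    //
    //   java -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -Xlog:gc GcTimeDemo
    //
    // -XX:MaxGCPauseMillis is a goal the collector tries to meet, not a hard limit:
    // the concurrent phases run alongside the application threads, trading some
    // throughput and extra bookkeeping for shorter pauses, which is exactly the
    // trade-off described above.
    public class ConcurrentGcOptions {
        public static void main(String[] args) {
            System.out.println("Run the GcTimeDemo sketch with the options shown in the comment above.");
        }
    }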


- Excerpted from: https://www.iteye.com/blog/pengjiaheng-523230
