Why Does a Java Process Use More RAM Than the Heap Size?

The virtual memory used by a Java process is usually much larger than the Java heap. The JVM contains many subsystems: the garbage collector, the class loading system, the JIT compiler, and so on, and each of these subsystems needs a certain amount of RAM to work properly.

When a Java process runs, it is not only the JVM that consumes RAM. Many native libraries (including those referenced by the Java class libraries) may allocate native memory, which the JVM's Native Memory Tracking (NMT) mechanism cannot monitor. The Java application itself may also use off-heap memory through DirectByteBuffers and similar classes.

So which parts of a running Java process consume memory? Below we look at the parts that Native Memory Tracking can show us.

1. The JVM

Java Heap: the most significant part. Java objects are allocated and reclaimed in this region, and its maximum size is determined by -Xmx.
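As a quick sanity check, the heap limit configured by -Xmx can be read at run time through the standard Runtime API. A minimal sketch (the class name `HeapInfo` is mine, not from the article):

```java
public class HeapInfo {
    // Returns the maximum heap size in megabytes (the -Xmx limit as seen by the JVM).
    static long maxHeapMb() {
        return Runtime.getRuntime().maxMemory() / (1024 * 1024);
    }

    public static void main(String[] args) {
        System.out.println("max heap: " + maxHeapMb() + " MB");
        System.out.println("committed heap: "
                + Runtime.getRuntime().totalMemory() / (1024 * 1024) + " MB");
    }
}
```

Note that `maxMemory()` reports only the heap limit, not the total footprint of the process, which is exactly the gap this article is about.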

Garbage Collector: GC algorithms need additional data structures, beyond the heap itself, to manage heap memory. These include the Mark Bitmap (to track object liveness), the Mark Stack, Remembered Sets (to record cross-region references), and so on. Some of them can be tuned directly, e.g. -XX:MarkStackSizeMax; others depend on the heap layout, e.g. the larger the G1 region size (-XX:G1HeapRegionSize), the smaller the Remembered Sets. The extra memory required differs between GC algorithms: -XX:+UseSerialGC and -XX:+UseShenandoahGC need relatively little, while G1 and CMS may need around 10% of the heap size as additional memory.
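Which collector is active (and therefore roughly how much bookkeeping overhead to expect) can be checked through the standard management API. A minimal sketch (the class name `GcInfo` is mine):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.util.ArrayList;
import java.util.List;

public class GcInfo {
    // Returns the names of the active garbage collectors,
    // e.g. "G1 Young Generation" and "G1 Old Generation".
    static List<String> collectorNames() {
        List<String> names = new ArrayList<>();
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            names.add(gc.getName());
        }
        return names;
    }

    public static void main(String[] args) {
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.println(gc.getName() + " (collections so far: "
                    + gc.getCollectionCount() + ")");
        }
    }
}
```
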

Code Cache: stores dynamically generated code: JIT-compiled methods, runtime interceptors, and stubs. The size of this region is determined by -XX:ReservedCodeCacheSize (240M by default). Turning off tiered compilation with -XX:-TieredCompilation reduces the amount of compiled code and therefore the Code Cache usage.
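Actual Code Cache usage can be observed through the non-heap memory pools the JVM exposes. A minimal sketch (pool names vary: older JVMs have a single "Code Cache" pool, newer ones have segmented "CodeHeap '...'" pools; the class name is mine):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class CodeCacheInfo {
    // Sums the used bytes of all code-cache pools:
    // "Code Cache" on older JVMs, "CodeHeap '...'" segments on newer ones.
    static long codeCacheUsedBytes() {
        long used = 0;
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            String name = pool.getName();
            if (name.contains("Code Cache") || name.startsWith("CodeHeap")) {
                used += pool.getUsage().getUsed();
            }
        }
        return used;
    }

    public static void main(String[] args) {
        System.out.println("code cache used: " + codeCacheUsedBytes() / 1024 + " KB");
    }
}
```

Even a trivial program shows non-zero usage here, because the JVM generates interpreter and runtime stubs at startup.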

Compiler: the JIT compiler itself needs a certain amount of memory to do its work. This can be reduced by turning off tiered compilation or by lowering the number of compiler threads (-XX:CICompilerCount).
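The JIT compiler's presence and activity can likewise be inspected via the management API. A small sketch (the class name is mine):

```java
import java.lang.management.CompilationMXBean;
import java.lang.management.ManagementFactory;

public class CompilerInfo {
    // Returns the name of the JIT compiler, e.g. "HotSpot 64-Bit Tiered Compilers".
    static String jitName() {
        return ManagementFactory.getCompilationMXBean().getName();
    }

    public static void main(String[] args) {
        CompilationMXBean jit = ManagementFactory.getCompilationMXBean();
        System.out.println("JIT compiler: " + jit.getName());
        if (jit.isCompilationTimeMonitoringSupported()) {
            System.out.println("total compilation time: "
                    + jit.getTotalCompilationTime() + " ms");
        }
    }
}
```
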

Class loading: class metadata (method bytecode, symbol tables, constant pools, annotations, etc.) is stored in an off-heap region called Metaspace. The more classes a JVM process loads, the more Metaspace it uses. Its size can be limited with -XX:MaxMetaspaceSize (unlimited by default) or -XX:CompressedClassSpaceSize (1G by default).
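Metaspace is also exposed as a non-heap memory pool, so current usage can be read at run time. A minimal sketch (the class name is mine):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class MetaspaceInfo {
    // Returns the bytes currently used by the Metaspace pool, or -1 if not found.
    static long metaspaceUsedBytes() {
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            if ("Metaspace".equals(pool.getName())) {
                return pool.getUsage().getUsed();
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        System.out.println("metaspace used: " + metaspaceUsedBytes() / 1024 + " KB");
    }
}
```
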

Symbol tables: the JVM maintains two important hash tables. The Symbol table holds names, signatures, identifiers, and other language elements of classes, methods, and interfaces; the String table holds references to interned strings. If Native Memory Tracking shows that the String table uses a lot of memory, the Java application is probably abusing String.intern.
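The String table grows with every distinct string passed to String.intern. The sketch below (class name mine) shows what interning does: equal contents always intern to the same canonical instance, which is why uncontrolled interning keeps growing this table:

```java
public class InternDemo {
    // intern() returns the canonical instance from the JVM's String table,
    // so two strings with equal contents intern to the very same object.
    static boolean sameCanonicalInstance(String a, String b) {
        return a.intern() == b.intern();
    }

    public static void main(String[] args) {
        String a = new String("hello"); // a distinct heap instance
        String b = new String("hello"); // another distinct instance
        System.out.println("a == b: " + (a == b));
        System.out.println("same canonical instance: " + sameCanonicalInstance(a, b));
    }
}
```
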

Threads: thread stacks use RAM, and the stack size is determined by -Xss, with a default maximum of 1M per thread. Fortunately things are not that bad: the OS allocates memory pages lazily, so the RAM each Java thread actually uses is quite small (typically 80–200K). The author wrote a script (https://github.com/apangin/jstackmem) to estimate how much of the RSS belongs to Java thread stacks.
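Besides the global -Xss flag, a stack size can be requested for an individual thread through the Thread constructor; the requested size is only a hint that the JVM may round or ignore. A minimal sketch (class and thread names are mine):

```java
public class StackSizeDemo {
    // Starts a thread with an explicitly requested stack size and waits for it.
    static Thread startWithStack(Runnable task, long stackBytes) {
        Thread t = new Thread(null, task, "custom-stack-thread", stackBytes);
        t.start();
        try {
            t.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return t;
    }

    public static void main(String[] args) {
        // Request a 256K stack instead of the default (often 1M, see -Xss).
        startWithStack(() -> System.out.println("running with a small stack"),
                256 * 1024);
    }
}
```
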

2. Off-heap memory (Direct buffers)

Java applications can explicitly allocate off-heap memory with ByteBuffer.allocateDirect. By default the total direct memory is limited by the -Xmx value, but this limit can be overridden with -XX:MaxDirectMemorySize. Before JDK 11, direct ByteBuffers were shown in the "Other" section of NMT (Native Memory Tracking); off-heap memory usage can also be observed with JMC.
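A minimal sketch of a direct allocation (class name mine); the buffer's backing memory lives outside the Java heap and counts against the -XX:MaxDirectMemorySize limit:

```java
import java.nio.ByteBuffer;

public class DirectBufferDemo {
    // Allocates native (off-heap) memory that NMT/JMC would report as direct memory.
    static ByteBuffer allocate(int bytes) {
        return ByteBuffer.allocateDirect(bytes);
    }

    public static void main(String[] args) {
        ByteBuffer buf = allocate(1024 * 1024); // 1 MB outside the Java heap
        buf.putInt(0, 42);
        System.out.println("direct: " + buf.isDirect()
                + ", capacity: " + buf.capacity()
                + ", first int: " + buf.getInt(0));
    }
}
```
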

Besides DirectByteBuffers, MappedByteBuffers also use native memory: a MappedByteBuffer maps the contents of a file into the process's virtual memory. NMT does not track them, and there is no easy way to limit the size of this part, but you can observe the actual usage of the current process with the pmap -x command:

Address           Kbytes    RSS    Dirty Mode  Mapping
...
00007f2b3e557000   39592   32956       0 r--s- some-file-17405-Index.db
00007f2b40c01000   39600   33092       0 r--s- some-file-17404-Index.db
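Mappings like the ones above are created with FileChannel.map. A minimal, self-contained sketch (class and file names are mine); reading the mapped buffer faults pages into memory, so RSS grows even though the Java heap does not:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MappedFileDemo {
    // Maps a file into the process's virtual memory and reads its first byte.
    static byte firstByteOf(Path file) throws IOException {
        try (FileChannel ch = FileChannel.open(file, StandardOpenOption.READ)) {
            MappedByteBuffer map = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            return map.get(0); // touching the buffer faults the page in
        }
    }

    // Writes a small temp file, maps it, and returns its first byte.
    static byte demo() {
        try {
            Path tmp = Files.createTempFile("mapped-demo", ".bin");
            Files.write(tmp, new byte[] {7, 8, 9});
            byte first = firstByteOf(tmp);
            Files.deleteIfExists(tmp);
            return first;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("first byte of mapped file: " + demo());
    }
}
```
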

3. Native libraries

JNI code loaded via System.loadLibrary can allocate as much RAM as it wants, and this memory is not managed by the JVM. The Java class libraries themselves deserve attention here: native resources that are not closed can cause native memory leaks; typical examples are ZipInputStream and DirectoryStream.
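The standard defense is try-with-resources, which guarantees the underlying native handle is released even on an exception path. A minimal sketch using DirectoryStream (the class and method names are mine):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class CloseNativeResources {
    // Counts directory entries; try-with-resources releases the native
    // directory handle even if iteration throws.
    static int countEntries(Path dir) {
        int count = 0;
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
            for (Path ignored : stream) {
                count++;
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return count;
    }

    // Creates a temp directory with one file and counts its entries.
    static int demo() {
        try {
            Path dir = Files.createTempDirectory("close-demo");
            Files.createFile(dir.resolve("a.txt"));
            return countEntries(dir);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("entries: " + demo());
    }
}
```
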

JVMTI agents, especially the jdwp debugging agent, may also cause excessive memory usage (PS: I still remember the agent code I wrote last year that caused a memory leak).

4. Allocator issues

A Java process requests memory from the OS either through a system call (mmap) or through the standard library (malloc). malloc itself requests relatively large chunks from the OS via mmap and manages them with its own algorithms, which may lead to fragmentation and thus excessive virtual memory usage. jemalloc is an alternative allocator that usually has a smaller footprint than the conventional malloc, so you can try using jemalloc for your native/C++ code.

Conclusion

There is no way to calculate exactly how much virtual memory a Java process uses, because there are too many contributing factors, as listed below:

Total memory = Heap + Code Cache + Metaspace + Symbol tables +
               Other JVM structures + Thread stacks +
               Direct buffers + Mapped files +
               Native Libraries + Malloc overhead + ...

This account (javaadu) focuses on back-end technology, JVM troubleshooting and optimization, Java interview questions, personal growth, and self-management, sharing a front-line developer's work and growth experience with readers. I hope you will gain something here.

Origin: www.cnblogs.com/javaadu/p/11567496.html