HashMap summary (reference-article version)

Table of Contents

The underlying principle

HashMap under high concurrency

Summary


The underlying principle:

https://zhuanlan.zhihu.com/p/78079598

1. The underlying principle: how put and get work;

2. The capacity (table length) of a HashMap must always be a power of 2.
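
A short illustration of point 2: when the table length is a power of 2, the bucket index can be computed as (length - 1) & hash, which keeps only the low bits of the hash and is equivalent to a modulo for non-negative values but cheaper. The hash-spreading line mirrors what JDK 8's HashMap.hash() does; the class name below is only for this demo.

public class BucketIndexDemo {
    // Spread the high bits into the low bits, mirroring JDK 8's HashMap.hash().
    static int hash(Object key) {
        int h;
        return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
    }

    public static void main(String[] args) {
        int capacity = 16;                  // power of 2, the default table length
        int h = hash("example");
        // (capacity - 1) is all 1-bits, so the AND keeps only the low bits of the hash.
        int index = (capacity - 1) & h;
        System.out.println("bucket index = " + index);
    }
}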


HashMap under high concurrency:

https://zhuanlan.zhihu.com/p/81587796 (detailed analysis)

1. A HashMap needs to resize when too many elements have been inserted. The resize is triggered when

HashMap.Size >= Capacity * LoadFactor

(a minimal sketch of this threshold check appears after this list).

2. HashMap's resize consists of two steps: expanding the table and rehashing the existing entries. Under concurrency, the rehash can turn a bucket's linked list into a ring. The root cause is the head-insertion method used in JDK 1.7: the author assumed that elements inserted later are more likely to be accessed first, so inserting at the head makes them faster to get and improves efficiency, but it is not thread-safe.
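
A minimal sketch of the resize trigger described in point 1, assuming the default capacity of 16 and load factor of 0.75. The class and field names are illustrative, not JDK source; note that JDK 1.8's putVal uses a strict "size > threshold" check, while JDK 1.7's addEntry uses "size >= threshold" as written above.

// Illustrative sketch of the resize trigger: threshold = capacity * loadFactor.
class ResizeConditionSketch {
    static final float LOAD_FACTOR = 0.75f;            // HashMap's default load factor
    int capacity = 16;                                  // default initial capacity
    int threshold = (int) (capacity * LOAD_FACTOR);     // resize once size reaches this
    int size = 0;

    void afterInsert() {
        size++;
        if (size >= threshold) {       // the condition described in point 1
            capacity <<= 1;            // the table doubles, so it stays a power of 2
            threshold = (int) (capacity * LOAD_FACTOR);
            // ...rehash all existing entries into the larger table...
        }
    }
}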


Summary

The reason HashMap expansion under concurrency can cause an endless loop is as follows: when multiple threads resize at the same time, one thread finishes its expansion first and rehashes the original linked list into its own table, which leaves the list in reverse order. When the other thread then runs its own rehash, it reverses that reversed list back into forward order. The two passes link the same nodes to each other, so a circular linked list is formed, and a subsequent get on that bucket loops forever.
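
The sketch below is a simplified, single-threaded rendering of the JDK 1.7-style transfer loop (head insertion into the new table) that the summary describes. It is not the exact JDK source, and the Node type is a stand-in for HashMap's internal Entry; the comments mark where an ill-timed thread switch lets two resizing threads link the same nodes to each other and close a cycle.

// Hypothetical entry type standing in for HashMap's internal node.
class Node {
    final int hash;
    final Object key, value;
    Node next;
    Node(int hash, Object key, Object value, Node next) {
        this.hash = hash; this.key = key; this.value = value; this.next = next;
    }
}

class TransferSketch {
    // Move every node from oldTable into newTable, inserting each node at the
    // head of its new bucket; this is what reverses the order of a bucket's list.
    static void transfer(Node[] oldTable, Node[] newTable) {
        for (Node e : oldTable) {
            while (e != null) {
                Node next = e.next;                      // (1) remember the rest of the chain
                int i = e.hash & (newTable.length - 1);  // new bucket index (power-of-2 table)
                e.next = newTable[i];                    // (2) head insertion: point at current head
                newTable[i] = e;                         // (3) become the new head
                e = next;
                // If another thread runs the whole loop between (1) and (2),
                // e and next can already point at each other in the new table;
                // repeating (2) and (3) here then closes a circular linked list.
            }
        }
    }
}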

Although the Java development team fixed this particular problem in JDK 1.8, HashMap still has other thread-safety issues. So under concurrency we should use Hashtable or ConcurrentHashMap instead of HashMap.
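
As a quick usage note on the recommended replacement (standard java.util.concurrent API, nothing specific to the article above), a ConcurrentHashMap can be updated from several threads without external locking:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentMapDemo {
    public static void main(String[] args) throws InterruptedException {
        Map<String, Integer> counts = new ConcurrentHashMap<>();
        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) {
                // merge is atomic on ConcurrentHashMap, so concurrent increments are safe
                counts.merge("hits", 1, Integer::sum);
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start(); t2.start();
        t1.join();  t2.join();
        System.out.println(counts.get("hits")); // prints 20000
    }
}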


Origin: blog.csdn.net/Longtermevolution/article/details/108382538