Java Foundations (5)

Why must HashMap's capacity always be a power of 2?

Why is the default capacity 16, and why does every expansion keep the capacity a power of 2?
(1) Benefit 1: the bucket index of an element is computed as (n - 1) & hash. If n is not a power of 2, say n = 15, then n - 1 = 14, which is 1110 in binary. For a random hash h, the result of h & 1110 always has its last bit equal to 0, so buckets whose index ends in 1 (0001, 1001, 1101, and so on) can never be used. That wastes space and makes the distribution non-uniform. When n is a power of 2, every low-order bit of n - 1 is 1, so the AND keeps all the low bits of the hash and elements are, on average, distributed evenly across all buckets.
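This point can be shown with a quick standalone demo (the class and method names below are my own, not HashMap source):

```java
import java.util.HashSet;
import java.util.Set;

public class IndexDemo {
    // Count how many distinct buckets (n - 1) & hash can reach
    // for hashes 0..1023.
    static Set<Integer> reachable(int n) {
        Set<Integer> buckets = new HashSet<>();
        for (int hash = 0; hash < 1024; hash++) {
            buckets.add(hash & (n - 1)); // HashMap-style index computation
        }
        return buckets;
    }

    public static void main(String[] args) {
        // n = 16: n - 1 = 0b1111, so every bucket 0..15 is reachable
        System.out.println(reachable(16).size()); // 16
        // n = 15: n - 1 = 0b1110, the last index bit is always 0,
        // so buckets 1, 3, 5, ..., 15 are never used
        System.out.println(reachable(15).size()); // 8
    }
}
```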
(2) Benefit 2: after an expansion, the new position of each element can be determined without recomputing its hash. The two examples below illustrate this (note: for the arithmetic to work out as shown, the hash value in this first example must be 21, not 5 as the original text garbled it).

Example 1: hash value 21, capacity expanded from 16 to 32.
0000 0000 0000 0000 0000 0000 0000 1111 // (n - 1) = 15 in binary
0000 0000 0000 0000 0000 0000 0001 0101 // hash value 21 in binary
0000 0000 0000 0000 0000 0000 0000 0101 // AND result: index 5
// after the expansion (capacity 32):
0000 0000 0000 0000 0000 0000 0001 1111 // (n - 1) = 31 in binary
0000 0000 0000 0000 0000 0000 0001 0101 // hash value 21 in binary
0000 0000 0000 0000 0000 0000 0001 0101 // AND result: index 21
New position = old position (5) + oldCap (the old capacity, 16) = 21
Example 2: hash value 14, capacity expanded from 16 to 32.
0000 0000 0000 0000 0000 0000 0000 1111 // (n - 1) = 15 in binary
0000 0000 0000 0000 0000 0000 0000 1110 // hash value 14 in binary
0000 0000 0000 0000 0000 0000 0000 1110 // AND result: index 14
// after the expansion (capacity 32):
0000 0000 0000 0000 0000 0000 0001 1111 // (n - 1) = 31 in binary
0000 0000 0000 0000 0000 0000 0000 1110 // hash value 14 in binary
0000 0000 0000 0000 0000 0000 0000 1110 // AND result: index 14
New position = old position (14), unchanged.
In general, after the capacity doubles, an element stays at its old index when the newly significant hash bit is 0, and moves to old index + oldCap when it is 1.
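The redistribution rule can be sketched as follows (my own demo class, not JDK code; JDK 8's resize() applies the equivalent check hash & oldCap == 0):

```java
public class ResizeDemo {
    // HashMap-style index: hash AND (capacity - 1)
    static int index(int hash, int cap) {
        return hash & (cap - 1);
    }

    public static void main(String[] args) {
        int oldCap = 16, newCap = 32;

        int h1 = 21; // binary 1 0101: the newly significant bit is 1
        System.out.println(index(h1, oldCap)); // 5
        System.out.println(index(h1, newCap)); // 21 = 5 + oldCap

        int h2 = 14; // binary 0 1110: the newly significant bit is 0
        System.out.println(index(h2, oldCap)); // 14
        System.out.println(index(h2, newCap)); // 14, unchanged
    }
}
```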

What traversal methods do collections support, and how do they differ?

The classic for loop, the enhanced for (foreach) loop, and the Iterator.
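A minimal side-by-side sketch of the three styles (example data is hypothetical; only the Iterator allows safe removal mid-traversal):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class TraverseDemo {
    public static void main(String[] args) {
        List<String> list = new ArrayList<>(List.of("a", "b", "c"));

        // 1. classic for loop: index-based, only works for Lists
        for (int i = 0; i < list.size(); i++) {
            System.out.println(list.get(i));
        }

        // 2. enhanced for (foreach): compiles down to an Iterator,
        //    works for any Iterable
        for (String s : list) {
            System.out.println(s);
        }

        // 3. explicit Iterator: the only safe way to remove elements
        //    while traversing
        Iterator<String> it = list.iterator();
        while (it.hasNext()) {
            if ("b".equals(it.next())) {
                it.remove();
            }
        }
        System.out.println(list); // [a, c]
    }
}
```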

What problems does HashMap have in a multi-threaded environment, and how can they be solved?

Lost updates and infinite loops (in JDK 7, during resize) can occur; the solution is to use a thread-safe collection instead.
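A sketch of the fix (my own demo class): two threads write disjoint keys, and ConcurrentHashMap keeps them all, where a plain HashMap could silently drop entries.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SafeMapDemo {
    // Two writers fill disjoint key ranges; ConcurrentHashMap keeps
    // every entry, while a plain HashMap here could lose some.
    static Map<Integer, Integer> fill() {
        Map<Integer, Integer> map = new ConcurrentHashMap<>();
        Thread t1 = new Thread(() -> { for (int i = 0; i < 1000; i++) map.put(i, i); });
        Thread t2 = new Thread(() -> { for (int i = 1000; i < 2000; i++) map.put(i, i); });
        t1.start(); t2.start();
        try {
            t1.join(); t2.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return map;
    }

    public static void main(String[] args) {
        System.out.println(fill().size()); // 2000: no lost updates
    }
}
```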

Optimizing the use of collections

Create collections with an explicitly specified initial capacity.
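For example (sizes are illustrative): HashMap resizes once size exceeds capacity × load factor (0.75 by default), so pre-sizing avoids repeated rehashing.

```java
import java.util.HashMap;
import java.util.Map;

public class CapacityDemo {
    public static void main(String[] args) {
        int expected = 1000;
        // capacity = expected / 0.75 + 1 guarantees no resize while
        // inserting `expected` entries
        Map<String, Integer> map = new HashMap<>((int) (expected / 0.75f) + 1);
        for (int i = 0; i < expected; i++) {
            map.put("key" + i, i);
        }
        System.out.println(map.size()); // 1000
        // the same idea applies to lists: new ArrayList<>(expected)
    }
}
```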

Which collections are thread-safe, and why are they thread-safe?

Vector and Hashtable: their methods are synchronized (each call takes a lock).
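A small demo of that locking (class names are mine; Collections.synchronizedList gives any List the same style of protection):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Vector;

public class SyncDemo {
    // Adds 2 x 1000 elements from two threads; a synchronized list
    // ends with exactly 2000 elements, with no lost adds.
    static int concurrentAdds(List<Integer> list) {
        Runnable task = () -> { for (int i = 0; i < 1000; i++) list.add(i); };
        Thread t1 = new Thread(task), t2 = new Thread(task);
        t1.start(); t2.start();
        try {
            t1.join(); t2.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return list.size();
    }

    public static void main(String[] args) {
        // Vector: every public method is declared synchronized
        System.out.println(concurrentAdds(new Vector<>())); // 2000
        // Collections.synchronizedList wraps any List with equivalent locking
        System.out.println(concurrentAdds(Collections.synchronizedList(new ArrayList<>()))); // 2000
    }
}
```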

The underlying implementation principles of TreeMap and Hashtable

The underlying implementation principles of HashSet, TreeSet, and LinkedHashSet

The underlying implementation principles of LinkedList, Stack, and Vector

Why can't collections store primitive data types?
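The reason is that generics only work with reference types; primitives are wrapped into their wrapper classes automatically (autoboxing). A quick illustration:

```java
import java.util.ArrayList;
import java.util.List;

public class BoxingDemo {
    public static void main(String[] args) {
        // List<int> does not compile: generics erase to Object, and
        // primitives are not Objects. The compiler autoboxes instead:
        List<Integer> list = new ArrayList<>();
        list.add(42);              // autoboxed to Integer.valueOf(42)
        int value = list.get(0);   // auto-unboxed back to int
        System.out.println(value); // 42
    }
}
```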

The features and underlying implementation principles of ConcurrentHashMap, CopyOnWriteArrayList, ArrayBlockingQueue, and other classes in the java.util.concurrent package
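Two of those classes in action (a brief sketch, not an exhaustive tour):

```java
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;

public class ConcurrentDemo {
    public static void main(String[] args) throws Exception {
        // CopyOnWriteArrayList: every write copies the backing array,
        // so iterators never throw ConcurrentModificationException
        List<String> cow = new CopyOnWriteArrayList<>(List.of("a", "b"));
        for (String s : cow) {
            cow.add(s + "!"); // safe: the iterator reads the old array
        }
        System.out.println(cow); // [a, b, a!, b!]

        // ArrayBlockingQueue: bounded, array-backed; put() blocks when
        // full and take() blocks when empty
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(2);
        queue.put(1);
        queue.put(2);
        System.out.println(queue.take()); // 1 (FIFO order)
    }
}
```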

How are objects compared?

Comparable or Comparator
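A minimal sketch of both (the Person class is hypothetical): Comparable bakes a natural order into the class itself, while a Comparator supplies an external order at the call site.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class CompareDemo {
    // Comparable: the class defines its own "natural" order (by age)
    static class Person implements Comparable<Person> {
        final String name;
        final int age;
        Person(String name, int age) { this.name = name; this.age = age; }
        @Override public int compareTo(Person other) {
            return Integer.compare(this.age, other.age);
        }
        @Override public String toString() { return name; }
    }

    public static void main(String[] args) {
        List<Person> people = new ArrayList<>(List.of(
                new Person("Bob", 30), new Person("Amy", 25)));

        people.sort(null);          // null = use the natural order (compareTo)
        System.out.println(people); // [Amy, Bob]

        // Comparator: an ad-hoc external order, no class changes needed
        people.sort(Comparator.comparing((Person p) -> p.name).reversed());
        System.out.println(people); // [Bob, Amy]
    }
}
```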

How does a ConcurrentModificationException arise, and how do you deal with it?
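The exception comes from fail-fast iterators: structurally modifying a collection while iterating it (outside the Iterator) trips the modCount check. A sketch (the class name is mine):

```java
import java.util.ArrayList;
import java.util.ConcurrentModificationException;
import java.util.Iterator;
import java.util.List;

public class CmeDemo {
    public static void main(String[] args) {
        // Wrong: removing via the List while a foreach loop (a hidden
        // Iterator) is running; the next Iterator call fails fast.
        List<String> broken = new ArrayList<>(List.of("a", "b", "c"));
        try {
            for (String s : broken) {
                if ("a".equals(s)) broken.remove(s);
            }
        } catch (ConcurrentModificationException e) {
            System.out.println("caught ConcurrentModificationException");
        }

        // Fix 1: remove through the Iterator itself
        List<String> fixed = new ArrayList<>(List.of("a", "b", "c"));
        Iterator<String> it = fixed.iterator();
        while (it.hasNext()) {
            if ("a".equals(it.next())) it.remove();
        }
        System.out.println(fixed); // [b, c]

        // Fix 2 (Java 8+): removeIf does the same thing internally
        List<String> fixed2 = new ArrayList<>(List.of("a", "b", "c"));
        fixed2.removeIf("a"::equals);
        System.out.println(fixed2); // [b, c]
    }
}
```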

What methods does the Collections utility class provide, what are they used for, and how are they implemented underneath?
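A few representative Collections methods (a quick sketch; since JDK 8, Collections.sort delegates to List.sort):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class CollectionsDemo {
    public static void main(String[] args) {
        List<Integer> list = new ArrayList<>(List.of(3, 1, 2));

        Collections.sort(list);                    // delegates to List.sort
        System.out.println(list);                  // [1, 2, 3]

        Collections.reverse(list);
        System.out.println(list);                  // [3, 2, 1]

        System.out.println(Collections.max(list)); // 3

        // Thread-safe wrapper: every call synchronizes on the wrapper object
        List<Integer> safe = Collections.synchronizedList(list);
        System.out.println(safe.size());           // 3
    }
}
```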

Origin blog.csdn.net/weixin_45678915/article/details/104624856