Overview
Before JDK 1.8, HashMap was implemented as an array of singly linked lists. Starting with JDK 1.8, the underlying structure was optimized to an array plus linked lists plus red-black trees, mainly to improve lookup efficiency.
As the following table shows:
JDK version | Implementation | Nodes >= 8 | Nodes <= 6 |
---|---|---|---|
Before 1.8 | Array + singly linked list | Array + singly linked list | Array + singly linked list |
1.8 and later | Array + singly linked list + red-black tree | Array + red-black tree | Array + singly linked list |
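To see the thresholds from the table in action, here is a small sketch. The `CollidingKey` class is hypothetical, written only to force every entry into a single bucket; in JDK 8+, once that bucket holds more than 8 nodes (and the table capacity is at least 64), it is converted to a red-black tree, yet the map behaves identically from the outside:

```java
import java.util.HashMap;

// Hypothetical key whose hashCode always collides, so all entries land in one bucket
class CollidingKey implements Comparable<CollidingKey> {
    final int id;
    CollidingKey(int id) { this.id = id; }
    @Override public int hashCode() { return 42; } // every key collides
    @Override public boolean equals(Object o) {
        return o instanceof CollidingKey && ((CollidingKey) o).id == id;
    }
    // Comparable lets the tree order keys without hash tie-breaking
    @Override public int compareTo(CollidingKey o) { return Integer.compare(id, o.id); }
}

public class TreeifyDemo {
    public static void main(String[] args) {
        // initial capacity 64 >= MIN_TREEIFY_CAPACITY, so treeification can occur
        HashMap<CollidingKey, Integer> map = new HashMap<>(64);
        for (int i = 0; i < 20; i++) map.put(new CollidingKey(i), i);
        // Lookups still work; internally the bucket is now a tree,
        // so each lookup costs O(log n) instead of O(n)
        System.out.println(map.get(new CollidingKey(7))); // prints 7
        System.out.println(map.size());                   // prints 20
    }
}
```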
HashMap
1. Inheritance
```java
public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable
```
2. Constants & constructors
```java
// If a bucket holds at least this many nodes, the linked list is converted to a red-black tree
static final int TREEIFY_THRESHOLD = 8;
// If a tree bucket shrinks to at most this many nodes, it is converted back to a linked list
static final int UNTREEIFY_THRESHOLD = 6;
// Minimum table capacity before buckets may be treeified (below this, the table is resized instead)
static final int MIN_TREEIFY_CAPACITY = 64;
// Default initial capacity of the HashMap
static final int DEFAULT_INITIAL_CAPACITY = 1 << 4; // aka 16
// Upper limit on the HashMap's capacity
static final int MAXIMUM_CAPACITY = 1 << 30;
// Default load factor
static final float DEFAULT_LOAD_FACTOR = 0.75f;

// Node is an implementation class of the Map.Entry interface.
// The table's length is always a power of two; each slot holds
// the head of a singly linked list (or a tree once treeified)
transient Node<K,V>[] table;
// The number of key-value pairs stored in the HashMap
transient int size;
// The number of times the HashMap has been structurally modified
transient int modCount;
// The size at which the next resize occurs (capacity * load factor)
int threshold;
// The load factor
final float loadFactor;

// Default constructor
public HashMap() {
    this.loadFactor = DEFAULT_LOAD_FACTOR; // all other fields defaulted
}

// Constructor specifying the initial capacity
public HashMap(int initialCapacity) {
    this(initialCapacity, DEFAULT_LOAD_FACTOR);
}

// Constructor specifying both the initial capacity and the load factor
public HashMap(int initialCapacity, float loadFactor) {
    // the specified capacity must not be negative, otherwise an IllegalArgumentException is thrown
    if (initialCapacity < 0)
        throw new IllegalArgumentException("Illegal initial capacity: " + initialCapacity);
    // cap the specified capacity at the HashMap's maximum capacity
    if (initialCapacity > MAXIMUM_CAPACITY)
        initialCapacity = MAXIMUM_CAPACITY;
    // if the load factor is not positive, or is NaN, an IllegalArgumentException is thrown
    if (loadFactor <= 0 || Float.isNaN(loadFactor))
        throw new IllegalArgumentException("Illegal load factor: " + loadFactor);
    this.loloadFactor = loadFactor;
    // set the HashMap's threshold; when the number of stored entries
    // reaches the threshold, the HashMap's capacity is doubled
    this.threshold = tableSizeFor(initialCapacity);
}

// Constructor taking an existing Map: all of its Map.Entry elements
// are added to this HashMap instance
public HashMap(Map<? extends K, ? extends V> m) {
    this.loadFactor = DEFAULT_LOAD_FACTOR;
    // this constructor is mainly implemented by putMapEntries (shared with putAll)
    putMapEntries(m, false);
}
```
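The constructor stores `tableSizeFor(initialCapacity)` into `threshold`. That method rounds the requested capacity up to the nearest power of two; here it is reproduced as a standalone sketch (the class wrapper and `main` are added only for demonstration):

```java
public class TableSizeForDemo {
    static final int MAXIMUM_CAPACITY = 1 << 30;

    // JDK 8's tableSizeFor: returns the smallest power of two >= cap.
    // The shifts smear the highest set bit of (cap - 1) into every lower
    // bit position; adding 1 then yields the next power of two.
    static int tableSizeFor(int cap) {
        int n = cap - 1;
        n |= n >>> 1;
        n |= n >>> 2;
        n |= n >>> 4;
        n |= n >>> 8;
        n |= n >>> 16;
        return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
    }

    public static void main(String[] args) {
        System.out.println(tableSizeFor(10)); // prints 16
        System.out.println(tableSizeFor(16)); // prints 16
        System.out.println(tableSizeFor(17)); // prints 32
    }
}
```

Subtracting 1 before the shifts is what makes exact powers of two (like 16) map to themselves rather than to the next power up.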
3. Node: the singly linked list implementation
```java
// Node implements the Map.Entry interface
static class Node<K,V> implements Map.Entry<K,V> {
    final int hash;
    final K key;
    V value;
    Node<K,V> next;

    // constructor
    Node(int hash, K key, V value, Node<K,V> next) {
        this.hash = hash;
        this.key = key;
        this.value = value;
        this.next = next;
    }

    public final K getKey()        { return key; }
    public final V getValue()      { return value; }
    public final String toString() { return key + "=" + value; }

    public final int hashCode() {
        return Objects.hashCode(key) ^ Objects.hashCode(value);
    }

    public final V setValue(V newValue) {
        V oldValue = value;
        value = newValue;
        return oldValue;
    }

    // equals: entries are equal when both key and value are equal
    public final boolean equals(Object o) {
        if (o == this)
            return true;
        if (o instanceof Map.Entry) {
            Map.Entry<?,?> e = (Map.Entry<?,?>)o;
            if (Objects.equals(key, e.getKey()) &&
                Objects.equals(value, e.getValue()))
                return true;
        }
        return false;
    }
}
```
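Node's `hashCode` XORs the key's and value's hash codes, and `equals` compares both fields; this matches the general `Map.Entry` contract. A quick check of that contract using `AbstractMap.SimpleEntry` from the standard library, which implements it the same way:

```java
import java.util.AbstractMap;
import java.util.Map;

public class EntryContractDemo {
    public static void main(String[] args) {
        Map.Entry<String, Integer> a = new AbstractMap.SimpleEntry<>("k", 1);
        Map.Entry<String, Integer> b = new AbstractMap.SimpleEntry<>("k", 1);
        // equal key and equal value => equal entries
        System.out.println(a.equals(b)); // prints true
        // the entry's hash is keyHash ^ valueHash, as in Node.hashCode
        System.out.println(a.hashCode() == ("k".hashCode() ^ Integer.hashCode(1))); // prints true
    }
}
```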
4. TreeNode: the red-black tree implementation
```java
static final class TreeNode<K,V> extends LinkedHashMap.LinkedHashMapEntry<K,V> {
    TreeNode<K,V> parent;  // red-black tree parent link
    TreeNode<K,V> left;    // left subtree
    TreeNode<K,V> right;   // right subtree
    TreeNode<K,V> prev;    // needed to unlink next upon deletion
    boolean red;           // whether this node is red

    TreeNode(int hash, K key, V val, Node<K,V> next) {
        super(hash, key, val, next);
    }

    /**
     * Returns the root of the tree containing this node.
     */
    final TreeNode<K,V> root() {
        for (TreeNode<K,V> r = this, p;;) {
            if ((p = r.parent) == null)
                return r;
            r = p;
        }
    }
    ...
```
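`root()` simply follows `parent` pointers upward until it reaches a node with no parent. A minimal standalone sketch of the same walk (class `T` is hypothetical, not part of HashMap):

```java
// Minimal parent-pointer node mimicking the walk TreeNode.root() performs
class T {
    T parent;
    final int val;
    T(int val, T parent) { this.val = val; this.parent = parent; }

    // climb parent links until a node with no parent is found
    T root() {
        for (T r = this, p;;) {
            if ((p = r.parent) == null)
                return r;
            r = p;
        }
    }
}

public class RootDemo {
    public static void main(String[] args) {
        T root = new T(1, null);
        T leaf = new T(3, new T(2, root));
        System.out.println(leaf.root().val); // prints 1
    }
}
```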
5. Hash computation
```java
// XORs the key's own hashCode with that value shifted right (unsigned)
// by 16 bits, producing the final hash value
static final int hash(Object key) {
    int h;
    return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}
```
Further explanation
5.1 The following example shows how a key's hash value is computed by the hash function.
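A sketch of that computation, using a standalone replica of the hash method (the key "HashMap" and the binary printing are just illustrative choices, not part of the JDK source):

```java
public class HashDemo {
    // replica of HashMap.hash (JDK 8) for illustration
    static int hash(Object key) {
        int h;
        return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
    }

    // left-pad a binary string to 32 bits so the columns line up
    static String bits(int v) {
        return String.format("%32s", Integer.toBinaryString(v)).replace(' ', '0');
    }

    public static void main(String[] args) {
        String key = "HashMap";
        int h = key.hashCode();
        System.out.println("hashCode : " + bits(h));
        System.out.println("h >>> 16 : " + bits(h >>> 16));
        System.out.println("hash     : " + bits(hash(key)));
    }
}
```

The last line is the first two XORed together: the high 16 bits pass through unchanged, while the low 16 bits are mixed with the high 16 bits.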
5.2 Why do this? Why not use key.hashCode() directly? Why XOR it with the value shifted right by 16 bits?
The answer: it is tied to how HashMap computes the index into the table array, explained next.
Below are the code snippets that appear in the put and get methods:
```java
// in the put method
tab[i = (n - 1) & hash]
// in the get method
tab[(n - 1) & hash]
```
We know this code looks up the node in tab by index; so how does this operation turn the hash into a bucket position? Suppose tab.length = 1 << 4 (i.e. 16).
```java
// in the put method
tab[i = (n - 1) & hash]
// in the get method
tab[(n - 1) & hash]
```
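Under that assumption (n = 16), (n - 1) & hash keeps only the low 4 bits of the hash, which for a power-of-two table length is exactly the remainder hash % n. A small sketch with an arbitrary example hash value:

```java
public class IndexDemo {
    public static void main(String[] args) {
        int n = 1 << 4;             // tab.length = 16
        int hash = 0b10101101;      // example hash value: 173
        int index = (n - 1) & hash; // (16 - 1) = 0b1111 masks the low 4 bits
        System.out.println(index);    // prints 13
        System.out.println(hash % n); // prints 13 as well
    }
}
```

This is why the capacity is always a power of two: the mask (n - 1) then has all low bits set, and the XOR folding in hash() lets the high bits of hashCode influence the bucket index instead of being masked away.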