Introduction to HashMap
HashMap is one of the most commonly used collections in daily Java development. It is not thread-safe, and the problems mainly show up in high-concurrency scenarios.
JDK 1.8 implements it with an array + linked list + red-black tree.
Before 1.8 it used an array + linked list, which under concurrent resizing could produce an infinite loop (a cycle in a bucket's list) and lost updates.
Red-black tree visualization: https://www.cs.usfca.edu/~galles/visualization/RedBlack.html (can be tried online!)
The array + linked list + red-black tree structure is as follows:
Comparison of HashMap between JDK 1.7 and 1.8
| Version | Thread-safe | Ordered | Implementation | Initial capacity | Load factor | Treeify threshold (list → red-black tree) | Untreeify threshold (tree → list) | Linked-list insertion | Hash algorithm |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1.8 | no | no | Array + linked list (doubly linked in tree bins) + red-black tree | 16 | 0.75 | 8 | 6 | tail insertion | simplified (one XOR of the high bits) |
| 1.7 | no | no | Array + linked list (singly linked) | 16 | 0.75 | n/a (no tree) | n/a | head insertion | more complex (multiple shifts and XORs to perturb the hash) |
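The "simplified" 1.8 hash can be seen in isolation: it XORs the high 16 bits of `hashCode()` into the low 16 bits, so hash codes that differ only in their high bits still land in different buckets. A minimal sketch (the method name `spread` is mine; HashMap's real method is the static `hash(Object)` shown later in the source):

```java
public class HashSpreadDemo {
    // JDK 8-style hash spreading: XOR the high 16 bits into the low 16 bits.
    static int spread(int h) {
        return h ^ (h >>> 16);
    }

    public static void main(String[] args) {
        int n = 16; // default table size; the bucket index is (n - 1) & hash
        int a = 0x10000, b = 0x20000; // differ only in their high bits
        // Without spreading, both collide in bucket 0:
        System.out.println((a & (n - 1)) + " and " + (b & (n - 1))); // 0 and 0
        // With spreading, they separate:
        System.out.println((spread(a) & (n - 1)) + " and " + (spread(b) & (n - 1))); // 1 and 2
    }
}
```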
Basic use of HashMap
public static void main(String[] args) throws InterruptedException {
    CountDownLatch countDownLatch = new CountDownLatch(2);
    HashMap<Integer, Object> hashMap = new HashMap<>();
    hashMap.put(1, "hong");
    // ConcurrentHashMap<String, Object> concurrentHashMap = new ConcurrentHashMap<>();
    // concurrentHashMap.put("hong", 1);
    new Thread("thread1") {
        @Override
        public void run() {
            hashMap.put(7, "B");
            System.out.println(hashMap);
            countDownLatch.countDown();
        }
    }.start();
    new Thread("thread2") {
        @Override
        public void run() {
            hashMap.put(3, "A");
            System.out.println(hashMap);
            countDownLatch.countDown();
        }
    }.start();
    countDownLatch.await();
    System.out.println(hashMap.toString() + ":" + hashMap.size());
}
Result:
{1=hong, 7=B}
{1=hong, 3=A, 7=B}
{1=hong, 3=A, 7=B}:3
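HashMap's lack of thread safety is backed by a fail-fast mechanism (the `modCount` field covered below), and that behavior can be observed even single-threaded: any structural modification while an iterator is active makes the next `next()` call throw. A minimal sketch:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.ConcurrentModificationException;

public class FailFastDemo {
    public static void main(String[] args) {
        Map<Integer, String> map = new HashMap<>();
        map.put(1, "A");
        map.put(2, "B");
        try {
            for (Integer k : map.keySet()) {
                // Structural modification bumps modCount; the iterator's
                // modCount check then fails fast on the next iteration.
                map.put(3, "C");
            }
        } catch (ConcurrentModificationException e) {
            System.out.println("fail-fast: " + e.getClass().getSimpleName());
        }
    }
}
```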
Source code learning
Location: java.util.HashMap
HashMap extends AbstractMap and implements the Map, Cloneable, and Serializable interfaces.
Description of relevant fields:
| Field | Meaning | Purpose |
| --- | --- | --- |
| loadFactor | load factor | Controls when the table is resized |
| threshold | resize threshold | The next size at which to resize: capacity * load factor (default 16 * 0.75 = 12) |
| initialCapacity | initial capacity | Default table size, 16 |
| modCount | modification count | Used by iterators for fail-fast checks |
Note: a linked list is converted to a red-black tree when it reaches 8 nodes (and only if the table has at least MIN_TREEIFY_CAPACITY = 64 slots; otherwise the table is resized instead), and a red-black tree shrinks back to a linked list at 6 nodes.
static final int tableSizeFor(int cap) {
int n = cap - 1;
n |= n >>> 1;
n |= n >>> 2;
n |= n >>> 4;
n |= n >>> 8;
n |= n >>> 16;
return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
}
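tableSizeFor rounds the requested capacity up to the next power of two: the OR-shift cascade smears the highest set bit of `cap - 1` into every lower bit, and `n + 1` then carries into a single power-of-two bit. A small sketch (constants copied from the class) showing a few inputs:

```java
public class TableSizeForDemo {
    static final int MAXIMUM_CAPACITY = 1 << 30;

    // Copy of HashMap.tableSizeFor: the smallest power of two >= cap.
    static int tableSizeFor(int cap) {
        int n = cap - 1;
        n |= n >>> 1;
        n |= n >>> 2;
        n |= n >>> 4;
        n |= n >>> 8;
        n |= n >>> 16;
        return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
    }

    public static void main(String[] args) {
        System.out.println(tableSizeFor(1));   // 1
        System.out.println(tableSizeFor(16));  // 16
        System.out.println(tableSizeFor(17));  // 32
        System.out.println(tableSizeFor(100)); // 128
    }
}
```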
add method
public V put(K key, V value) {
//Compute the hash of the key (hashCode() spread with its high bits)
return putVal(hash(key), key, value, false, true);
}
Concrete implementation: putVal
final V putVal(int hash, K key, V value, boolean onlyIfAbsent,
               boolean evict) {
    Node<K,V>[] tab; Node<K,V> p; int n, i;
    // If the table is empty, create it
    if ((tab = table) == null || (n = tab.length) == 0)
        n = (tab = resize()).length;
    // Compute the index; if the bucket is empty, just place a new node
    if ((p = tab[i = (n - 1) & hash]) == null)
        tab[i] = newNode(hash, key, value, null);
    else {
        Node<K,V> e; K k;
        // The key already sits at the bucket head: remember it so the value is overwritten below
        if (p.hash == hash &&
            ((k = p.key) == key || (key != null && key.equals(k))))
            e = p;
        // The bin is a red-black tree
        else if (p instanceof TreeNode)
            e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
        // The bin is a linked list
        else {
            for (int binCount = 0; ; ++binCount) {
                if ((e = p.next) == null) {
                    p.next = newNode(hash, key, value, null);
                    // List length reached 8: convert the bin to a red-black tree
                    if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
                        treeifyBin(tab, hash);
                    break;
                }
                // The key already exists: stop, the value is overwritten below
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    break;
                p = e;
            }
        }
        if (e != null) { // existing mapping for key
            V oldValue = e.value;
            if (!onlyIfAbsent || oldValue == null)
                e.value = value;
            afterNodeAccess(e);
            return oldValue;
        }
    }
    ++modCount;
    // Resize once the size exceeds the threshold
    if (++size > threshold)
        resize();
    afterNodeInsertion(evict);
    return null;
}
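putVal computes the bucket index as `(n - 1) & hash`; because the table length n is always a power of two, this is a bit-mask equivalent of a non-negative `hash % n`. A small sketch with made-up hash values:

```java
public class BucketIndexDemo {
    public static void main(String[] args) {
        int n = 16; // table length, always a power of two
        int[] hashes = {1, 17, 33, -5};
        for (int h : hashes) {
            // (n - 1) & h keeps only the low bits: equivalent to a
            // non-negative h % n, but a single AND instead of a division.
            System.out.println(h + " -> bucket " + ((n - 1) & h));
        }
    }
}
```

Note that 1, 17, and 33 all land in bucket 1, which is exactly the collision case the linked list (and, past the threshold, the red-black tree) exists to handle.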
Resizing (capacity expansion)
final Node<K,V>[] resize() {
    Node<K,V>[] oldTab = table;
    int oldCap = (oldTab == null) ? 0 : oldTab.length;
    int oldThr = threshold;
    int newCap, newThr = 0;
    if (oldCap > 0) {
        // At the maximum capacity, stop resizing and just accept collisions
        if (oldCap >= MAXIMUM_CAPACITY) {
            threshold = Integer.MAX_VALUE;
            return oldTab;
        }
        // Otherwise double the capacity
        else if ((newCap = oldCap << 1) < MAXIMUM_CAPACITY &&
                 oldCap >= DEFAULT_INITIAL_CAPACITY)
            newThr = oldThr << 1; // double threshold
    }
    // The requested initial capacity was stashed in threshold
    else if (oldThr > 0) // initial capacity was placed in threshold
        newCap = oldThr;
    else { // zero initial threshold signifies using defaults
        // Use the default capacity and threshold
        newCap = DEFAULT_INITIAL_CAPACITY;
        newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
    }
    if (newThr == 0) {
        float ft = (float)newCap * loadFactor;
        newThr = (newCap < MAXIMUM_CAPACITY && ft < (float)MAXIMUM_CAPACITY ?
                  (int)ft : Integer.MAX_VALUE);
    }
    threshold = newThr;
    @SuppressWarnings({"rawtypes","unchecked"})
    Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap];
    table = newTab;
    if (oldTab != null) {
        // Move every bucket into the new table
        for (int j = 0; j < oldCap; ++j) {
            Node<K,V> e;
            if ((e = oldTab[j]) != null) {
                oldTab[j] = null;
                if (e.next == null)
                    newTab[e.hash & (newCap - 1)] = e;
                else if (e instanceof TreeNode)
                    ((TreeNode<K,V>)e).split(this, newTab, j, oldCap);
                else { // preserve order: split the list without rehashing
                    Node<K,V> loHead = null, loTail = null;
                    Node<K,V> hiHead = null, hiTail = null;
                    Node<K,V> next;
                    do {
                        next = e.next;
                        // Bit (hash & oldCap) is 0: the node keeps its old index
                        if ((e.hash & oldCap) == 0) {
                            if (loTail == null)
                                loHead = e;
                            else
                                loTail.next = e;
                            loTail = e;
                        }
                        // Otherwise it moves to old index + oldCap
                        else {
                            if (hiTail == null)
                                hiHead = e;
                            else
                                hiTail.next = e;
                            hiTail = e;
                        }
                    } while ((e = next) != null);
                    // The "lo" list stays at the old index
                    if (loTail != null) {
                        loTail.next = null;
                        newTab[j] = loHead;
                    }
                    // The "hi" list goes to old index + oldCap
                    if (hiTail != null) {
                        hiTail.next = null;
                        newTab[j + oldCap] = hiHead;
                    }
                }
            }
        }
    }
    return newTab;
}
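The lo/hi split above relies on one observation: after doubling, a node's new index differs from its old one only in the single bit selected by oldCap, so `(e.hash & oldCap)` fully decides its destination without recomputing anything. A small sketch with made-up hash values that all share bucket 5 in a 16-slot table:

```java
public class ResizeSplitDemo {
    public static void main(String[] args) {
        int oldCap = 16, j = 5;         // old table size, old bucket index
        int[] hashes = {5, 21, 37, 53}; // all land in bucket 5 when cap = 16
        for (int h : hashes) {
            // The newly significant bit (hash & oldCap) tells us whether the
            // node keeps index j ("lo" list) or moves to j + oldCap ("hi" list).
            int newIndex = ((h & oldCap) == 0) ? j : j + oldCap;
            System.out.println("hash " + h + " -> new bucket " + newIndex);
        }
    }
}
```

Each computed index matches `h % 32`, confirming that the split reproduces a full rehash at half the cost.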
// Replace a bin's plain nodes with tree nodes
final void treeifyBin(Node<K,V>[] tab, int hash) {
    int n, index; Node<K,V> e;
    // Table too small (< MIN_TREEIFY_CAPACITY): resize instead of treeifying
    if (tab == null || (n = tab.length) < MIN_TREEIFY_CAPACITY)
        resize();
    else if ((e = tab[index = (n - 1) & hash]) != null) {
        TreeNode<K,V> hd = null, tl = null;
        do {
            TreeNode<K,V> p = replacementTreeNode(e, null);
            if (tl == null)
                hd = p;
            else {
                p.prev = tl;
                tl.next = p;
            }
            tl = p;
        } while ((e = e.next) != null);
        if ((tab[index] = hd) != null)
            hd.treeify(tab);
    }
}
// Turn the doubly-linked list of tree nodes into a red-black tree
final void treeify(Node<K,V>[] tab) {
    // Root node
    TreeNode<K,V> root = null;
    for (TreeNode<K,V> x = this, next; x != null; x = next) {
        next = (TreeNode<K,V>)x.next;
        x.left = x.right = null;
        // The first node becomes the root and is colored black
        if (root == null) {
            x.parent = null;
            x.red = false;
            root = x;
        }
        else {
            K k = x.key;
            int h = x.hash;
            // Class of the key, if it is Comparable
            Class<?> kc = null;
            for (TreeNode<K,V> p = root;;) {
                // dir marks the direction: -1 go left, 1 go right; ordering is by hash
                int dir, ph;
                K pk = p.key;
                // Current node's hash is larger: go left
                if ((ph = p.hash) > h)
                    dir = -1;
                // Smaller: go right
                else if (ph < h)
                    dir = 1;
                // Equal hashes: fall back to compareTo, then a tie-breaker
                else if ((kc == null &&
                          (kc = comparableClassFor(k)) == null) ||
                         (dir = compareComparables(kc, k, pk)) == 0)
                    dir = tieBreakOrder(k, pk);
                TreeNode<K,V> xp = p;
                if ((p = (dir <= 0) ? p.left : p.right) == null) {
                    x.parent = xp;
                    if (dir <= 0)
                        xp.left = x;
                    else
                        xp.right = x;
                    root = balanceInsertion(root, x);
                    break;
                }
            }
        }
    }
    moveRootToFront(tab, root);
}
The full java.util.HashMap source, annotated
package java.util;
import java.io.IOException;
import java.io.InvalidObjectException;
import java.io.Serializable;
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.function.BiConsumer;
import java.util.function.BiFunction;
import java.util.function.Consumer;
import java.util.function.Function;
import sun.misc.SharedSecrets;
//HashMap extends AbstractMap and implements Map
public class HashMap<K,V> extends AbstractMap<K,V>
implements Map<K,V>, Cloneable, Serializable {
private static final long serialVersionUID = 362498820763181265L;
//Default initial capacity of the hash table
static final int DEFAULT_INITIAL_CAPACITY = 1 << 4; // aka 16
//Maximum hash table capacity
static final int MAXIMUM_CAPACITY = 1 << 30;
//Default load factor, 0.75
static final float DEFAULT_LOAD_FACTOR = 0.75f;
//Threshold for converting a linked list to a red-black tree
static final int TREEIFY_THRESHOLD = 8;
//Threshold for converting a red-black tree back to a linked list
static final int UNTREEIFY_THRESHOLD = 6;
//Minimum table capacity for treeification; below this, resize instead
static final int MIN_TREEIFY_CAPACITY = 64;
//Basic hash bin node
static class Node<K,V> implements Map.Entry<K,V> {
//Hash of this node's key
final int hash;
//The key
final K key;
//The value
V value;
//The next node in the bucket
Node<K,V> next;
//Constructor
Node(int hash, K key, V value, Node<K,V> next) {
this.hash = hash;
this.key = key;
this.value = value;
this.next = next;
}
//Return the key
public final K getKey() { return key; }
//Return the value
public final V getValue() { return value; }
//String form
public final String toString() { return key + "=" + value; }
//Hash code of the entry
public final int hashCode() {
return Objects.hashCode(key) ^ Objects.hashCode(value);
}
//Set a new value and return the old one
public final V setValue(V newValue) {
V oldValue = value;
value = newValue;
return oldValue;
}
//Two entries are equal when both key and value are equal
public final boolean equals(Object o) {
if (o == this)
return true;
if (o instanceof Map.Entry) {
Map.Entry<?,?> e = (Map.Entry<?,?>)o;
if (Objects.equals(key, e.getKey()) &&
Objects.equals(value, e.getValue()))
return true;
}
return false;
}
}
//Compute the hash used for bucket indexing (called on every access)
static final int hash(Object key) {
int h;
//null keys hash to 0; otherwise XOR hashCode() with its high 16 bits shifted down (collisions remain possible)
return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}
//Returns the key's Class if it directly implements Comparable<itself>, otherwise null
static Class<?> comparableClassFor(Object x) {
if (x instanceof Comparable) {
Class<?> c; Type[] ts, as; Type t; ParameterizedType p;
if ((c = x.getClass()) == String.class) // bypass checks
return c;
if ((ts = c.getGenericInterfaces()) != null) {
for (int i = 0; i < ts.length; ++i) {
if (((t = ts[i]) instanceof ParameterizedType) &&
((p = (ParameterizedType)t).getRawType() ==
Comparable.class) &&
(as = p.getActualTypeArguments()) != null &&
as.length == 1 && as[0] == c) // type arg is c
return c;
}
}
}
return null;
}
//If x matches the Comparable class kc, return k.compareTo(x); otherwise 0
@SuppressWarnings({"rawtypes","unchecked"}) // for cast to Comparable
static int compareComparables(Class<?> kc, Object k, Object x) {
return (x == null || x.getClass() != kc ? 0 :
((Comparable)k).compareTo(x));
}
//Returns a power-of-two table size for the given target capacity:
//the smallest power of two >= cap (e.g. 9..16 -> 16, 17..32 -> 32)
static final int tableSizeFor(int cap) {
int n = cap - 1;
n |= n >>> 1;
n |= n >>> 2;
n |= n >>> 4;
n |= n >>> 8;
n |= n >>> 16;
return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
}
/* ---------------- Fields -------------- */
//The table, holding the buckets
transient Node<K,V>[] table;
//Cached entrySet view
transient Set<Map.Entry<K,V>> entrySet;
//Number of key-value mappings
transient int size;
//Number of structural modifications (used for fail-fast iteration)
transient int modCount;
/**
* The next size value at which to resize (capacity * load factor).
*
* @serial
*/
//Resize threshold = load factor * capacity
int threshold;
//Load factor, 0.75 by default
final float loadFactor;
/* ---------------- Public operations -------------- */
//HashMap constructor
//initialCapacity: initial capacity
//loadFactor: load factor
public HashMap(int initialCapacity, float loadFactor) {
//Negative capacity throws
if (initialCapacity < 0)
throw new IllegalArgumentException("Illegal initial capacity: " +
initialCapacity);
//Cap at the maximum capacity
if (initialCapacity > MAXIMUM_CAPACITY)
initialCapacity = MAXIMUM_CAPACITY;
//Non-positive or NaN load factor throws
if (loadFactor <= 0 || Float.isNaN(loadFactor))
throw new IllegalArgumentException("Illegal load factor: " +
loadFactor);
//Store the load factor
this.loadFactor = loadFactor;
//Stash the rounded initial capacity in threshold (applied on first resize)
this.threshold = tableSizeFor(initialCapacity);
}
//Constructor with initial capacity and the default load factor
public HashMap(int initialCapacity) {
this(initialCapacity, DEFAULT_LOAD_FACTOR);
}
//No-argument default constructor
public HashMap() {
this.loadFactor = DEFAULT_LOAD_FACTOR; // all other fields defaulted
}
//Constructor with initial mappings
public HashMap(Map<? extends K, ? extends V> m) {
this.loadFactor = DEFAULT_LOAD_FACTOR;
putMapEntries(m, false);
}
//Copy all entries of m into this map
final void putMapEntries(Map<? extends K, ? extends V> m, boolean evict) {
//Number of entries in m
int s = m.size();
if (s > 0) {
//If the table is empty, pre-size it
if (table == null) { // pre-size
//Compute the needed capacity; +1 rounds up so the result covers s entries
float ft = ((float)s / loadFactor) + 1.0F;
//Clamp to the maximum capacity
int t = ((ft < (float)MAXIMUM_CAPACITY) ?
(int)ft : MAXIMUM_CAPACITY);
//Only raise the threshold when t exceeds the current one
if (t > threshold)
threshold = tableSizeFor(t);
}
//If s already exceeds the threshold, resize first
else if (s > threshold)
resize();
//Finally put every entry (further resizes may happen along the way)
for (Map.Entry<? extends K, ? extends V> e : m.entrySet()) {
K key = e.getKey();
V value = e.getValue();
putVal(hash(key), key, value, false, evict);
}
}
}
//Number of mappings
public int size() {
return size;
}
//true when the map contains no mappings
public boolean isEmpty() {
return size == 0;
}
//Return the value mapped to the key, or null if absent
public V get(Object key) {
Node<K,V> e;
//hash(key) computes the spread hash; getNode looks the key up;
//return null when no node is found, otherwise e.value
return (e = getNode(hash(key), key)) == null ? null : e.value;
}
//Find the node for (hash, key); null if absent
final Node<K,V> getNode(int hash, Object key) {
//tab: local copy of the table
//first: head node of the bucket, tab[(n - 1) & hash]
//n: table length
//k: candidate key, compared against the requested key
Node<K,V>[] tab; Node<K,V> first, e; int n; K k;
//Table non-null, non-empty, and the bucket head non-null
if ((tab = table) != null && (n = tab.length) > 0 &&
(first = tab[(n - 1) & hash]) != null) {
//If the head node matches both hash and key, we found it
if (first.hash == hash && // always check first node
((k = first.key) == key || (key != null && key.equals(k))))
return first;
//Otherwise keep searching down the bin
if ((e = first.next) != null) {
//If the bin is a tree
if (first instanceof TreeNode)
//Search the red-black tree
return ((TreeNode<K,V>)first).getTreeNode(hash, key);
do {
//A node matching both hash and key: found it
if (e.hash == hash &&
((k = e.key) == key || (key != null && key.equals(k))))
return e;
//Continue while there is a next node
} while ((e = e.next) != null);
}
}
return null;
}
//true when the map contains the key
public boolean containsKey(Object key) {
return getNode(hash(key), key) != null;
}
//Put a key-value pair
public V put(K key, V value) {
//Delegate to putVal with the spread hash
return putVal(hash(key), key, value, false, true);
}
//Put implementation
final V putVal(int hash, K key, V value, boolean onlyIfAbsent,
boolean evict) {
//tab: local copy of the table
//p: head node of the target bucket
//n: table length
//i: bucket index, (n - 1) & hash
Node<K,V>[] tab; Node<K,V> p; int n, i;
//If the table is null or empty
if ((tab = table) == null || (n = tab.length) == 0)
//Initialize it via resize
n = (tab = resize()).length;
//If the bucket is empty
if ((p = tab[i = (n - 1) & hash]) == null)
//Place a brand-new node there
tab[i] = newNode(hash, key, value, null);
else {
Node<K,V> e; K k;
//The bucket head matches the incoming hash and key
if (p.hash == hash &&
((k = p.key) == key || (key != null && key.equals(k))))
//Remember it in e; the value is overwritten below
e = p;
//The head is a tree node
else if (p instanceof TreeNode)
//Insert via the tree-insertion method (see putTreeVal)
e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
else {
//Otherwise the bin is a linked list
for (int binCount = 0; ; ++binCount) {
//Reached the tail
if ((e = p.next) == null) {
//Append a new node at the tail
p.next = newNode(hash, key, value, null);
//When binCount reaches TREEIFY_THRESHOLD - 1 (7), treeify the bin
//(treeifyBin resizes instead while the table is still small)
if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
treeifyBin(tab, hash);
break;
}
//A node with the same hash and key already exists: stop here
if (e.hash == hash &&
((k = e.key) == key || (key != null && key.equals(k))))
break;
//Advance p to the next node
p = e;
}
}
//e non-null means the key already existed
if (e != null) { // existing mapping for key
//Old value
V oldValue = e.value;
//Overwrite unless onlyIfAbsent is set and a value is present
if (!onlyIfAbsent || oldValue == null)
e.value = value;
//Hook, empty here (used by LinkedHashMap)
afterNodeAccess(e);
//Return the old value
return oldValue;
}
}
//One more structural modification
++modCount;
//Resize when the size exceeds the threshold
if (++size > threshold)
resize();
//Hook, empty here
afterNodeInsertion(evict);
//A new mapping returns null
return null;
}
//Resize implementation
final Node<K,V>[] resize() {
//Current table
Node<K,V>[] oldTab = table;
//0 when the table is null, otherwise its length
int oldCap = (oldTab == null) ? 0 : oldTab.length;
//Current threshold
int oldThr = threshold;
//New capacity and threshold
int newCap, newThr = 0;
if (oldCap > 0) {
if (oldCap >= MAXIMUM_CAPACITY) {
threshold = Integer.MAX_VALUE;
return oldTab;
}
//Doubling stays below the maximum and the old capacity is at least the default
else if ((newCap = oldCap << 1) < MAXIMUM_CAPACITY &&
oldCap >= DEFAULT_INITIAL_CAPACITY)
//New threshold = old threshold doubled
newThr = oldThr << 1; // double threshold
}
//Old threshold positive: the initial capacity was stashed there
else if (oldThr > 0) // initial capacity was placed in threshold
//New capacity = stashed threshold
newCap = oldThr;
else { // zero initial threshold signifies using defaults
//New capacity = default initial capacity
newCap = DEFAULT_INITIAL_CAPACITY;
//Compute the default threshold
newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
}
//Threshold still unset
if (newThr == 0) {
//Recompute it from the new capacity
float ft = (float)newCap * loadFactor;
//Use Integer.MAX_VALUE beyond the maximum, otherwise the computed value
newThr = (newCap < MAXIMUM_CAPACITY && ft < (float)MAXIMUM_CAPACITY ?
(int)ft : Integer.MAX_VALUE);
}
//Publish the new threshold
threshold = newThr;
@SuppressWarnings({"rawtypes","unchecked"})
//Allocate the new table
Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap];
table = newTab;
//If the old table is non-null, migrate everything into the new table
if (oldTab != null) {
//Migrate the old data
for (int j = 0; j < oldCap; ++j) {
Node<K,V> e;
//Only non-empty buckets need work
if ((e = oldTab[j]) != null) {
//Clear the old slot
oldTab[j] = null;
//Single-node bucket: reindex it directly
if (e.next == null)
//Place it at its new index
newTab[e.hash & (newCap - 1)] = e;
//Tree bin
else if (e instanceof TreeNode)
//Split the tree across the two new buckets; may untreeify small halves
((TreeNode<K,V>)e).split(this, newTab, j, oldCap);
else { // preserve order
//"lo" list: nodes staying at the old index
Node<K,V> loHead = null, loTail = null;
//"hi" list: nodes moving to the new index
Node<K,V> hiHead = null, hiTail = null;
//Next node
Node<K,V> next;
//Walk the bucket
do {
//Grab the next node
next = e.next;
//(hash & oldCap) == 0: the index does not change
if ((e.hash & oldCap) == 0) {
//First such node becomes the lo head
if (loTail == null)
loHead = e;
else
//Otherwise append to the lo tail
loTail.next = e;
loTail = e;
}
//Otherwise the node joins the hi list
else {
if (hiTail == null)
//First such node becomes the hi head
hiHead = e;
else
hiTail.next = e;
hiTail = e;
}
} while ((e = next) != null);
//Map the split lists into the new table
if (loTail != null) {
loTail.next = null;
//Same index as before
newTab[j] = loHead;
}
if (hiTail != null) {
hiTail.next = null;
//Moved to index j + oldCap
newTab[j + oldCap] = hiHead;
}
}
}
}
}
//Return the new table
return newTab;
}
//Convert a bucket's linked list to a red-black tree
final void treeifyBin(Node<K,V>[] tab, int hash) {
//n: table length
//index: bucket index
//e: current node
int n, index; Node<K,V> e;
//Table empty or smaller than MIN_TREEIFY_CAPACITY: resize instead
if (tab == null || (n = tab.length) < MIN_TREEIFY_CAPACITY)
resize();
//If the bucket head is non-null
else if ((e = tab[index = (n - 1) & hash]) != null) {
//hd: list head, tl: list tail
TreeNode<K,V> hd = null, tl = null;
//Loop over the bucket
do {
//Wrap the current node e in a tree node
TreeNode<K,V> p = replacementTreeNode(e, null);
//Tail still empty
if (tl == null)
//p becomes the head
hd = p;
else {
//Link p back to the tail
p.prev = tl;
//And the tail forward to p
tl.next = p;
}
//p becomes the new tail
tl = p;
//Continue while e has a next node (head-to-tail walk)
} while ((e = e.next) != null);
//Install the rebuilt list as the bucket head
if ((tab[index] = hd) != null)
//And turn it into a red-black tree
hd.treeify(tab);
}
}
//Put all entries of another map
public void putAll(Map<? extends K, ? extends V> m) {
putMapEntries(m, true);
}
//Remove the mapping for a key
public V remove(Object key) {
Node<K,V> e;
return (e = removeNode(hash(key), key, null, false, true)) == null ?
null : e.value;
}
//Remove implementation
final Node<K,V> removeNode(int hash, Object key, Object value,
boolean matchValue, boolean movable) {
Node<K,V>[] tab; Node<K,V> p; int n, index;
//Table non-null, non-empty, and the bucket head non-null
if ((tab = table) != null && (n = tab.length) > 0 &&
(p = tab[index = (n - 1) & hash]) != null) {
Node<K,V> node = null, e; K k; V v;
//If the bucket head matches both hash and key, it is the node to delete
if (p.hash == hash &&
((k = p.key) == key || (key != null && key.equals(k))))
node = p;
//Otherwise, if there is a next node
else if ((e = p.next) != null) {
//Tree bin: search the tree
if (p instanceof TreeNode)
node = ((TreeNode<K,V>)p).getTreeNode(hash, key);
else {
//Walk the linked list
do {
//Matching hash and key: found the node
if (e.hash == hash &&
((k = e.key) == key ||
(key != null && key.equals(k)))) {
node = e;
break;
}
p = e;
} while ((e = e.next) != null);
}
}
//Node found, and either matchValue is false or the values match
if (node != null && (!matchValue || (v = node.value) == value ||
(value != null && value.equals(v)))) {
//Tree node: remove it from the tree
if (node instanceof TreeNode)
((TreeNode<K,V>)node).removeTreeNode(this, tab, movable);
//node is the bucket head: its successor becomes the head
else if (node == p)
tab[index] = node.next;
else
//Otherwise unlink it from its predecessor
p.next = node.next;
//One more structural modification
++modCount;
//One fewer mapping
--size;
afterNodeRemoval(node);
//Return the removed node
return node;
}
}
//Not found: return null
return null;
}
//Remove all mappings
public void clear() {
Node<K,V>[] tab;
modCount++;
if ((tab = table) != null && size > 0) {
size = 0;
for (int i = 0; i < tab.length; ++i)
tab[i] = null;
}
}
//true when at least one key maps to the given value
public boolean containsValue(Object value) {
Node<K,V>[] tab; V v;
if ((tab = table) != null && size > 0) {
for (int i = 0; i < tab.length; ++i) {
for (Node<K,V> e = tab[i]; e != null; e = e.next) {
if ((v = e.value) == value ||
(value != null && value.equals(v)))
return true;
}
}
}
return false;
}
//Return a Set view of the keys
public Set<K> keySet() {
Set<K> ks = keySet;
if (ks == null) {
ks = new KeySet();
keySet = ks;
}
return ks;
}
//Key set view implementation
final class KeySet extends AbstractSet<K> {
public final int size() { return size; }
public final void clear() { HashMap.this.clear(); }
public final Iterator<K> iterator() { return new KeyIterator(); }
public final boolean contains(Object o) { return containsKey(o); }
public final boolean remove(Object key) {
return removeNode(hash(key), key, null, false, true) != null;
}
public final Spliterator<K> spliterator() {
return new KeySpliterator<>(HashMap.this, 0, -1, 0, 0);
}
public final void forEach(Consumer<? super K> action) {
Node<K,V>[] tab;
if (action == null)
throw new NullPointerException();
if (size > 0 && (tab = table) != null) {
int mc = modCount;
for (int i = 0; i < tab.length; ++i) {
for (Node<K,V> e = tab[i]; e != null; e = e.next)
action.accept(e.key);
}
if (modCount != mc)
throw new ConcurrentModificationException();
}
}
}
//Return a Collection view of the values
public Collection<V> values() {
Collection<V> vs = values;
if (vs == null) {
vs = new Values();
values = vs;
}
return vs;
}
//Values view implementation
final class Values extends AbstractCollection<V> {
//Number of values
public final int size() { return size; }
//Clear the map
public final void clear() { HashMap.this.clear(); }
//Value iterator
public final Iterator<V> iterator() { return new ValueIterator(); }
//Containment check
public final boolean contains(Object o) { return containsValue(o); }
public final Spliterator<V> spliterator() {
return new ValueSpliterator<>(HashMap.this, 0, -1, 0, 0);
}
//Iterate over all values with a fail-fast check
public final void forEach(Consumer<? super V> action) {
Node<K,V>[] tab;
if (action == null)
throw new NullPointerException();
if (size > 0 && (tab = table) != null) {
int mc = modCount;
for (int i = 0; i < tab.length; ++i) {
for (Node<K,V> e = tab[i]; e != null; e = e.next)
action.accept(e.value);
}
if (modCount != mc)
throw new ConcurrentModificationException();
}
}
}
//Return a Set view of the entries
public Set<Map.Entry<K,V>> entrySet() {
Set<Map.Entry<K,V>> es;
return (es = entrySet) == null ? (entrySet = new EntrySet()) : es;
}
//Entry set view implementation
final class EntrySet extends AbstractSet<Map.Entry<K,V>> {
public final int size() { return size; }
public final void clear() { HashMap.this.clear(); }
public final Iterator<Map.Entry<K,V>> iterator() {
return new EntryIterator();
}
//Containment check
public final boolean contains(Object o) {
if (!(o instanceof Map.Entry))
return false;
Map.Entry<?,?> e = (Map.Entry<?,?>) o;
Object key = e.getKey();
Node<K,V> candidate = getNode(hash(key), key);
return candidate != null && candidate.equals(e);
}
//Remove an entry
public final boolean remove(Object o) {
if (o instanceof Map.Entry) {
Map.Entry<?,?> e = (Map.Entry<?,?>) o;
Object key = e.getKey();
Object value = e.getValue();
return removeNode(hash(key), key, value, true, true) != null;
}
return false;
}
public final Spliterator<Map.Entry<K,V>> spliterator() {
return new EntrySpliterator<>(HashMap.this, 0, -1, 0, 0);
}
public final void forEach(Consumer<? super Map.Entry<K,V>> action) {
Node<K,V>[] tab;
if (action == null)
throw new NullPointerException();
if (size > 0 && (tab = table) != null) {
int mc = modCount;
for (int i = 0; i < tab.length; ++i) {
for (Node<K,V> e = tab[i]; e != null; e = e.next)
action.accept(e);
}
if (modCount != mc)
throw new ConcurrentModificationException();
}
}
}
//Return the mapped value, or defaultValue when the key is absent
@Override
public V getOrDefault(Object key, V defaultValue) {
Node<K,V> e;
return (e = getNode(hash(key), key)) == null ? defaultValue : e.value;
}
//Put only when the key is absent (or mapped to null)
@Override
public V putIfAbsent(K key, V value) {
return putVal(hash(key), key, value, true, true);
}
//Remove only when the key maps to the given value
@Override
public boolean remove(Object key, Object value) {
return removeNode(hash(key), key, value, true, true) != null;
}
//Replace method:
//replace oldValue with newValue for the given key
@Override
public boolean replace(K key, V oldValue, V newValue) {
Node<K,V> e; V v;
//The key exists and its current value equals oldValue
if ((e = getNode(hash(key), key)) != null &&
((v = e.value) == oldValue || (v != null && v.equals(oldValue)))) {
e.value = newValue;
afterNodeAccess(e);
return true;
}
return false;
}
//Replace the value for a key
@Override
public V replace(K key, V value) {
Node<K,V> e;
if ((e = getNode(hash(key), key)) != null) {
V oldValue = e.value;
e.value = value;
afterNodeAccess(e);
return oldValue;
}
return null;
}
//Compute and cache a value when the key has no usable mapping yet
@Override
public V computeIfAbsent(K key,
Function<? super K, ? extends V> mappingFunction) {
//Null mapping function throws
if (mappingFunction == null)
throw new NullPointerException();
//Compute the hash
int hash = hash(key);
Node<K,V>[] tab; Node<K,V> first; int n, i;
int binCount = 0;
TreeNode<K,V> t = null;
Node<K,V> old = null;
//If size exceeds the threshold or the table is empty
if (size > threshold || (tab = table) == null ||
(n = tab.length) == 0)
//Initialize/resize the table and take its length
n = (tab = resize()).length;
//If the bucket head is non-null
if ((first = tab[i = (n - 1) & hash]) != null) {
//Tree bin
if (first instanceof TreeNode)
//Search the tree for the key
old = (t = (TreeNode<K,V>)first).getTreeNode(hash, key);
else {
//Linked list: walk and match
Node<K,V> e = first; K k;
do {
if (e.hash == hash &&
((k = e.key) == key || (key != null && key.equals(k)))) {
old = e;
break;
}
++binCount;
} while ((e = e.next) != null);
}
//Existing value
V oldValue;
//A non-null existing value is returned as-is
if (old != null && (oldValue = old.value) != null) {
afterNodeAccess(old);
return oldValue;
}
}
//Compute the value for the key
V v = mappingFunction.apply(key);
//A null result inserts nothing
if (v == null) {
return null;
//A node existed (with a null value)
} else if (old != null) {
//Store v in the old node and return it
old.value = v;
afterNodeAccess(old);
return v;
}
//Tree bin
else if (t != null)
//Insert into the tree
t.putTreeVal(this, tab, hash, key, v);
else {
//Otherwise prepend a new node to the bucket
tab[i] = newNode(hash, key, v, first);
//Treeify when the bin is long enough
if (binCount >= TREEIFY_THRESHOLD - 1)
treeifyBin(tab, hash);
}
//Bookkeeping
++modCount;
++size;
afterNodeInsertion(true);
//Return the computed value
return v;
}
//If the key has a non-null mapping, remap its value; a null result removes the entry
public V computeIfPresent(K key,
BiFunction<? super K, ? super V, ? extends V> remappingFunction) {
if (remappingFunction == null)
throw new NullPointerException();
Node<K,V> e; V oldValue;
int hash = hash(key);
if ((e = getNode(hash, key)) != null &&
(oldValue = e.value) != null) {
V v = remappingFunction.apply(key, oldValue);
if (v != null) {
e.value = v;
afterNodeAccess(e);
return v;
}
else
removeNode(hash, key, null, false, true);
}
return null;
}
//Like computeIfPresent, but the function also runs when the key is absent
@Override
public V compute(K key,
BiFunction<? super K, ? super V, ? extends V> remappingFunction) {
if (remappingFunction == null)
throw new NullPointerException();
int hash = hash(key);
Node<K,V>[] tab; Node<K,V> first; int n, i;
int binCount = 0;
TreeNode<K,V> t = null;
Node<K,V> old = null;
if (size > threshold || (tab = table) == null ||
(n = tab.length) == 0)
n = (tab = resize()).length;
if ((first = tab[i = (n - 1) & hash]) != null) {
if (first instanceof TreeNode)
old = (t = (TreeNode<K,V>)first).getTreeNode(hash, key);
else {
Node<K,V> e = first; K k;
do {
if (e.hash == hash &&
((k = e.key) == key || (key != null && key.equals(k)))) {
old = e;
break;
}
++binCount;
} while ((e = e.next) != null);
}
}
V oldValue = (old == null) ? null : old.value;
V v = remappingFunction.apply(key, oldValue);
if (old != null) {
if (v != null) {
old.value = v;
afterNodeAccess(old);
}
else
removeNode(hash, key, null, false, true);
}
else if (v != null) {
if (t != null)
t.putTreeVal(this, tab, hash, key, v);
else {
tab[i] = newNode(hash, key, v, first);
if (binCount >= TREEIFY_THRESHOLD - 1)
treeifyBin(tab, hash);
}
++modCount;
++size;
afterNodeInsertion(true);
}
return v;
}
//Merge a value into the map
@Override
public V merge(K key, V value,
BiFunction<? super V, ? super V, ? extends V> remappingFunction) {
//Null value throws
if (value == null)
throw new NullPointerException();
//Null remapping function throws
if (remappingFunction == null)
throw new NullPointerException();
int hash = hash(key);
Node<K,V>[] tab; Node<K,V> first; int n, i;
int binCount = 0;
TreeNode<K,V> t = null;
Node<K,V> old = null;
//Resize/initialize when over the threshold or the table is empty
if (size > threshold || (tab = table) == null ||
(n = tab.length) == 0)
n = (tab = resize()).length;
//first is the bucket head; non-null means the bucket has entries
if ((first = tab[i = (n - 1) & hash]) != null) {
//Tree bin
if (first instanceof TreeNode)
//Look up the existing node
old = (t = (TreeNode<K,V>)first).getTreeNode(hash, key);
else {
//Linked list: walk and match
Node<K,V> e = first; K k;
do {
//Same hash and key: this is the existing node
if (e.hash == hash &&
((k = e.key) == key || (key != null && key.equals(k)))) {
old = e;
break;
}
//Count the bin length
++binCount;
} while ((e = e.next) != null);
}
}
//An existing node was found
if (old != null) {
V v;
//It has a value: combine the old and new values
if (old.value != null)
v = remappingFunction.apply(old.value, value);
else
//Otherwise just take the new value
v = value;
if (v != null) {
//Store the combined value
old.value = v;
afterNodeAccess(old);
}
else
//A null result removes the node
removeNode(hash, key, null, false, true);
//Return the combined value
return v;
}
//No existing node and the value is non-null
if (value != null) {
//Tree bin
if (t != null)
//Insert into the tree
t.putTreeVal(this, tab, hash, key, value);
else {
//Otherwise prepend a new node to the bucket
tab[i] = newNode(hash, key, value, first);
//Treeify when the bin is long enough
if (binCount >= TREEIFY_THRESHOLD - 1)
treeifyBin(tab, hash);
}
//Bookkeeping
++modCount;
++size;
afterNodeInsertion(true);
}
//Return the value
return value;
}
//forEach implementation
@Override
public void forEach(BiConsumer<? super K, ? super V> action) {
Node<K,V>[] tab;
if (action == null)
throw new NullPointerException();
if (size > 0 && (tab = table) != null) {
int mc = modCount;
for (int i = 0; i < tab.length; ++i) {
for (Node<K,V> e = tab[i]; e != null; e = e.next)
action.accept(e.key, e.value);
}
if (modCount != mc)
throw new ConcurrentModificationException();
}
}
//Replace every value via the given function
@Override
public void replaceAll(BiFunction<? super K, ? super V, ? extends V> function) {
Node<K,V>[] tab;
//Null function throws
if (function == null)
throw new NullPointerException();
//Only when the table has entries
if (size > 0 && (tab = table) != null) {
//Snapshot the modification count
int mc = modCount;
//Walk the table
for (int i = 0; i < tab.length; ++i) {
//Replace each value in each bucket
for (Node<K,V> e = tab[i]; e != null; e = e.next) {
e.value = function.apply(e.key, e.value);
}
}
//Fail fast if the map was structurally modified meanwhile
if (modCount != mc)
throw new ConcurrentModificationException();
}
}
/* ------------------------------------------------------------ */
// Cloning and serialization
//Shallow copy: the keys and values themselves are not cloned
@SuppressWarnings("unchecked")
@Override
public Object clone() {
HashMap<K,V> result;
try {
result = (HashMap<K,V>)super.clone();
} catch (CloneNotSupportedException e) {
// this shouldn't happen, since we are Cloneable
throw new InternalError(e);
}
result.reinitialize();
result.putMapEntries(this, false);
return result;
}
//Load factor accessor
final float loadFactor() { return loadFactor; }
//Current table capacity
final int capacity() {
return (table != null) ? table.length :
(threshold > 0) ? threshold :
DEFAULT_INITIAL_CAPACITY;
}
//Serialization: write the map to a stream
private void writeObject(java.io.ObjectOutputStream s)
throws IOException {
int buckets = capacity();
// Write out the threshold, loadfactor, and any hidden stuff
s.defaultWriteObject();
s.writeInt(buckets);
s.writeInt(size);
internalWriteEntries(s);
}
//Deserialization: read the map from a stream
private void readObject(java.io.ObjectInputStream s)
throws IOException, ClassNotFoundException {
// Read in the threshold (ignored), loadfactor, and any hidden stuff
s.defaultReadObject();
reinitialize();
if (loadFactor <= 0 || Float.isNaN(loadFactor))
throw new InvalidObjectException("Illegal load factor: " +
loadFactor);
s.readInt(); // Read and ignore number of buckets
int mappings = s.readInt(); // Read number of mappings (size)
if (mappings < 0)
throw new InvalidObjectException("Illegal mappings count: " +
mappings);
else if (mappings > 0) { // (if zero, use defaults)
// Size the table using given load factor only if within
// range of 0.25...4.0
float lf = Math.min(Math.max(0.25f, loadFactor), 4.0f);
float fc = (float)mappings / lf + 1.0f;
int cap = ((fc < DEFAULT_INITIAL_CAPACITY) ?
DEFAULT_INITIAL_CAPACITY :
(fc >= MAXIMUM_CAPACITY) ?
MAXIMUM_CAPACITY :
tableSizeFor((int)fc));
float ft = (float)cap * lf;
threshold = ((cap < MAXIMUM_CAPACITY && ft < MAXIMUM_CAPACITY) ?
(int)ft : Integer.MAX_VALUE);
// Check Map.Entry[].class since it's the nearest public type to
// what we're actually creating.
SharedSecrets.getJavaOISAccess().checkArray(s, Map.Entry[].class, cap);
@SuppressWarnings({"rawtypes","unchecked"})
Node<K,V>[] tab = (Node<K,V>[])new Node[cap];
table = tab;
// Read the keys and values, and put the mappings in the HashMap
for (int i = 0; i < mappings; i++) {
@SuppressWarnings("unchecked")
K key = (K) s.readObject();
@SuppressWarnings("unchecked")
V value = (V) s.readObject();
putVal(hash(key), key, value, false, false);
}
}
}
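Note how `readObject` ignores the serialized bucket count and re-derives the capacity from the number of mappings: `fc = mappings / lf + 1`, with the load factor clamped to `[0.25, 4.0]`, then rounded up to the next power of two by `tableSizeFor`. The sketch below replicates that rounding step (the replica method and class name are mine, written to match the JDK 8 logic above):

```java
public class TableSizeDemo {
    static final int MAXIMUM_CAPACITY = 1 << 30;

    // Replica of JDK 8 HashMap.tableSizeFor: the smallest power of two >= cap.
    // Smearing the highest set bit of (cap - 1) rightwards fills all lower bits,
    // so adding 1 yields the next power of two.
    static int tableSizeFor(int cap) {
        int n = cap - 1;
        n |= n >>> 1;
        n |= n >>> 2;
        n |= n >>> 4;
        n |= n >>> 8;
        n |= n >>> 16;
        return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
    }

    public static void main(String[] args) {
        // 100 mappings at load factor 0.75: fc = 100 / 0.75 + 1 = 134 -> capacity 256
        int fc = (int) (100 / 0.75f + 1.0f);
        System.out.println(tableSizeFor(fc)); // 256
        System.out.println(tableSizeFor(16)); // 16 (already a power of two)
    }
}
```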
// Base iterator shared by the key, value, and entry iterators
abstract class HashIterator {
Node<K,V> next; // next entry to return
Node<K,V> current; // current entry
int expectedModCount; // for fast-fail
int index; // current slot
HashIterator() {
expectedModCount = modCount;
Node<K,V>[] t = table;
current = next = null;
index = 0;
if (t != null && size > 0) { // advance to first entry
do {} while (index < t.length && (next = t[index++]) == null);
}
}
public final boolean hasNext() {
return next != null;
}
final Node<K,V> nextNode() {
Node<K,V>[] t;
Node<K,V> e = next;
if (modCount != expectedModCount)
throw new ConcurrentModificationException();
if (e == null)
throw new NoSuchElementException();
if ((next = (current = e).next) == null && (t = table) != null) {
do {} while (index < t.length && (next = t[index++]) == null);
}
return e;
}
public final void remove() {
Node<K,V> p = current;
if (p == null)
throw new IllegalStateException();
if (modCount != expectedModCount)
throw new ConcurrentModificationException();
current = null;
K key = p.key;
removeNode(hash(key), key, null, false, false);
expectedModCount = modCount;
}
}
final class KeyIterator extends HashIterator
implements Iterator<K> {
public final K next() { return nextNode().key; }
}
final class ValueIterator extends HashIterator
implements Iterator<V> {
public final V next() { return nextNode().value; }
}
final class EntryIterator extends HashIterator
implements Iterator<Map.Entry<K,V>> {
public final Map.Entry<K,V> next() { return nextNode(); }
}
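The `expectedModCount` check in `nextNode()` is what makes these iterators fail fast: any structural modification made through the map itself during iteration throws `ConcurrentModificationException`, while `HashIterator.remove()` is safe because it resynchronizes `expectedModCount = modCount` after removing. A small illustrative sketch (class name is mine):

```java
import java.util.ConcurrentModificationException;
import java.util.HashMap;
import java.util.Iterator;

public class FailFastDemo {
    public static void main(String[] args) {
        HashMap<Integer, String> map = new HashMap<>();
        map.put(1, "a");
        map.put(2, "b");
        map.put(3, "c");

        // Removing through the map mid-iteration changes modCount under the iterator
        try {
            for (Integer k : map.keySet()) {
                map.remove(k);
            }
        } catch (ConcurrentModificationException e) {
            System.out.println("fail-fast triggered"); // printed
        }

        // Iterator.remove() resyncs expectedModCount, so this loop is safe
        map.clear();
        map.put(1, "a");
        map.put(2, "b");
        Iterator<Integer> it = map.keySet().iterator();
        while (it.hasNext()) {
            it.next();
            it.remove();
        }
        System.out.println(map.isEmpty()); // true
    }
}
```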
// Base spliterator shared by the key, value, and entry spliterators
static class HashMapSpliterator<K,V> {
final HashMap<K,V> map;
Node<K,V> current; // current node
int index; // current index, modified on advance/split
int fence; // one past the last index
int est; // size estimate
int expectedModCount; // for comodification checks
// Constructor
HashMapSpliterator(HashMap<K,V> m, int origin,
int fence, int est,
int expectedModCount) {
this.map = m;
this.index = origin;
this.fence = fence;
this.est = est;
this.expectedModCount = expectedModCount;
}
// Lazily initializes the fence and size estimate on first use
final int getFence() { // initialize fence and size on first use
int hi;
if ((hi = fence) < 0) {
HashMap<K,V> m = map;
est = m.size;
expectedModCount = m.modCount;
Node<K,V>[] tab = m.table;
hi = fence = (tab == null) ? 0 : tab.length;
}
return hi;
}
// Returns the estimated number of remaining elements
public final long estimateSize() {
getFence(); // force init
return (long) est;
}
}
// Spliterator over the keys
static final class KeySpliterator<K,V>
extends HashMapSpliterator<K,V>
implements Spliterator<K> {
KeySpliterator(HashMap<K,V> m, int origin, int fence, int est,
int expectedModCount) {
super(m, origin, fence, est, expectedModCount);
}
public KeySpliterator<K,V> trySplit() {
int hi = getFence(), lo = index, mid = (lo + hi) >>> 1;
return (lo >= mid || current != null) ? null :
new KeySpliterator<>(map, lo, index = mid, est >>>= 1,
expectedModCount);
}
// Visits every remaining key in a single pass, then checks for comodification
public void forEachRemaining(Consumer<? super K> action) {
int i, hi, mc;
if (action == null)
throw new NullPointerException();
HashMap<K,V> m = map;
Node<K,V>[] tab = m.table;
if ((hi = fence) < 0) {
mc = expectedModCount = m.modCount;
hi = fence = (tab == null) ? 0 : tab.length;
}
else
mc = expectedModCount;
if (tab != null && tab.length >= hi &&
(i = index) >= 0 && (i < (index = hi) || current != null)) {
Node<K,V> p = current;
current = null;
do {
if (p == null)
p = tab[i++];
else {
action.accept(p.key);
p = p.next;
}
} while (p != null || i < hi);
if (m.modCount != mc)
throw new ConcurrentModificationException();
}
}
// Advances by one element and applies the action if any element remains
public boolean tryAdvance(Consumer<? super K> action) {
int hi;
if (action == null)
throw new NullPointerException();
Node<K,V>[] tab = map.table;
if (tab != null && tab.length >= (hi = getFence()) && index >= 0) {
while (current != null || index < hi) {
if (current == null)
current = tab[index++];
else {
K k = current.key;
current = current.next;
action.accept(k);
if (map.modCount != expectedModCount)
throw new ConcurrentModificationException();
return true;
}
}
}
return false;
}
// Reports this spliterator's characteristics (SIZED when the estimate is exact, plus DISTINCT for keys)
public int characteristics() {
return (fence < 0 || est == map.size ? Spliterator.SIZED : 0) |
Spliterator.DISTINCT;
}
}
// Spliterator over the values
static final class ValueSpliterator<K,V>
extends HashMapSpliterator<K,V>
implements Spliterator<V> {
ValueSpliterator(HashMap<K,V> m, int origin, int fence, int est,
int expectedModCount) {
super(m, origin, fence, est, expectedModCount);
}
public ValueSpliterator<K,V> trySplit() {
int hi = getFence(), lo = index, mid = (lo + hi) >>> 1;
return (lo >= mid || current != null) ? null :
new ValueSpliterator<>(map, lo, index = mid, est >>>= 1,
expectedModCount);
}
public void forEachRemaining(Consumer<? super V> action) {
int i, hi, mc;
if (action == null)
throw new NullPointerException();
HashMap<K,V> m = map;
Node<K,V>[] tab = m.table;
if ((hi = fence) < 0) {
mc = expectedModCount = m.modCount;
hi = fence = (tab == null) ? 0 : tab.length;
}
else
mc = expectedModCount;
if (tab != null && tab.length >= hi &&
(i = index) >= 0 && (i < (index = hi) || current != null)) {
Node<K,V> p = current;
current = null;
do {
if (p == null)
p = tab[i++];
else {
action.accept(p.value);
p = p.next;
}
} while (p != null || i < hi);
if (m.modCount != mc)
throw new ConcurrentModificationException();
}
}
public boolean tryAdvance(Consumer<? super V> action) {
int hi;
if (action == null)
throw new NullPointerException();
Node<K,V>[] tab = map.table;
if (tab != null && tab.length >= (hi = getFence()) && index >= 0) {
while (current != null || index < hi) {
if (current == null)
current = tab[index++];
else {
V v = current.value;
current = current.next;
action.accept(v);
if (map.modCount != expectedModCount)
throw new ConcurrentModificationException();
return true;
}
}
}
return false;
}
public int characteristics() {
return (fence < 0 || est == map.size ? Spliterator.SIZED : 0);
}
}
// Spliterator over the entries
static final class EntrySpliterator<K,V>
extends HashMapSpliterator<K,V>
implements Spliterator<Map.Entry<K,V>> {
EntrySpliterator(HashMap<K,V> m, int origin, int fence, int est,
int expectedModCount) {
super(m, origin, fence, est, expectedModCount);
}
public EntrySpliterator<K,V> trySplit() {
int hi = getFence(), lo = index, mid = (lo + hi) >>> 1;
return (lo >= mid || current != null) ? null :
new EntrySpliterator<>(map, lo, index = mid, est >>>= 1,
expectedModCount);
}
public void forEachRemaining(Consumer<? super Map.Entry<K,V>> action) {
int i, hi, mc;
if (action == null)
throw new NullPointerException();
HashMap<K,V> m = map;
Node<K,V>[] tab = m.table;
if ((hi = fence) < 0) {
mc = expectedModCount = m.modCount;
hi = fence = (tab == null) ? 0 : tab.length;
}
else
mc = expectedModCount;
if (tab != null && tab.length >= hi &&
(i = index) >= 0 && (i < (index = hi) || current != null)) {
Node<K,V> p = current;
current = null;
do {
if (p == null)
p = tab[i++];
else {
action.accept(p);
p = p.next;
}
} while (p != null || i < hi);
if (m.modCount != mc)
throw new ConcurrentModificationException();
}
}
public boolean tryAdvance(Consumer<? super Map.Entry<K,V>> action) {
int hi;
if (action == null)
throw new NullPointerException();
Node<K,V>[] tab = map.table;
if (tab != null && tab.length >= (hi = getFence()) && index >= 0) {
while (current != null || index < hi) {
if (current == null)
current = tab[index++];
else {
Node<K,V> e = current;
current = current.next;
action.accept(e);
if (map.modCount != expectedModCount)
throw new ConcurrentModificationException();
return true;
}
}
}
return false;
}
public int characteristics() {
return (fence < 0 || est == map.size ? Spliterator.SIZED : 0) |
Spliterator.DISTINCT;
}
}
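These spliterators are what back `map.keySet().stream().parallel()` and friends: each `trySplit` hands off roughly the lower half of the remaining bucket range to a new spliterator, halving `est` as it goes. A short usage sketch (class name is mine):

```java
import java.util.HashMap;
import java.util.Spliterator;

public class SpliteratorDemo {
    public static void main(String[] args) {
        HashMap<Integer, String> map = new HashMap<>();
        for (int i = 0; i < 100; i++) {
            map.put(i, "v" + i);
        }

        Spliterator<Integer> s1 = map.keySet().spliterator();
        // trySplit moves s1's index to the midpoint and returns a spliterator
        // covering the lower half of the table's buckets
        Spliterator<Integer> s2 = s1.trySplit();

        final int[] count = {0};
        s1.forEachRemaining(k -> count[0]++);
        if (s2 != null) {
            s2.forEachRemaining(k -> count[0]++);
        }
        // The two halves together visit every key exactly once
        System.out.println(count[0]); // 100
    }
}
```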
// Creates a regular (non-tree) node
Node<K,V> newNode(int hash, K key, V value, Node<K,V> next) {
return new Node<>(hash, key, value, next);
}
// Creates a replacement plain node (used when converting a TreeNode back to a Node)
Node<K,V> replacementNode(Node<K,V> p, Node<K,V> next) {
return new Node<>(p.hash, p.key, p.value, next);
}
// Creates a tree node
TreeNode<K,V> newTreeNode(int hash, K key, V value, Node<K,V> next) {
return new TreeNode<>(hash, key, value, next);
}
// For treeifyBin
TreeNode<K,V> replacementTreeNode(Node<K,V> p, Node<K,V> next) {
return new TreeNode<>(p.hash, p.key, p.value, next);
}
// Resets to the initial default state; called by clone() and readObject()
void reinitialize() {
table = null;
entrySet = null;
keySet = null;
values = null;
modCount = 0;
threshold = 0;
size = 0;
}
// Callbacks to allow LinkedHashMap post-actions
void afterNodeAccess(Node<K,V> p) { }
void afterNodeInsertion(boolean evict) { }
void afterNodeRemoval(Node<K,V> p) { }
void internalWriteEntries(java.io.ObjectOutputStream s) throws IOException {
Node<K,V>[] tab;
if (size > 0 && (tab = table) != null) {
for (int i = 0; i < tab.length; ++i) {
for (Node<K,V> e = tab[i]; e != null; e = e.next) {
s.writeObject(e.key);
s.writeObject(e.value);
}
}
}
}
/* ------------------------------------------------------------ */
// Tree bins
// Tree bin node: the red-black tree implementation
static final class TreeNode<K,V> extends LinkedHashMap.Entry<K,V> {
TreeNode<K,V> parent; // red-black tree links (parent node)
TreeNode<K,V> left; // left child
TreeNode<K,V> right; // right child
TreeNode<K,V> prev; // needed to unlink next upon deletion
boolean red; // true if this node is red, false if black
// Constructor
TreeNode(int hash, K key, V val, Node<K,V> next) {
super(hash, key, val, next);
}
// Walks up the parent links to the root of the tree containing this node
final TreeNode<K,V> root() {
for (TreeNode<K,V> r = this, p;;) {
if ((p = r.parent) == null)
return r;
r = p;
}
}
// Ensures the given root is the first node of its bucket's linked list
static <K,V> void moveRootToFront(Node<K,V>[] tab, TreeNode<K,V> root) {
int n;
if (root != null && tab != null && (n = tab.length) > 0) {
int index = (n - 1) & root.hash;
TreeNode<K,V> first = (TreeNode<K,V>)tab[index];
if (root != first) {
Node<K,V> rn;
tab[index] = root;
TreeNode<K,V> rp = root.prev;
if ((rn = root.next) != null)
((TreeNode<K,V>)rn).prev = rp;
if (rp != null)
rp.next = rn;
if (first != null)
first.prev = root;
root.next = first;
root.prev = null;
}
assert checkInvariants(root);
}
}
// Finds the node with the given hash and key, starting from this node
final TreeNode<K,V> find(int h, Object k, Class<?> kc) {
TreeNode<K,V> p = this;
do {
int ph, dir; K pk;
TreeNode<K,V> pl = p.left, pr = p.right, q;
if ((ph = p.hash) > h)
p = pl;
else if (ph < h)
p = pr;
else if ((pk = p.key) == k || (k != null && k.equals(pk)))
return p;
else if (pl == null)
p = pr;
else if (pr == null)
p = pl;
else if ((kc != null ||
(kc = comparableClassFor(k)) != null) &&
(dir = compareComparables(kc, k, pk)) != 0)
p = (dir < 0) ? pl : pr;
else if ((q = pr.find(h, k, kc)) != null)
return q;
else
p = pl;
} while (p != null);
return null;
}
// Finds the node for the given hash and key, searching from the tree root
final TreeNode<K,V> getTreeNode(int h, Object k) {
return ((parent != null) ? root() : this).find(h, k, null);
}
// Tie-breaking order for insertion when hash codes are equal and the keys are not
// mutually comparable: orders by class name, then by identity hash code; returns -1 or 1
static int tieBreakOrder(Object a, Object b) {
int d;
if (a == null || b == null ||
(d = a.getClass().getName().
compareTo(b.getClass().getName())) == 0)
d = (System.identityHashCode(a) <= System.identityHashCode(b) ?
-1 : 1);
return d;
}
// Treeify: links the nodes reachable from this one into a red-black tree
final void treeify(Node<K,V>[] tab) {
TreeNode<K,V> root = null;
for (TreeNode<K,V> x = this, next; x != null; x = next) {
next = (TreeNode<K,V>)x.next;
x.left = x.right = null;
if (root == null) {
x.parent = null;
x.red = false;
root = x;
}
else {
K k = x.key;
int h = x.hash;
Class<?> kc = null;
for (TreeNode<K,V> p = root;;) {
int dir, ph;
K pk = p.key;
if ((ph = p.hash) > h)
dir = -1;
else if (ph < h)
dir = 1;
else if ((kc == null &&
(kc = comparableClassFor(k)) == null) ||
(dir = compareComparables(kc, k, pk)) == 0)
dir = tieBreakOrder(k, pk);
TreeNode<K,V> xp = p;
if ((p = (dir <= 0) ? p.left : p.right) == null) {
x.parent = xp;
if (dir <= 0)
xp.left = x;
else
xp.right = x;
root = balanceInsertion(root, x);
break;
}
}
}
}
moveRootToFront(tab, root);
}
// Untreeify: replaces the tree nodes with plain nodes, restoring a linked list
final Node<K,V> untreeify(HashMap<K,V> map) {
Node<K,V> hd = null, tl = null;
for (Node<K,V> q = this; q != null; q = q.next) {
Node<K,V> p = map.replacementNode(q, null);
if (tl == null)
hd = p;
else
tl.next = p;
tl = p;
}
return hd;
}
// Inserts a key/value pair into the tree; returns the existing node if the key is already present, else null
final TreeNode<K,V> putTreeVal(HashMap<K,V> map, Node<K,V>[] tab,
int h, K k, V v) {
Class<?> kc = null;
boolean searched = false;
TreeNode<K,V> root = (parent != null) ? root() : this;
for (TreeNode<K,V> p = root;;) {
int dir, ph; K pk;
if ((ph = p.hash) > h)
dir = -1;
else if (ph < h)
dir = 1;
else if ((pk = p.key) == k || (k != null && k.equals(pk)))
return p;
else if ((kc == null &&
(kc = comparableClassFor(k)) == null) ||
(dir = compareComparables(kc, k, pk)) == 0) {
if (!searched) {
TreeNode<K,V> q, ch;
searched = true;
if (((ch = p.left) != null &&
(q = ch.find(h, k, kc)) != null) ||
((ch = p.right) != null &&
(q = ch.find(h, k, kc)) != null))
return q;
}
dir = tieBreakOrder(k, pk);
}
TreeNode<K,V> xp = p;
if ((p = (dir <= 0) ? p.left : p.right) == null) {
Node<K,V> xpn = xp.next;
TreeNode<K,V> x = map.newTreeNode(h, k, v, xpn);
if (dir <= 0)
xp.left = x;
else
xp.right = x;
xp.next = x;
x.parent = x.prev = xp;
if (xpn != null)
((TreeNode<K,V>)xpn).prev = x;
moveRootToFront(tab, balanceInsertion(root, x));
return null;
}
}
}
// Removes this tree node; converts the bin back to a linked list if the tree becomes too small
final void removeTreeNode(HashMap<K,V> map, Node<K,V>[] tab,
boolean movable) {
int n;
if (tab == null || (n = tab.length) == 0)
return;
int index = (n - 1) & hash;
TreeNode<K,V> first = (TreeNode<K,V>)tab[index], root = first, rl;
TreeNode<K,V> succ = (TreeNode<K,V>)next, pred = prev;
if (pred == null)
tab[index] = first = succ;
else
pred.next = succ;
if (succ != null)
succ.prev = pred;
if (first == null)
return;
if (root.parent != null)
root = root.root();
if (root == null
|| (movable
&& (root.right == null
|| (rl = root.left) == null
|| rl.left == null))) {
tab[index] = first.untreeify(map); // too small
return;
}
TreeNode<K,V> p = this, pl = left, pr = right, replacement;
if (pl != null && pr != null) {
TreeNode<K,V> s = pr, sl;
while ((sl = s.left) != null) // find successor
s = sl;
boolean c = s.red; s.red = p.red; p.red = c; // swap colors
TreeNode<K,V> sr = s.right;
TreeNode<K,V> pp = p.parent;
if (s == pr) { // p was s's direct parent
p.parent = s;
s.right = p;
}
else {
TreeNode<K,V> sp = s.parent;
if ((p.parent = sp) != null) {
if (s == sp.left)
sp.left = p;
else
sp.right = p;
}
if ((s.right = pr) != null)
pr.parent = s;
}
p.left = null;
if ((p.right = sr) != null)
sr.parent = p;
if ((s.left = pl) != null)
pl.parent = s;
if ((s.parent = pp) == null)
root = s;
else if (p == pp.left)
pp.left = s;
else
pp.right = s;
if (sr != null)
replacement = sr;
else
replacement = p;
}
else if (pl != null)
replacement = pl;
else if (pr != null)
replacement = pr;
else
replacement = p;
if (replacement != p) {
TreeNode<K,V> pp = replacement.parent = p.parent;
if (pp == null)
root = replacement;
else if (p == pp.left)
pp.left = replacement;
else
pp.right = replacement;
p.left = p.right = p.parent = null;
}
TreeNode<K,V> r = p.red ? root : balanceDeletion(root, replacement);
if (replacement == p) { // detach
TreeNode<K,V> pp = p.parent;
p.parent = null;
if (pp != null) {
if (p == pp.left)
pp.left = null;
else if (p == pp.right)
pp.right = null;
}
}
if (movable)
moveRootToFront(tab, r);
}
// Splits a tree bin into lower and upper halves during resize; untreeifies halves that are too small
final void split(HashMap<K,V> map, Node<K,V>[] tab, int index, int bit) {
TreeNode<K,V> b = this;
// Relink into lo and hi lists, preserving order
TreeNode<K,V> loHead = null, loTail = null;
TreeNode<K,V> hiHead = null, hiTail = null;
int lc = 0, hc = 0;
for (TreeNode<K,V> e = b, next; e != null; e = next) {
next = (TreeNode<K,V>)e.next;
e.next = null;
if ((e.hash & bit) == 0) {
if ((e.prev = loTail) == null)
loHead = e;
else
loTail.next = e;
loTail = e;
++lc;
}
else {
if ((e.prev = hiTail) == null)
hiHead = e;
else
hiTail.next = e;
hiTail = e;
++hc;
}
}
if (loHead != null) {
if (lc <= UNTREEIFY_THRESHOLD)
tab[index] = loHead.untreeify(map);
else {
tab[index] = loHead;
if (hiHead != null) // (else is already treeified)
loHead.treeify(tab);
}
}
if (hiHead != null) {
if (hc <= UNTREEIFY_THRESHOLD)
tab[index + bit] = hiHead.untreeify(map);
else {
tab[index + bit] = hiHead;
if (loHead != null)
hiHead.treeify(tab);
}
}
}
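The `(e.hash & bit) == 0` test above is the general JDK 8 resize rule, where `bit` is the old capacity: since the capacity is a power of two, doubling it adds exactly one bit to the index mask, so an entry either stays at its old index or moves to `index + oldCap`. A quick sketch with plain ints (class name is mine):

```java
public class SplitDemo {
    public static void main(String[] args) {
        int oldCap = 16; // table doubling from 16 to 32
        int newCap = oldCap << 1;

        for (int hash : new int[]{5, 21, 7, 39}) {
            int oldIndex = hash & (oldCap - 1);
            int newIndex = hash & (newCap - 1);
            // The deciding bit: (hash & oldCap) == 0 -> stays put,
            // otherwise moves forward by exactly oldCap
            if ((hash & oldCap) == 0) {
                System.out.println(hash + ": stays at " + oldIndex
                        + " (new index " + newIndex + ")");
            } else {
                System.out.println(hash + ": moves from " + oldIndex
                        + " to " + newIndex);
            }
        }
        // 5 and 7 stay (bit 16 clear); 21 moves from 5 to 21; 39 stays at 7
    }
}
```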
// Left rotation around p (red-black tree rebalancing primitive)
static <K,V> TreeNode<K,V> rotateLeft(TreeNode<K,V> root,
TreeNode<K,V> p) {
TreeNode<K,V> r, pp, rl;
if (p != null && (r = p.right) != null) {
if ((rl = p.right = r.left) != null)
rl.parent = p;
if ((pp = r.parent = p.parent) == null)
(root = r).red = false;
else if (pp.left == p)
pp.left = r;
else
pp.right = r;
r.left = p;
p.parent = r;
}
return root;
}
// Right rotation around p
static <K,V> TreeNode<K,V> rotateRight(TreeNode<K,V> root,
TreeNode<K,V> p) {
TreeNode<K,V> l, pp, lr;
if (p != null && (l = p.left) != null) {
if ((lr = p.left = l.right) != null)
lr.parent = p;
if ((pp = l.parent = p.parent) == null)
(root = l).red = false;
else if (pp.right == p)
pp.right = l;
else
pp.left = l;
l.right = p;
p.parent = l;
}
return root;
}
// Rebalances the tree after inserting x, recoloring and rotating as needed
static <K,V> TreeNode<K,V> balanceInsertion(TreeNode<K,V> root,
TreeNode<K,V> x) {
x.red = true;
for (TreeNode<K,V> xp, xpp, xppl, xppr;;) {
if ((xp = x.parent) == null) {
x.red = false;
return x;
}
else if (!xp.red || (xpp = xp.parent) == null)
return root;
if (xp == (xppl = xpp.left)) {
if ((xppr = xpp.right) != null && xppr.red) {
xppr.red = false;
xp.red = false;
xpp.red = true;
x = xpp;
}
else {
if (x == xp.right) {
root = rotateLeft(root, x = xp);
xpp = (xp = x.parent) == null ? null : xp.parent;
}
if (xp != null) {
xp.red = false;
if (xpp != null) {
xpp.red = true;
root = rotateRight(root, xpp);
}
}
}
}
else {
if (xppl != null && xppl.red) {
xppl.red = false;
xp.red = false;
xpp.red = true;
x = xpp;
}
else {
if (x == xp.left) {
root = rotateRight(root, x = xp);
xpp = (xp = x.parent) == null ? null : xp.parent;
}
if (xp != null) {
xp.red = false;
if (xpp != null) {
xpp.red = true;
root = rotateLeft(root, xpp);
}
}
}
}
}
}
// Rebalances the tree after a deletion; x is the replacement node
static <K,V> TreeNode<K,V> balanceDeletion(TreeNode<K,V> root,
TreeNode<K,V> x) {
for (TreeNode<K,V> xp, xpl, xpr;;) {
if (x == null || x == root)
return root;
else if ((xp = x.parent) == null) {
x.red = false;
return x;
}
else if (x.red) {
x.red = false;
return root;
}
else if ((xpl = xp.left) == x) {
if ((xpr = xp.right) != null && xpr.red) {
xpr.red = false;
xp.red = true;
root = rotateLeft(root, xp);
xpr = (xp = x.parent) == null ? null : xp.right;
}
if (xpr == null)
x = xp;
else {
TreeNode<K,V> sl = xpr.left, sr = xpr.right;
if ((sr == null || !sr.red) &&
(sl == null || !sl.red)) {
xpr.red = true;
x = xp;
}
else {
if (sr == null || !sr.red) {
if (sl != null)
sl.red = false;
xpr.red = true;
root = rotateRight(root, xpr);
xpr = (xp = x.parent) == null ?
null : xp.right;
}
if (xpr != null) {
xpr.red = (xp == null) ? false : xp.red;
if ((sr = xpr.right) != null)
sr.red = false;
}
if (xp != null) {
xp.red = false;
root = rotateLeft(root, xp);
}
x = root;
}
}
}
else { // symmetric
if (xpl != null && xpl.red) {
xpl.red = false;
xp.red = true;
root = rotateRight(root, xp);
xpl = (xp = x.parent) == null ? null : xp.left;
}
if (xpl == null)
x = xp;
else {
TreeNode<K,V> sl = xpl.left, sr = xpl.right;
if ((sl == null || !sl.red) &&
(sr == null || !sr.red)) {
xpl.red = true;
x = xp;
}
else {
if (sl == null || !sl.red) {
if (sr != null)
sr.red = false;
xpl.red = true;
root = rotateLeft(root, xpl);
xpl = (xp = x.parent) == null ?
null : xp.left;
}
if (xpl != null) {
xpl.red = (xp == null) ? false : xp.red;
if ((sl = xpl.left) != null)
sl.red = false;
}
if (xp != null) {
xp.red = false;
root = rotateRight(root, xp);
}
x = root;
}
}
}
}
}
/**
 * Recursively checks the red-black tree invariants (link consistency, hash ordering, coloring)
 */
static <K,V> boolean checkInvariants(TreeNode<K,V> t) {
TreeNode<K,V> tp = t.parent, tl = t.left, tr = t.right,
tb = t.prev, tn = (TreeNode<K,V>)t.next;
if (tb != null && tb.next != t)
return false;
if (tn != null && tn.prev != t)
return false;
if (tp != null && t != tp.left && t != tp.right)
return false;
if (tl != null && (tl.parent != t || tl.hash > t.hash))
return false;
if (tr != null && (tr.parent != t || tr.hash < t.hash))
return false;
if (t.red && tl != null && tl.red && tr != null && tr.red)
return false;
if (tl != null && !checkInvariants(tl))
return false;
if (tr != null && !checkInvariants(tr))
return false;
return true;
}
}
}
Why does HashMap use a red-black tree rather than a fully balanced binary tree (such as an AVL tree)?
A fully balanced tree requires that the heights of every node's left and right subtrees differ by at most 1, so nearly every insertion or deletion breaks the balance and triggers rotations to restore it, making writes expensive. A red-black tree only guarantees that no root-to-leaf path is more than twice as long as any other, so it needs far fewer rotations on modification while still bounding lookups at O(log n). That trade-off suits HashMap's mix of reads and writes.
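In practice the tree conversion only happens once a bucket's chain exceeds TREEIFY_THRESHOLD (8) and the table has at least MIN_TREEIFY_CAPACITY (64) slots. The sketch below forces every key into one bucket with a constant `hashCode` to exercise that path; the `BadKey` class is purely illustrative:

```java
import java.util.HashMap;

public class TreeifyDemo {
    // Every instance collides into the same bucket; equals() distinguishes them,
    // and Comparable lets the tree order keys without falling back to tieBreakOrder.
    static final class BadKey implements Comparable<BadKey> {
        final int id;
        BadKey(int id) { this.id = id; }
        @Override public int hashCode() { return 42; }
        @Override public boolean equals(Object o) {
            return o instanceof BadKey && ((BadKey) o).id == id;
        }
        @Override public int compareTo(BadKey other) {
            return Integer.compare(id, other.id);
        }
    }

    public static void main(String[] args) {
        HashMap<BadKey, Integer> map = new HashMap<>();
        // Well past the threshold of 8: the colliding bucket is treeified,
        // so lookups in it cost O(log n) instead of walking an O(n) chain
        for (int i = 0; i < 100; i++) {
            map.put(new BadKey(i), i);
        }
        System.out.println(map.size());              // 100
        System.out.println(map.get(new BadKey(57))); // 57
    }
}
```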
You can refer to the articles in the reference list below for more detail.
Finally
HashMap is one of our most commonly used collection classes, yet many developers reach for it without considering that it is not thread-safe. Concurrent insertions and deletions, or reads that race with a resize (the transfer of entries to the new table), can lose or corrupt data and have caused real production incidents, so use it with care in multi-threaded code (prefer ConcurrentHashMap there). The above is my own reading of the source, partly translated from and informed by the articles below, and is for reference only.
References:
https://zhuanlan.zhihu.com/p/28587782
https://blog.csdn.net/xufei5789651/article/details/123518630
https://blog.csdn.net/anlian523/article/details/103639094
https://blog.csdn.net/abcdef314159/article/details/77193888
https://pdai.tech/md/java/collection/java-map-HashMap&HashSet.html
https://www.bilibili.com/video/BV1k4411J7yq/?vd_source=7d0e42b081e08cb3cefaea55cc1fa8b7