Look at the code:
HashMap
package com.hash;

import java.util.HashMap;
import java.util.Map;

public class HashMapTest {
    /**
     * NUMBER = 50: 50 threads each execute the put method 50 times.
     * If the map were thread-safe, its size should end up being 2500.
     */
    public static final int NUMBER = 50;

    public static void main(String[] args) {
        Map<String, String> map = new HashMap<>();
        for (int i = 0; i < NUMBER; i++) {
            new Thread(new HashMapTask(map)).start();
        }
        try {
            Thread.sleep(3000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        System.out.println("map size = " + map.size());
    }
}

class HashMapTask implements Runnable {
    Map<String, String> map;

    public HashMapTask(Map<String, String> map) {
        this.map = map;
    }

    @Override
    public void run() {
        for (int i = 0; i < HashMapTest.NUMBER; i++) {
            map.put(i + "-" + Thread.currentThread().getName(), "test");
        }
    }
}
Result:
We start 50 threads that add elements to a shared HashMap, and each thread executes the put method 50 times. If HashMap were thread-safe, the map should hold 2500 key-value pairs, but the printed size is usually less than 2500 (this particular test does not run into an infinite loop, although a concurrent resize of a pre-JDK-8 HashMap can cause one).
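One way to confirm that the race is the culprit is to run the same workload with the HashMap wrapped by Collections.synchronizedMap, so every call goes through a single shared lock. This is a sketch that is not part of the original test; it also waits with join() instead of sleeping for a fixed 3 seconds, which makes the result deterministic:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SynchronizedMapDemo {
    static final int NUMBER = 50;

    static int run() throws InterruptedException {
        // Same HashMap as before, but every call now acquires one shared lock.
        Map<String, String> map = Collections.synchronizedMap(new HashMap<>());
        List<Thread> threads = new ArrayList<>();
        for (int t = 0; t < NUMBER; t++) {
            Thread thread = new Thread(() -> {
                for (int i = 0; i < NUMBER; i++) {
                    map.put(i + "-" + Thread.currentThread().getName(), "test");
                }
            });
            thread.start();
            threads.add(thread);
        }
        for (Thread thread : threads) {
            thread.join(); // wait for completion instead of sleeping blindly
        }
        return map.size();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("map size = " + run()); // prints: map size = 2500
    }
}
```

With the wrapper in place the size is reliably 2500, at the cost of serializing every operation on one lock.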
Hashtable
package com.hash;

import java.util.Hashtable;

public class HashTableTest {
    /**
     * NUMBER = 50: 50 threads each execute the put method 50 times.
     * Because Hashtable is thread-safe, the table size should be 2500.
     */
    public static final int NUMBER = 50;

    public static void main(String[] args) {
        Hashtable<String, String> table = new Hashtable<>();
        for (int i = 0; i < NUMBER; i++) {
            new Thread(new HashTableTask(table)).start();
        }
        try {
            Thread.sleep(3000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        System.out.println("table size = " + table.size());
    }
}

class HashTableTask implements Runnable {
    Hashtable<String, String> table;

    public HashTableTask(Hashtable<String, String> table) {
        this.table = table;
    }

    @Override
    public void run() {
        for (int i = 0; i < HashTableTest.NUMBER; i++) {
            table.put(i + "-" + Thread.currentThread().getName(), "test");
        }
    }
}
Result:
No matter how many times it is run, the result is size = 2500.
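A caveat this result can hide: Hashtable only makes individual calls atomic. A check-then-act sequence such as "if absent, then put" is still a race between two calls, and callers must synchronize on the table themselves. A brief sketch (the race itself is timing-dependent, so only the correct pattern is shown):

```java
import java.util.Hashtable;

public class CompoundOpDemo {
    public static void main(String[] args) {
        Hashtable<String, Integer> table = new Hashtable<>();

        // NOT atomic as a whole: another thread could put between
        // containsKey and put, even though each single call is synchronized.
        if (!table.containsKey("hits")) {
            table.put("hits", 0);
        }

        // Atomic: Hashtable methods synchronize on the table itself,
        // so holding its lock across the sequence excludes other callers.
        synchronized (table) {
            Integer current = table.get("hits");
            table.put("hits", current + 1);
        }

        System.out.println("hits = " + table.get("hits")); // prints: hits = 1
    }
}
```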
ConcurrentHashMap
package com.hash;

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentHashMapTest {
    /**
     * NUMBER = 50: 50 threads each execute the put method 50 times.
     * Because ConcurrentHashMap is thread-safe, the map size should be 2500.
     */
    public static final int NUMBER = 50;

    public static void main(String[] args) {
        Map<String, String> map = new ConcurrentHashMap<>();
        for (int i = 0; i < NUMBER; i++) {
            new Thread(new ConcurrentHashMapTask(map)).start();
        }
        try {
            Thread.sleep(3000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        System.out.println("map size = " + map.size());
    }
}

class ConcurrentHashMapTask implements Runnable {
    Map<String, String> map;

    public ConcurrentHashMapTask(Map<String, String> map) {
        this.map = map;
    }

    @Override
    public void run() {
        for (int i = 0; i < ConcurrentHashMapTest.NUMBER; i++) {
            map.put(i + "-" + Thread.currentThread().getName(), "test");
        }
    }
}
Result:
No matter how many times it is run, the result is size = 2500.
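Unlike Hashtable, ConcurrentHashMap also offers atomic compound operations, so the check-then-act patterns that would otherwise need an external lock can be expressed in a single call. A short sketch of two of them, putIfAbsent and merge:

```java
import java.util.concurrent.ConcurrentHashMap;

public class AtomicOpsDemo {
    public static void main(String[] args) {
        ConcurrentHashMap<String, Integer> map = new ConcurrentHashMap<>();

        // Atomic "insert only if the key is missing" - no external lock needed.
        map.putIfAbsent("hits", 0);

        // Atomic read-modify-write: adds 1 to the existing value.
        map.merge("hits", 1, Integer::sum);
        map.merge("hits", 1, Integer::sum);

        System.out.println("hits = " + map.get("hits")); // prints: hits = 2
    }
}
```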
To sum up:
1. HashMap is not thread-safe.
2. Hashtable and ConcurrentHashMap are thread-safe.
What if I want thread safety, but also efficiency?
Use ConcurrentHashMap. In JDK 7 it is implemented as an array plus linked lists, with the map divided into N Segments (sub-maps): it provides the same thread safety, but with roughly N times the concurrency, 16 times by default.
The main reason Hashtable is slow is that put and its other operations are synchronized methods, and that synchronized lock covers the entire hash table: every thread must hold an exclusive lock on the whole table for each call. ConcurrentHashMap instead stores its entries in an array of Segments, i.e. the whole hash table is divided into several segments, and each Segment behaves like a small Hashtable. A put therefore first uses the hash to locate the Segment the key belongs to and then locks only that Segment, which is why ConcurrentHashMap can handle concurrent puts from many threads at once. (Note that since JDK 8 the Segment design has been replaced by per-bucket locking with CAS, which scales even better.)
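The Segment idea is a form of lock striping, and it can be illustrated with a toy class. This StripedMap is hypothetical and not real JDK code: it keeps N independent maps, each guarded by its own lock, so threads hashing to different stripes never contend, just as puts to different Segments never contend in JDK 7 ConcurrentHashMap.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.locks.ReentrantLock;

public class StripedMap<K, V> {
    private static final int STRIPES = 16; // same default as JDK 7 ConcurrentHashMap

    private final Map<K, V>[] buckets;
    private final ReentrantLock[] locks;

    @SuppressWarnings("unchecked")
    public StripedMap() {
        buckets = new Map[STRIPES];
        locks = new ReentrantLock[STRIPES];
        for (int i = 0; i < STRIPES; i++) {
            buckets[i] = new HashMap<>();
            locks[i] = new ReentrantLock();
        }
    }

    // Map a key to one of the N stripes (mask off the sign bit first).
    private int stripeFor(Object key) {
        return (key.hashCode() & 0x7fffffff) % STRIPES;
    }

    public V put(K key, V value) {
        int s = stripeFor(key); // first locate the stripe, like locating a Segment
        locks[s].lock();        // then lock only that stripe, not the whole table
        try {
            return buckets[s].put(key, value);
        } finally {
            locks[s].unlock();
        }
    }

    public V get(K key) {
        int s = stripeFor(key);
        locks[s].lock();
        try {
            return buckets[s].get(key);
        } finally {
            locks[s].unlock();
        }
    }

    public static void main(String[] args) {
        StripedMap<String, String> map = new StripedMap<>();
        map.put("a", "1");
        System.out.println(map.get("a")); // prints: 1
    }
}
```

The real JDK 7 implementation is far more sophisticated (reads are mostly lock-free, and each Segment is itself a hash table), but the locking structure follows this shape.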