Is the following code thread safe?

Amit:

I have a scenario where I have to maintain a Map that can be populated by multiple threads, each modifying its respective List (the unique identifier/key being the thread name). When the list size for a thread exceeds a fixed batch size, we have to persist the records to the DB.

Sample code below:

private volatile ConcurrentHashMap<String, List<T>> instrumentMap = new ConcurrentHashMap<String, List<T>>();
private ReadWriteLock lock;

public void addAll(List<T> entityList, String threadName) {
    try {
        lock.readLock().lock();
        List<T> instrumentList = instrumentMap.get(threadName);
        if(instrumentList == null) {
            instrumentList = new ArrayList<T>(batchSize);
            instrumentMap.put(threadName, instrumentList);
        }

        if(instrumentList.size() >= batchSize -1){
            instrumentList.addAll(entityList);
            recordSaver.persist(instrumentList); 
            instrumentList.clear();
        } else {
            instrumentList.addAll(entityList);  
        }
    } finally {
        lock.readLock().unlock();
    }

}

There is one more separate thread that runs every 2 minutes to persist all the records in the Map (to make sure we have something persisted every 2 minutes and the map size does not get too big). When it starts, it blocks all other threads (see the readLock and writeLock usage, where the writeLock has higher priority):

if(//Some condition) {
    Thread.sleep(//2 minutes);
    aggregator.getLock().writeLock().lock();
    List<T> instrumentList = instrumentMap.values().stream().flatMap(x -> x.stream()).collect(Collectors.toList());
    if(instrumentList.size() > 0) {
        saver.persist(instrumentList);
        instrumentMap.values().parallelStream().forEach(x -> x.clear());
    }
    aggregator.getLock().writeLock().unlock();
}

This solution is working fine for almost every scenario we tested, except that sometimes we see some records go missing, i.e. they are not persisted at all, even though they were added to the Map without any problem.

My question is: what is the problem with this code? Is ConcurrentHashMap not the best solution here? Does the usage of the read/write lock have some problem here? Should I go with sequential processing?

Andy Turner:

No, it's not thread safe.

The problem is that you are using the read lock of the ReadWriteLock. This doesn't guarantee exclusive access for making updates. You'd need to use the write lock for that.
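To see why, here is a minimal standalone sketch (not your code, just an illustration): both threads acquire the read lock at the same time, because a read lock is shared rather than exclusive, so it gives no mutual exclusion for the check-then-put and the ArrayList mutations done while holding it.

import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class ReadLockDemo {
    public static void main(String[] args) throws InterruptedException {
        ReadWriteLock lock = new ReentrantReadWriteLock();

        Runnable task = () -> {
            lock.readLock().lock();
            try {
                // Both threads print this at roughly the same time: the read
                // lock is shared, so two writers can be inside the "locked"
                // section together.
                System.out.println(Thread.currentThread().getName() + " holds the read lock");
                Thread.sleep(500);
            } catch (InterruptedException ignored) {
                Thread.currentThread().interrupt();
            } finally {
                lock.readLock().unlock();
            }
        };

        Thread t1 = new Thread(task, "writer-1");
        Thread t2 = new Thread(task, "writer-2");
        t1.start();
        t2.start();
        t1.join();
        t2.join();
    }
}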

But you don't really need to use a separate lock at all. You can simply use the ConcurrentHashMap.compute method:

instrumentMap.compute(threadName, (tn, instrumentList) -> {
  if (instrumentList == null) {
    instrumentList = new ArrayList<>();
  }

  if(instrumentList.size() >= batchSize -1) {
    instrumentList.addAll(entityList); 
    recordSaver.persist(instrumentList); 
    instrumentList.clear();
  } else {
    instrumentList.addAll(entityList);
  }

  return instrumentList;
});

This allows you to update items in the list whilst also guaranteeing exclusive access to the list for a given key.
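As a rough standalone demonstration of that guarantee (hypothetical names, not your classes): two threads hammering compute on the same key never lose an addition, because the remapping function runs atomically per key.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentHashMap;

public class ComputeDemo {
    public static void main(String[] args) throws InterruptedException {
        ConcurrentHashMap<String, List<Integer>> map = new ConcurrentHashMap<>();

        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) {
                map.compute("shared-key", (key, list) -> {
                    if (list == null) {
                        list = new ArrayList<>();
                    }
                    list.add(1);
                    return list;
                });
            }
        };

        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // The remapping function is atomic per key, so nothing is lost:
        // this always prints 20000.
        System.out.println(map.get("shared-key").size());
    }
}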

I suspect that you could split the compute call into computeIfAbsent (to add the list if one is not there) and then a computeIfPresent (to update/persist the list): the atomicity of these two operations is not necessary here. But there is no real point in splitting them up.
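For illustration, that split might look roughly like this (same names as above; as said, the single compute call is simpler):

// Ensure a list exists for the key, then update/persist it. Each call is
// atomic per key on its own, but the two calls together are not, which,
// as noted, doesn't matter here.
instrumentMap.computeIfAbsent(threadName, tn -> new ArrayList<>());

instrumentMap.computeIfPresent(threadName, (tn, instrumentList) -> {
    if (instrumentList.size() >= batchSize - 1) {
        instrumentList.addAll(entityList);
        recordSaver.persist(instrumentList);
        instrumentList.clear();
    } else {
        instrumentList.addAll(entityList);
    }
    return instrumentList;
});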


Additionally, instrumentMap almost certainly shouldn't be volatile. Unless you really want to reassign its value (given this code, I doubt that), remove volatile and make it final.

Similarly, non-final locks are questionable too. If you stick with using a lock, make that final too.
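Roughly, the field declarations would then look like this (a sketch; ReentrantReadWriteLock from java.util.concurrent.locks is assumed as the lock implementation):

// The map reference itself never changes, so final (not volatile) is right;
// the per-key safety comes from ConcurrentHashMap.compute, not from this field.
private final ConcurrentHashMap<String, List<T>> instrumentMap = new ConcurrentHashMap<>();

// Only needed if you keep an explicit lock at all.
private final ReadWriteLock lock = new ReentrantReadWriteLock();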
