Sum of values from hashmap using streams

kAmol :

I have a HashMap of huge size (around 10^60 entries). I am putting values into it one entry at a time. The problem is to get the sum of the values for a given range of keys. For example: suppose a simple HashMap has entries with keys from 0 to 1000, and every key maps to a value (a BigInteger). Now the problem is to get the sum of the values for the key range (say) 37 to 95.

I have tried it with an iterator, but for a huge map of size 10^60 this is a time-consuming operation over a large range of keys.

I am trying it with streams, but as I am new to streams/parallel streams, I don't have a clear idea of how to do it.

import java.math.BigInteger;
import java.util.concurrent.atomic.AtomicReference;

BigInteger index1 = new BigInteger(array[1]); // array[1] holds one end of the range
BigInteger index2 = new BigInteger(array[2]); // array[2] holds the other end
BigInteger max = index1.max(index2); // upper bound of the range
BigInteger min = index1.min(index2); // lower bound of the range

AtomicReference<Long> atomicSum = new AtomicReference<>(0L);
hashMap.entrySet().parallelStream()
    .forEach(e -> {
        // keep only keys strictly between min and max
        if (e.getKey().compareTo(min) > 0 && e.getKey().compareTo(max) < 0) {
            atomicSum.accumulateAndGet(e.getValue().longValue(), Long::sum);
        }
    });

I searched over SO, but the few related questions deal with lists or don't use streams. Please also suggest whether any improvement is possible, such as using some other data structure instead of HashMap.

Naman :

You seem to be looking for something like:

BigInteger sumOfValues = hashMap.entrySet().stream()
        .filter(e -> e.getKey().compareTo(min) > 0 && e.getKey().compareTo(max) < 0)
        .map(Map.Entry::getValue)
        .reduce(BigInteger.ZERO, BigInteger::add);
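
This replaces the side-effecting forEach with a filter-map-reduce pipeline. Since the reduction is associative and has no shared mutable state, it also stays correct if you switch to parallelStream(), with no AtomicReference needed.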

or, stated as in your code, summing into a Long:

Long sumOfValues = hashMap.entrySet().stream()
        .filter(e -> e.getKey().compareTo(min) > 0 && e.getKey().compareTo(max) < 0)
        .map(Map.Entry::getValue)
        .reduce(BigInteger.ZERO, BigInteger::add).longValue();
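
On the data-structure question: if range queries are the main operation, a sorted map fits better than a HashMap, because it can locate the key range directly instead of filtering every entry. A minimal sketch using TreeMap (sumRange is a hypothetical helper; the exclusive bounds mirror the filter above):

import java.math.BigInteger;
import java.util.NavigableMap;
import java.util.TreeMap;

public class RangeSum {
    // Hypothetical helper: sums the values whose keys lie strictly between
    // min and max, mirroring the exclusive bounds of the stream version above.
    static BigInteger sumRange(NavigableMap<BigInteger, BigInteger> map,
                               BigInteger min, BigInteger max) {
        return map.subMap(min, false, max, false) // view of just the key range
                  .values()
                  .stream()
                  .reduce(BigInteger.ZERO, BigInteger::add);
    }

    public static void main(String[] args) {
        NavigableMap<BigInteger, BigInteger> map = new TreeMap<>();
        for (int i = 0; i <= 1000; i++) { // sample data: key i -> value i
            map.put(BigInteger.valueOf(i), BigInteger.valueOf(i));
        }
        System.out.println(sumRange(map,
                BigInteger.valueOf(37), BigInteger.valueOf(95))); // 38+...+94 = 3762
    }
}

subMap returns a view, so locating the range costs O(log n); the sum then only touches the entries inside the range rather than the whole map.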
