Redis pipeline mechanism (pipeline)


    Redis's pipeline mechanism is designed for batch reads and writes. If you read or write to Redis many times, one command at a time, each command pays a full network round trip, which wastes time and resources. With a pipeline, a single connection is used and the reads or writes are sent to the server in one batch.

Insert performance

 

      Without pipelining: 20,000 records, execution time over 1 minute

      With pipelining: 20,000 records, execution time 1~2 seconds
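The gap is dominated by network round trips rather than server-side work. A back-of-the-envelope sketch (the ~3 ms round-trip time is an assumed figure, not measured in the original post):

```java
public class PipelineMath {
    public static void main(String[] args) {
        int commands = 20_000;  // "2W" = 20,000 records
        double rttMs = 3.0;     // assumed network round-trip time per command

        // Without a pipeline: every command pays a full round trip.
        double withoutPipelineSec = commands * rttMs / 1000.0;

        // With a pipeline: roughly one round trip for the whole batch
        // (ignoring server-side processing, which is comparatively small).
        double withPipelineSec = rttMs / 1000.0;

        System.out.printf("without pipeline: ~%.0f s%n", withoutPipelineSec);
        System.out.printf("with pipeline:    ~%.3f s%n", withPipelineSec);
    }
}
```

Under that assumption the unpipelined run lands around a minute, which is consistent with the numbers above.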

Code explanation

    public boolean add(final List<T> list) {
        final boolean result = stringRedisTemplate.execute(new RedisCallback<Boolean>() {
            @Override
            public Boolean doInRedis(final RedisConnection connection)
                    throws DataAccessException {
                for (final T object : list) {
                    final Map<String, String> hash = RedisMapUtil.toMap(object);
                    final Map<byte[], byte[]> hashes = new LinkedHashMap<byte[], byte[]>(hash.size());
                    for (final Map.Entry<String, String> entry : hash.entrySet()) {
                        hashes.put(rawHashKey(entry.getKey()), rawHashValue(entry.getValue()));
                    }
                    final byte[] key = stringRedisTemplate.getStringSerializer().serialize(getDefaultKey(object.getId()));
                    connection.hMSet(key, hashes);
                }
                return true;
            }
        }, false, true); // exposeConnection = false, pipeline = true
        return result;
    }

    public boolean addString(final Map<String, String> map) {
        final boolean result = stringRedisTemplate.execute(new RedisCallback<Boolean>() {
            @Override
            public Boolean doInRedis(final RedisConnection connection)
                    throws DataAccessException {
                for (final Map.Entry<String, String> entry : map.entrySet()) {
                    final byte[] key = stringRedisTemplate.getStringSerializer().serialize(entry.getKey());
                    final byte[] value = stringRedisTemplate.getStringSerializer().serialize(entry.getValue());
                    connection.set(key, value);
                }
                return true;
            }
        }, false, true);
        return result;
    }

    @SuppressWarnings("unchecked")
    private <HK> byte[] rawHashKey(final HK hashKey) {
        Assert.notNull(hashKey, "non null hash key required");
        if (stringRedisTemplate.getHashKeySerializer() == null && hashKey instanceof byte[]) {
            return (byte[]) hashKey;
        }

        final RedisSerializer<HK> serializer = (RedisSerializer<HK>) stringRedisTemplate.getHashKeySerializer();
        return serializer.serialize(hashKey);
    }

    @SuppressWarnings("unchecked")
    private <HV> byte[] rawHashValue(final HV value) {
        if (stringRedisTemplate.getHashValueSerializer() == null && value instanceof byte[]) {
            return (byte[]) value;
        }

        final RedisSerializer<HV> serializer = (RedisSerializer<HV>) stringRedisTemplate.getHashValueSerializer();
        return serializer.serialize(value);
    }

The methods above are my own implementation.

  • First, call redisTemplate.execute(RedisCallback<T> action, boolean exposeConnection, boolean pipeline); note that exposeConnection = false and pipeline = true.
  • The first argument is an anonymous inner class whose callback receives a connection; this is the connection on which the batched commands are issued.
  • To save an object, first convert it to a Map<String, String>, then serialize each hash key and value to byte[]; for plain strings, only String-to-byte[] serialization of the key and value is needed, which is much simpler.
  • Finally, call connection.hMSet(key, hashes); for an object, or connection.set(key, value); for a plain string.
  • The key point is that the commands are only queued on the connection inside the loop; they are all sent to the server in a single batch when the callback returns.
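Newer Spring Data Redis versions wrap this same pattern in RedisTemplate.executePipelined(RedisCallback). A minimal sketch, assuming a configured StringRedisTemplate bean and a reachable Redis server (not part of the original post):

```java
// Sketch only: same batching as addString above, via executePipelined.
public boolean addStringPipelined(final Map<String, String> map) {
    stringRedisTemplate.executePipelined(new RedisCallback<Object>() {
        @Override
        public Object doInRedis(RedisConnection connection) throws DataAccessException {
            RedisSerializer<String> serializer = stringRedisTemplate.getStringSerializer();
            for (Map.Entry<String, String> entry : map.entrySet()) {
                connection.set(serializer.serialize(entry.getKey()),
                               serializer.serialize(entry.getValue()));
            }
            return null; // executePipelined requires the callback to return null
        }
    });
    return true;
}
```

The advantage over the three-argument execute(...) is that executePipelined opens and closes the pipeline for you and returns the deserialized replies as a List<Object>.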

Reference

http://xinklabi.iteye.com/blog/2195547
