Redis fuzzy-operation problem: analysis and solution (based on RedisTemplate)

Background

		Redis often holds many keys that share the same prefix, and sometimes we need to query, delete, or insert them in bulk. When the number of keys grows large, the CPU suffers badly: Redis serves clients over TCP with a single-threaded, multiplexed event loop, and each command travels as its own request, so later commands block until the earlier ones have been processed. This creates a hidden risk that only surfaces at scale: when we fetch keys one by one in a loop, every round trip opens and closes, and the time adds up.

Solution

This applies to Redis 2.8 and above; versions that are too old apparently cannot do it.

This is where the pipeline comes in. Pipelining is a technique for reducing the latency caused by blocking when a large number of commands are executed one after another.

The principle is simple: the pipeline sends all the commands at once, avoiding the network overhead of sending and receiving each one individually. Redis receives the batch of commands, executes them in sequence, and then packages the results and returns them to the client.

Based on the cache data structures actually used in the project: for String values, use RedisTemplate's multiGet method; for Hash values, use a pipeline to combine the commands and send them to Redis in one batch.

In other words, the batching is effectively invisible to the caller.
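For the Hash case, a minimal sketch of that pipelined batch read follows. It assumes a String-serialized RedisTemplate field named redisTemplate and a hypothetical List<String> named hashKeys; it is an illustration of the pipeline idea, not code from the original article.

    // Pipeline a batch of HGETALL calls so every hash comes back in a single round trip.
    List<Object> hashes = redisTemplate.executePipelined((RedisCallback<Object>) connection -> {
        StringRedisConnection conn = (StringRedisConnection) connection;
        for (String key : hashKeys) {
            conn.hGetAll(key);   // one queued command per hash key
        }
        return null;
    });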

Scenario 1

Batch insert/query the values of keys with a specified prefix; use multiGet to fetch the values in one batch.

Full example:

/**
 * Batch-get the values of the given keys.
 *
 * @param keys the keys to look up
 * @return the values, in the same order as the keys
 */
public List<Object> getMul(Collection<String> keys) {
    // Copy into a List instead of casting, since the Collection may not be a List.
    List<String> list = new ArrayList<>(keys);

    // Option 1: pipeline through a SessionCallback, operating on the callback's RedisOperations
    redisTemplate.executePipelined(new SessionCallback<Object>() {
        @Override
        @SuppressWarnings("unchecked")
        public <K, V> Object execute(RedisOperations<K, V> redisOperations) throws DataAccessException {
            RedisOperations<String, Object> operations = (RedisOperations<String, Object>) redisOperations;
            for (String s : list) {
                // query
                operations.opsForValue().get(s);
                // insert
                operations.opsForValue().set(s, "testValue");
            }
            return null;
        }
    });

    // Option 2: pipeline through a RedisCallback on the raw connection
    redisTemplate.executePipelined(new RedisCallback<Object>() {
        @Override
        public Object doInRedis(RedisConnection redisConnection) throws DataAccessException {
            StringRedisConnection stringRedisConnection = (StringRedisConnection) redisConnection;
            for (String s : list) {
                // query
                stringRedisConnection.get(s);
                // insert
                stringRedisConnection.set(s, "testValue");
            }
            return null;
        }
    });

    return redisTemplate.opsForValue().multiGet(keys);
}
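A note on a design choice above: executePipelined itself returns the replies of all pipelined commands as a List<Object>, so a read-only lookup does not strictly need the final multiGet. A minimal sketch, assuming the same redisTemplate field and a List<String> named keys:

    // Collect pipelined GET replies directly; they come back in the order the commands were issued.
    List<Object> values = redisTemplate.executePipelined((RedisCallback<Object>) connection -> {
        StringRedisConnection conn = (StringRedisConnection) connection;
        for (String s : keys) {
            conn.get(s);
        }
        return null; // the callback itself must return null when pipelining
    });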

Scenario 2

To delete the keys with a specified prefix in batches, there are two options:
1. Use the keys command to fuzzy-match them and then call delete to remove them, but when there are many keys the matching is too slow and hurts performance.
2. Handle this cache in two steps. Keys that will be deleted in batches obviously had to be inserted first, and batch insertion works the same way as single insertion, so there is nothing more to say about that. The extra step happens at insert time: besides writing the value, also push the key into a list that tracks every key sharing the prefix, as in the sketch below.
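The original figure is not available; the following is a minimal sketch of that extra step. The prefix, the tracking-list key, and the method name are illustrative assumptions, not the original author's code.

    // Hypothetical prefix and tracking-list key (assumptions, the figure is missing)
    private static final String PREFIX = "demo:";
    private static final String KEY_LIST = "demo:key-list";

    public void putWithTracking(String id, Object value) {
        String key = PREFIX + id;
        // normal insert
        redisTemplate.opsForValue().set(key, value);
        // extra step: record the key so it can be bulk-deleted later
        redisTemplate.opsForList().rightPush(KEY_LIST, key);
    }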
With that in place, deleting the keys under the prefix is straightforward: first read all the keys out of that list, then call delete. The delete method that comes with RedisTemplate already supports batch deletion, so there is no need to bring in a pipeline for this step. A sketch follows below.
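Again, the original figure is not available; this is a minimal sketch reusing the hypothetical KEY_LIST from the insert step above.

    public void deleteByPrefix() {
        // read back every key recorded under the prefix
        List<Object> keys = redisTemplate.opsForList().range(KEY_LIST, 0, -1);
        if (keys != null && !keys.isEmpty()) {
            List<String> toDelete = keys.stream().map(Object::toString).collect(Collectors.toList());
            // RedisTemplate.delete(Collection) removes the keys in one batch call
            redisTemplate.delete(toDelete);
        }
        // finally drop the tracking list itself
        redisTemplate.delete(KEY_LIST);
    }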

This avoids fuzzy-matching across all keys. Of course, this approach has its own details to think through, such as keeping the tracking list consistent with the keys that actually exist.


Origin: blog.csdn.net/NICVSY/article/details/112984072