The Kafka consumer out-of-order consumption problem

Problem:

Reproduction:

Producing the data: an add operation followed by a delete:

// simulate an add followed by a delete on the same product
Message message = new Message(UUID.randomUUID().toString(), "add product", new Date());
kafkaTemplate.send("product", JSON.toJSONString(message));
Message message1 = new Message(UUID.randomUUID().toString(), "delete product", new Date());
kafkaTemplate.send("product", JSON.toJSONString(message1));

In this experiment the Kafka cluster has 3 nodes and the topic was created with 3 partitions, so every send has to pick one of them. The producer chooses a partition as follows:

(1) If a partition is specified, it is used directly.

(2) If no partition is specified but a key is, the key is hashed to select a partition.

(3) If neither a partition nor a key is specified, a partition is chosen by round-robin.
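The three rules above can be sketched as a toy partitioner. This is illustrative only: Kafka's real DefaultPartitioner uses murmur2 hashing (and sticky batching in newer clients), not String.hashCode, and the class and method names here are mine.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Toy model of the three partition-selection rules above.
// Not Kafka's real DefaultPartitioner: that one hashes keys with murmur2.
public class ToyPartitioner {
    private final int numPartitions;
    private final AtomicInteger counter = new AtomicInteger(0);

    public ToyPartitioner(int numPartitions) {
        this.numPartitions = numPartitions;
    }

    public int choose(Integer explicitPartition, String key) {
        // (1) an explicitly specified partition wins
        if (explicitPartition != null) {
            return explicitPartition;
        }
        // (2) no partition, but a key: hash the key to a partition,
        //     so the same key always lands on the same partition
        if (key != null) {
            return Math.abs(key.hashCode() % numPartitions);
        }
        // (3) neither: plain round-robin
        return counter.getAndIncrement() % numPartitions;
    }

    public static void main(String[] args) {
        ToyPartitioner p = new ToyPartitioner(3);
        System.out.println(p.choose(1, null));    // rule (1): 1
        System.out.println(p.choose(null, "product-42")
                        == p.choose(null, "product-42")); // rule (2): true
        System.out.println(p.choose(null, null)); // rule (3): 0
        System.out.println(p.choose(null, null)); // rule (3): 1
    }
}
```

Rule (2) is what makes keys useful for ordering: two messages sent with the same key are guaranteed to land on the same partition.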

Consuming the data:
Consumer-A 

2019-12-13 20:29:26.437 [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] INFO  c.i.k.spring_kafka.KafkaListeners - 消费数据入库----Message{id='e819cb58-4ac2-485e-84dc-8a782c68f631', msg='add product', sendTime=Fri Dec 13 20:29:25 CST 2019}

2019-12-13 20:29:26.437---->add product

Consumer-A1

2019-12-13 20:29:26.384 [org.springframework.kafka.KafkaListenerEndpointContainer#0-9-C-1] INFO  c.i.k.spring_kafka.KafkaListeners - 消费数据入库----Message{id='bc895e02-ce52-46a4-86de-259d40c16b82', msg='delete product', sendTime=Fri Dec 13 20:29:26 CST 2019}

2019-12-13 20:29:26.384----->delete product

The consumers processed the delete before the add, so the database ended up with dirty data!

Cause:

The producer distributed the two messages across partitions by round-robin, and each consumer in the group is assigned its own partitions (here, one consumer may own one partition or two). Kafka only guarantees ordering within a single partition, so once the add and the delete land on different partitions, they are consumed by different consumers and their relative order is no longer guaranteed.

Solution:

Have the producer send both messages to the same partition. Within a consumer group, Kafka assigns each partition to exactly one consumer, so the messages are consumed in the order they were produced.

Producing the data:

// simulate an add followed by a delete, both pinned to the same partition
Message message = new Message(UUID.randomUUID().toString(), "add product", new Date());
// public ListenableFuture<SendResult<K, V>> send(String topic, int partition, V data)
kafkaTemplate.send("product", 1, JSON.toJSONString(message));
Message message1 = new Message(UUID.randomUUID().toString(), "delete product", new Date());
kafkaTemplate.send("product", 1, JSON.toJSONString(message1));

Additionally, when I specified partition 3:

kafkaTemplate.send("product", 3, JSON.toJSONString(message));

the send failed, because the topic's partitions are 0/1/2; there is no partition 3:

2019-12-14 10:44:27.687 [http-nio-8080-exec-1] ERROR o.a.c.c.C.[.[.[.[dispatcherServlet] - Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.apache.kafka.common.KafkaException: Invalid partition given with record: 3 is not in the range [0...3).] with root cause
org.apache.kafka.common.KafkaException: Invalid partition given with record: 3 is not in the range [0...3).
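The error is raised on the client side, before the record ever reaches a broker. A minimal sketch of that range check (the message text mirrors the KafkaException above; the class and method names are mine, not Kafka's):

```java
// Sketch of the client-side partition range check behind the error above.
// Illustrative only: names are invented, and the real client throws
// org.apache.kafka.common.KafkaException rather than IllegalArgumentException.
public class PartitionCheck {
    static int validatePartition(int partition, int numPartitions) {
        if (partition < 0 || partition >= numPartitions) {
            throw new IllegalArgumentException(
                "Invalid partition given with record: " + partition
                + " is not in the range [0..." + numPartitions + ").");
        }
        return partition;
    }

    public static void main(String[] args) {
        System.out.println(validatePartition(1, 3)); // ok: partitions are 0/1/2
        try {
            validatePartition(3, 3);                 // throws: 3 is out of [0...3)
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```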

Consuming the data:

2019-12-13 20:39:06.784 [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] INFO  c.i.k.spring_kafka.KafkaListeners - 消费数据入库----Message{id='9aec3a47-dc5a-40de-ba93-2a60a1865d4c', msg='add product', sendTime=Fri Dec 13 20:39:06 CST 2019}
2019-12-13 20:39:06.784 [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] INFO  c.i.k.spring_kafka.KafkaListeners - 消费数据入库----Message{id='bd918926-9981-4cba-b7c8-3a220a0219de', msg='delete product', sendTime=Fri Dec 13 20:39:06 CST 2019}

2019-12-13 20:39:22.702 [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] INFO  c.i.k.spring_kafka.KafkaListeners - 消费数据入库----Message{id='3aac99d7-ce91-49ce-ab7e-80484c6702b7', msg='add product', sendTime=Fri Dec 13 20:39:22 CST 2019}
2019-12-13 20:39:22.702 [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] INFO  c.i.k.spring_kafka.KafkaListeners - 消费数据入库----Message{id='cf44778d-31e2-4b73-a7a3-69b42a7581b8', msg='delete product', sendTime=Fri Dec 13 20:39:22 CST 2019}

2019-12-13 20:39:24.050 [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] INFO  c.i.k.spring_kafka.KafkaListeners - 消费数据入库----Message{id='26705470-b9ad-46a4-8c10-754d9761537d', msg='add product', sendTime=Fri Dec 13 20:39:24 CST 2019}
2019-12-13 20:39:24.050 [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] INFO  c.i.k.spring_kafka.KafkaListeners - 消费数据入库----Message{id='7369b9c8-cbed-416f-bbc3-73b814cc7c39', msg='delete product', sendTime=Fri Dec 13 20:39:24 CST 2019}
 

1) After repeated tests, the messages were always routed to Consumer-A1, so the consumption-ordering problem was resolved.

2) When the Consumer-A1 process was killed manually and the producer produced data again, the messages were routed to Consumer-A instead.
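Observation (2) is Kafka's group rebalancing at work: within one consumer group each partition is owned by exactly one consumer, and when that consumer dies its partitions are reassigned to the survivors. A toy model of that reassignment (plain round-robin assignment for illustration; the real assignment is negotiated through Kafka's group coordinator protocol):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Toy model of group rebalancing: partitions are spread round-robin over the
// live consumers; when one dies, the survivors absorb its partitions.
// (The real assignment is negotiated by Kafka's group coordinator.)
public class ToyRebalance {
    static Map<Integer, String> assign(int numPartitions, List<String> consumers) {
        Map<Integer, String> assignment = new LinkedHashMap<>();
        for (int p = 0; p < numPartitions; p++) {
            assignment.put(p, consumers.get(p % consumers.size()));
        }
        return assignment;
    }

    public static void main(String[] args) {
        List<String> group = new ArrayList<>(List.of("Consumer-A", "Consumer-A1"));
        System.out.println(assign(3, group)); // {0=Consumer-A, 1=Consumer-A1, 2=Consumer-A}

        group.remove("Consumer-A1");          // Consumer-A1's process is killed
        System.out.println(assign(3, group)); // {0=Consumer-A, 1=Consumer-A, 2=Consumer-A}
    }
}
```

This also explains why pinning everything to one partition works but serializes all traffic through a single consumer; sending with a per-entity key (so only messages for the same product share a partition) keeps ordering where it matters while still spreading load.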



Reposted from blog.csdn.net/nmjhehe/article/details/103533016