Detailed Explanation of SpringBoot Integration with Kafka

1. Use IDEA to create a SpringBoot project

1.1 Create a SpringBoot program using Spring Initializr

(Screenshot: Spring Initializr project settings)
Click Next.

1.2 Add dependencies

(Screenshot: dependency selection)

Dependency description:

  • Lombok: simplifies entity class development.

  • Spring Web: pulls in all dependencies needed for web development, including Spring MVC, the embedded Tomcat, etc.

  • Spring for Apache Kafka: the dependency that integrates Spring with Kafka.

Click Finish after the configuration is complete.

1.3 View pom file

(Screenshot: the generated pom.xml)
Since this is the first time Kafka is integrated, just wait for Maven to download the related dependencies; the red highlighting on the Kafka entries disappears once the download completes.

This creates a SpringBoot web project with Kafka integration.
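For reference, the key dependency entries in the generated pom.xml typically look like the following (a sketch: versions are managed by the spring-boot-starter-parent, and the exact file produced by Spring Initializr may differ slightly):

```xml
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <optional>true</optional>
    </dependency>
</dependencies>
```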

2. Create a producer

2.1 Configure the producer application.yml file

# Connect to Kafka
spring:
  kafka:
    bootstrap-servers: localhost:9092
    # Serializers for the producer's key and value
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer

2.2 Create producer interface

(Screenshot: project structure with the controller package)

Create a producer under the controller package that sends the data received through the interface to Kafka:

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import javax.annotation.Resource;

@RestController
@RequestMapping("/producer")
public class ProducerController {

    @Resource
    private KafkaTemplate<String, String> kafka;

    @PostMapping
    public String data(@RequestBody String msg) {
        // Send the data to Kafka
        kafka.send("test", msg);
        return "ok";
    }
}
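KafkaTemplate.send() is asynchronous, so the controller above returns "ok" before the broker has acknowledged the message. If you want delivery confirmation, you can attach a callback to the returned future. The sketch below assumes Spring Kafka 3.x, where send() returns a CompletableFuture (earlier versions return a ListenableFuture); the class name ProducerService is hypothetical.

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class ProducerService {

    private final KafkaTemplate<String, String> kafka;

    public ProducerService(KafkaTemplate<String, String> kafka) {
        this.kafka = kafka;
    }

    public void sendWithCallback(String msg) {
        // send() returns a CompletableFuture<SendResult<K, V>> in Spring Kafka 3.x
        kafka.send("test", msg).whenComplete((result, ex) -> {
            if (ex != null) {
                System.err.println("Send failed: " + ex.getMessage());
            } else {
                // RecordMetadata carries the partition and offset assigned by the broker
                System.out.println("Sent to partition " + result.getRecordMetadata().partition()
                        + " at offset " + result.getRecordMetadata().offset());
            }
        });
    }
}
```

Because the callback runs only when the broker responds, this is a sketch to run against a live Kafka instance rather than a standalone snippet.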

3. Create a consumer

3.1 Configure the consumer application.yml file

# Connect to Kafka
spring:
  kafka:
    bootstrap-servers: localhost:9092
    # Deserializers for the consumer's key and value
    consumer:
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      # Specify the consumer group's group id
      group-id: kafka-test

3.2 Create a consumer

import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.KafkaListener;

@Configuration
public class KafkaConsumer {

    // Specify the topic to listen to
    @KafkaListener(topics = "test")
    public void consumeTopic(String msg) {
        // The parameter is the value received from the topic
        System.out.println("Message received: " + msg);
    }
}
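The listener above receives only the message value. If you also need the key, partition, offset, or timestamp, the listener method can accept the full ConsumerRecord instead of a String. A minimal sketch (the class name KafkaRecordConsumer is hypothetical):

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class KafkaRecordConsumer {

    // Receiving the full ConsumerRecord exposes the key, partition,
    // offset and timestamp in addition to the value
    @KafkaListener(topics = "test", groupId = "kafka-test")
    public void consumeRecord(ConsumerRecord<String, String> record) {
        System.out.println("partition=" + record.partition()
                + " offset=" + record.offset()
                + " key=" + record.key()
                + " value=" + record.value());
    }
}
```

Like the listener in the tutorial, this only fires when a running broker delivers a message to the topic.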

3.3 Description

Since both the producer and the consumer are written in this demo, the overall application.yml file is as follows (the split above was just for clarity):

# Connect to Kafka
spring:
  kafka:
    bootstrap-servers: 127.0.0.1:9092
    # Serializers for the producer's key and value
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
    # Deserializers for the consumer's key and value
    consumer:
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      # Specify the consumer group's group id
      group-id: kafka-test

4. Test the producer and consumer

4.1 Use ApiPost to send a POST request to the producer interface

(Screenshot: ApiPost sending the POST request)
After clicking Send, the interface quickly returned ok.

This means the producer successfully produced the data and sent the message hello kafka to the topic test.

4.2 Observe the consumer console

(Screenshot: consumer console output)

The consumer received the data from the producer and printed it to the console.

The above is the basic way to integrate Kafka with SpringBoot. More complex usage later on builds on these simple foundations.


Origin: blog.csdn.net/qq_44749491/article/details/130152503