Spring Cloud Stream, the message-driven component of Spring Cloud
1. Spring Cloud Stream
Spring Cloud Stream is a framework for building highly scalable, message-driven microservices on top of message middleware. It builds on Spring Boot to create standalone, production-grade Spring applications, and it supports integration with a variety of message middleware such as Kafka and RabbitMQ. By using Spring Integration to connect to the message broker, it provides a middleware-neutral interface for publishing and consuming messages, which shields developers from much of the complexity of working with message middleware directly.
The Spring Cloud Stream framework provides a programming model consistent with familiar Spring idioms, and introduces three core concepts: publish-subscribe, consumer groups, and message partitioning.
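Of these three concepts, consumer groups are the easiest to misread. As a plain-Java sketch (no Spring involved; the class and method names here are invented purely for illustration), the delivery rules are: every group sees every message, but within one group each message goes to exactly one member.

```java
import java.util.*;

// Minimal in-memory sketch of publish-subscribe with consumer groups:
// each group receives every published message, but inside a group each
// message is delivered to exactly one member (round-robin here).
public class GroupDemo {
    // group name -> list of member inboxes
    private final Map<String, List<List<String>>> groups = new LinkedHashMap<>();
    // group name -> round-robin cursor
    private final Map<String, Integer> cursor = new HashMap<>();

    // Register a new consumer in the given group; returns its inbox
    public List<String> subscribe(String group) {
        List<String> inbox = new ArrayList<>();
        groups.computeIfAbsent(group, g -> new ArrayList<>()).add(inbox);
        cursor.putIfAbsent(group, 0);
        return inbox;
    }

    // Deliver the payload to one member of every group
    public void publish(String payload) {
        for (Map.Entry<String, List<List<String>>> e : groups.entrySet()) {
            List<List<String>> members = e.getValue();
            int i = cursor.merge(e.getKey(), 1, Integer::sum) % members.size();
            members.get(i).add(payload); // exactly one member per group gets it
        }
    }

    public static void main(String[] args) {
        GroupDemo topic = new GroupDemo();
        List<String> a1 = topic.subscribe("groupA");
        List<String> a2 = topic.subscribe("groupA");
        List<String> b1 = topic.subscribe("groupB");
        topic.publish("m1");
        topic.publish("m2");
        // groupA's two members split the two messages; groupB's single member got both
        System.out.println(a1.size() + a2.size()); // 2
        System.out.println(b1.size());             // 2
    }
}
```

This mirrors what the broker and binder do for you in Spring Cloud Stream: scaling a service means adding members to the same group, while adding a new group means every message is delivered once more.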
Official documentation: https://spring.io/projects/spring-cloud-stream
2. Spring Cloud Stream message model framework diagram
As the figure above shows, the following roles are involved when Spring Cloud Stream sends and receives messages:
- Message sender: the business logic of microservice A in the figure above, which sends messages according to business needs.
- Message sending channel interface (Source): provided by Spring Cloud Stream for senders to publish messages. It is bound to a concrete message channel and handles serialization of outgoing messages.
- Message channel (Channel): an abstraction over the message queue, holding messages published by senders or waiting to be consumed. Which message queue a channel maps to is specified in the configuration file.
- Message binder (Binder): interacts with the concrete message middleware and isolates its implementation details. Spring Cloud Stream ships with binders for middleware such as RabbitMQ and Apache Kafka; developers only need to add the corresponding binder to the application to connect to RabbitMQ or Kafka and send or listen for messages.
- Message middleware (MQ): the actual broker; many options are available, such as RabbitMQ and Apache Kafka.
- Message listening channel interface (Sink): the counterpart of Source on the consumer side. It interacts with the message consumer and deserializes incoming message data. Like Source, it is bound to concrete queues and topics through configuration.
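For reference, the Source and Sink bindings mentioned above are small interfaces shipped with Spring Cloud Stream itself. Shown here roughly as they appear in the 2.x framework codebase (a framework excerpt, not code you need to write):

```java
public interface Source {
    String OUTPUT = "output";

    @Output(Source.OUTPUT)
    MessageChannel output();
}

public interface Sink {
    String INPUT = "input";

    @Input(Sink.INPUT)
    SubscribableChannel input();
}
```

The channel names "output" and "input" declared here are the same names used in the `spring.cloud.stream.bindings.*` configuration keys later in this article.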
3. Implementing the message model
Spring Cloud Stream provides many out-of-the-box interface declarations and annotations that declare constraints on message sending and listening channels. This simplifies development and lets developers focus on business logic rather than on the details of the message middleware. We will now explore Spring Cloud Stream by implementing two services: a message producer and a message consumer (message listener).
3.1. Consumer implementation (stream-consumer)
First, modify the pom file to add stream-related dependencies, as shown below:
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-stream-rabbit</artifactId>
</dependency>
Then, modify the application.properties configuration file as follows:
spring.application.name=stream-consumer
server.port=8120
# RabbitMQ configuration
#spring.rabbitmq.host=192.168.1.9
#spring.rabbitmq.port=5672
#spring.rabbitmq.username=admin
#spring.rabbitmq.password=admin
spring.cloud.stream.binders.defaultRabbit.type=rabbit
spring.cloud.stream.binders.defaultRabbit.environment.spring.rabbitmq.host=192.168.1.9
spring.cloud.stream.binders.defaultRabbit.environment.spring.rabbitmq.username=admin
spring.cloud.stream.binders.defaultRabbit.environment.spring.rabbitmq.password=admin
spring.cloud.stream.binders.defaultRabbit.environment.spring.rabbitmq.virtual-host=/
# Destination, similar to a topic in Kafka
spring.cloud.stream.bindings.input.destination=TopicA
# Consumer group: within a group, each message is consumed by only one consumer
spring.cloud.stream.bindings.input.group=groupA
#spring.cloud.stream.bindings.input.content-type=application/json
In this configuration:
- spring.cloud.stream.binders.defaultRabbit.* configures the connection to RabbitMQ; if you use a local broker with default settings, it can be omitted.
- spring.cloud.stream.bindings.input.destination=TopicA specifies the topic name bound to the input channel.
- spring.cloud.stream.bindings.input.group=groupA specifies the consumer group; within a group, each message is consumed by only one consumer.
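The third core concept, message partitioning, is configured the same way. A sketch under the assumption of standard Spring Cloud Stream 2.x property names (the values are examples): the producer declares how to derive a partition key, and each consumer instance declares its own index.

```properties
# Producer side: derive the partition key from the payload, spread over 2 partitions
spring.cloud.stream.bindings.output.producer.partition-key-expression=payload
spring.cloud.stream.bindings.output.producer.partition-count=2

# Consumer side: opt in to partitioned consumption and identify this instance
spring.cloud.stream.bindings.input.consumer.partitioned=true
spring.cloud.stream.instance-count=2
spring.cloud.stream.instance-index=0
```

With this in place, messages with the same partition key always land on the same consumer instance, which matters when ordering per key is required.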
Next, create a consumer class ConsumerController, which is a message listener.
import org.apache.log4j.Logger;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;

// Bind this class to the channels declared in the Sink interface
@EnableBinding(Sink.class)
public class ConsumerController {

    private final Logger logger = Logger.getLogger(ConsumerController.class);

    // Listen on the "input" channel declared by Sink
    @StreamListener(Sink.INPUT)
    public void consumer(Object payload) {
        logger.info("consumer: " + payload);
    }
}
In the above code,
- The @EnableBinding annotation specifies one or more interfaces defined with @Input or @Output annotations, binding their message channels (Channel). Here the channels of the Sink interface are bound.
- In the Sink interface, a channel named "input" is bound via the @Input annotation.
- The @StreamListener annotation registers the annotated method as an event listener for the data stream from the message middleware. The annotation's attribute value is the name of the channel to listen to; here the default channel "input" is monitored.
Finally, create a Spring Boot startup class and start the application. The console prints the following log, in which you can see the topic and group bound to the channel; at this point the consumer has been created successfully.
2020-11-21 18:23:52.743 INFO 10264 --- [ main] o.s.c.s.m.DirectWithAttributesChannel : Channel 'application-1.input' has 1 subscriber(s).
2020-11-21 18:23:52.744 INFO 10264 --- [ main] o.s.i.endpoint.EventDrivenConsumer : Adding {logging-channel-adapter:_org.springframework.integration.errorLogger} as a subscriber to the 'errorChannel' channel
2020-11-21 18:23:52.744 INFO 10264 --- [ main] o.s.i.channel.PublishSubscribeChannel : Channel 'application-1.errorChannel' has 1 subscriber(s).
2020-11-21 18:23:52.744 INFO 10264 --- [ main] o.s.i.endpoint.EventDrivenConsumer : started bean '_org.springframework.integration.errorLogger'
2020-11-21 18:23:52.745 INFO 10264 --- [ main] o.s.c.s.binder.DefaultBinderFactory : Creating binder: defaultRabbit
2020-11-21 18:24:02.988 INFO 10264 --- [ main] o.s.c.s.binder.DefaultBinderFactory : Caching the binder: defaultRabbit
2020-11-21 18:24:02.988 INFO 10264 --- [ main] o.s.c.s.binder.DefaultBinderFactory : Retrieving cached binder: defaultRabbit
2020-11-21 18:24:03.086 INFO 10264 --- [ main] c.s.b.r.p.RabbitExchangeQueueProvisioner : declaring queue for inbound: TopicA.groupA, bound to: TopicA
2020-11-21 18:24:03.088 INFO 10264 --- [ main] o.s.a.r.c.CachingConnectionFactory : Attempting to connect to: [192.168.1.9:5672]
2020-11-21 18:24:03.121 INFO 10264 --- [ main] o.s.a.r.c.CachingConnectionFactory : Created new connection: rabbitConnectionFactory#3d64c581:0/SimpleConnection@119b0892 [delegate=amqp://[email protected]:5672/, localPort= 56470]
2020-11-21 18:24:03.178 INFO 10264 --- [ main] o.s.c.stream.binder.BinderErrorChannel : Channel 'TopicA.groupA.errors' has 1 subscriber(s).
2020-11-21 18:24:03.178 INFO 10264 --- [ main] o.s.c.stream.binder.BinderErrorChannel : Channel 'TopicA.groupA.errors' has 2 subscriber(s).
2020-11-21 18:24:03.198 INFO 10264 --- [ main] o.s.i.a.i.AmqpInboundChannelAdapter : started bean 'inbound.TopicA.groupA'
3.2. Producer implementation (stream-provider)
First, the dependencies are similar to the consumer's; because the producer exposes an HTTP API, the web starter is added as well:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-stream-rabbit</artifactId>
</dependency>
Then, modify the application.properties configuration file as follows:
spring.application.name=stream-provider
server.port=8121
# RabbitMQ configuration
spring.cloud.stream.binders.defaultRabbit.type=rabbit
spring.cloud.stream.binders.defaultRabbit.environment.spring.rabbitmq.host=192.168.1.9
spring.cloud.stream.binders.defaultRabbit.environment.spring.rabbitmq.username=admin
spring.cloud.stream.binders.defaultRabbit.environment.spring.rabbitmq.password=admin
spring.cloud.stream.binders.defaultRabbit.environment.spring.rabbitmq.virtual-host=/
spring.cloud.stream.bindings.output.destination=TopicA
#spring.cloud.stream.bindings.output.content-type=application/json
Here, spring.cloud.stream.bindings.output.destination specifies the topic name for the output channel; note how it differs from the consumer's input channel configuration.
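If several binders are configured in one application, a binding can also select its binder explicitly. A small example using the defaultRabbit binder defined above:

```properties
# Pin the output binding to a specific named binder
spring.cloud.stream.bindings.output.binder=defaultRabbit
```

With a single binder on the classpath this is unnecessary, since Spring Cloud Stream uses it by default.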
Create a message sending class SendMsgService, which injects a MessageChannel instance through the constructor to send messages.
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Source;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Component;

// Bind this class to the channels declared in the Source interface
@EnableBinding(Source.class)
@Component
public class SendMsgService {

    private final MessageChannel output;

    // The parameter name "output" matches the channel bean declared by Source
    @Autowired
    public SendMsgService(MessageChannel output) {
        this.output = output;
    }

    public void sendMsg(String name) {
        // Wrap the payload in a Message and publish it to the output channel
        output.send(MessageBuilder.withPayload(name).build());
    }
}
Provide a ProviderController class to define a test API endpoint.
import org.apache.log4j.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
public class ProviderController {

    private final Logger logger = Logger.getLogger(ProviderController.class);

    @Autowired
    private SendMsgService sendMsgService;

    @RequestMapping("/produce")
    @ResponseBody
    public String produce(String name) {
        logger.info("Calling produce() of the provider's ProviderController!");
        // Fall back to a default payload when no name parameter is supplied
        sendMsgService.sendMsg(name == null ? "World!" : name);
        return "Message sent successfully!";
    }
}
Finally, create a Spring Boot startup class and start the application. After it starts successfully, visit http://localhost:8121/produce?name=aaaa, then check the consumer's console: the related log entries show that the publish-subscribe messaging flow works end to end.
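Assuming both applications are running locally on the ports configured above, the endpoint can also be exercised from the command line (the query parameter becomes the message payload):

```shell
# Send one message through the producer's HTTP endpoint
curl "http://localhost:8121/produce?name=aaaa"
```

The producer returns the success message, and the consumer logs the received payload shortly afterwards.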