Spring Cloud - 8 (Spring Cloud Stream)

Disclaimer: This is an original article by the blogger, licensed under the CC 4.0 BY-SA agreement. When reproducing it, please include the original source link and this statement.
This link: https://blog.csdn.net/ysl19910806/article/details/97689877

Spring Cloud Stream


 

Kafka


Official website

http://kafka.apache.org/

 

Main uses

  • Messaging middleware
  • Stream processing
  • Log aggregation

 

Download: http://kafka.apache.org/downloads

 

The executable scripts are in the bin/ directory; the Windows scripts are in their own bin\windows subdirectory.

 

Quick Start

Download and unzip the Kafka archive

Run the services

Taking Windows as an example, first open cmd:

1. Start the zookeeper:

bin\windows\zookeeper-server-start.bat config\zookeeper.properties

2. Start kafka:

bin\windows\kafka-server-start.bat config\server.properties

 

Create a topic

bin\windows\kafka-topics.bat --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic cloud


 

Producer: send a message

\bin\windows>kafka-console-producer.bat --broker-list localhost:9092 --topic cloud
>hello

 

Consumer: receive messages

\bin\windows>kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic cloud --from-beginning
hello

Comparison with similar products

  • ActiveMQ: implements the JMS (Java Message Service) specification
  • RabbitMQ: implements the AMQP (Advanced Message Queuing Protocol) specification
  • Kafka: implements no particular specification, which keeps it flexible and gives it a performance advantage

 

Spring Kafka


import java.util.Properties;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.serialization.StringSerializer;

public class TestKafka {

	public static void main(String[] args) throws ExecutionException, InterruptedException {

		Properties properties = new Properties();

		properties.setProperty("bootstrap.servers", "localhost:9092");
		properties.setProperty("key.serializer", StringSerializer.class.getName());
		properties.setProperty("value.serializer", StringSerializer.class.getName());

		// Create the Kafka producer
		KafkaProducer<String, String> kafkaProducer = new KafkaProducer<>(properties);

		// Build the Kafka message as a ProducerRecord
		String topic = "cloud";

		Integer partition = 0;

		Long timestamp = System.currentTimeMillis();

		String key = "message-key";
		String value = "how are you!";

		ProducerRecord<String, String> record = new ProducerRecord<String, String>(topic, partition, timestamp, key, value);

		// Send the Kafka message
		Future<RecordMetadata> metadataFuture = kafkaProducer.send(record);

		// Block until the broker acknowledges the send
		metadataFuture.get();

		kafkaProducer.close();

	}

}


 

The official document: https://docs.spring.io/spring-kafka/reference/html/

 

Design Patterns

Spring projects share a basic operational model, the Template pattern:

  • JDBC: JdbcTemplate
  • Redis: RedisTemplate
  • Kafka: KafkaTemplate
  • JMS: JmsTemplate
  • REST: RestTemplate

Each XXXTemplate implements a corresponding XXXOperations interface:

KafkaTemplate implements KafkaOperations
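As a minimal illustration of this Template pattern, here is a sketch in plain Java. The interface and class names (`MessageOperations`, `MessageTemplate`) are invented for the example and are not the real Spring API:

```java
import java.util.ArrayList;
import java.util.List;

// The operations interface defines the contract, analogous to KafkaOperations.
interface MessageOperations {
    void send(String destination, String payload);
}

// The template implements the interface and hides resource handling,
// analogous to KafkaTemplate implementing KafkaOperations.
class MessageTemplate implements MessageOperations {
    private final List<String> delivered = new ArrayList<>();

    @Override
    public void send(String destination, String payload) {
        // A real template would manage the underlying connection here.
        delivered.add(destination + ":" + payload);
    }

    List<String> delivered() {
        return delivered;
    }
}

public class TemplatePatternDemo {
    public static void main(String[] args) {
        MessageOperations ops = new MessageTemplate();
        ops.send("cloud", "hello");
        System.out.println(((MessageTemplate) ops).delivered()); // prints [cloud:hello]
    }
}
```

Callers depend only on the operations interface, so the concrete template can be swapped or mocked in tests.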

 

Maven dependency

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>

 

Auto-configuration class: KafkaAutoConfiguration

in which KafkaTemplate is automatically configured:

import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.boot.autoconfigure.kafka.KafkaProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.ProducerListener;

@Configuration
public class KafkaAutoConfiguration {
	
	 private final KafkaProperties properties;

    public KafkaAutoConfiguration(KafkaProperties properties) {
        this.properties = properties;
    }
	    
	@Bean
	@ConditionalOnMissingBean(KafkaTemplate.class)
	public KafkaTemplate<?,?> kafkaTemplate(ProducerFactory<Object,Object> kafkaProducerFactory,
	
	    ProducerListener<Object,Object> kafkaProducerListener){
	    KafkaTemplate<Object,Object> kafkaTemplate = new KafkaTemplate<Object,Object>(kafkaProducerFactory);
	
	    kafkaTemplate.setProducerListener(kafkaProducerListener);
	    kafkaTemplate.setDefaultTopic(this.properties.getTemplate().getDefaultTopic());
	
	    return kafkaTemplate;
	}
}

 

Creating a producer

Add producer configuration

application.properties (global configuration):

### Kafka producer configuration
spring.kafka.producer.bootstrapServers = localhost:9092
spring.kafka.producer.keySerializer = org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.valueSerializer = org.apache.kafka.common.serialization.StringSerializer

The producer controller:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class KafkaProducerController {

	private final KafkaTemplate<String, String> kafkaTemplate;

	private final String topic;

	@Autowired
	public KafkaProducerController(KafkaTemplate<String, String> kafkaTemplate, @Value("${kafka.topic}") String topic) {

		this.kafkaTemplate = kafkaTemplate;
		this.topic = topic;
	}

	@PostMapping("/message/send")
	public Boolean sendMessage(@RequestParam(required=false)String message) {
		kafkaTemplate.send(topic, message);
		return true;
	}

}

 

Creating a consumer

Add consumer configuration

### Kafka consumer configuration
spring.kafka.consumer.groupId = cloud-1
spring.kafka.consumer.keyDeserializer = org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.valueDeserializer = org.apache.kafka.common.serialization.StringDeserializer

The consumer listener:

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class KafkaConsumerListener {

	@KafkaListener(topics = "${kafka.topic}")
	public void onMessage(String message) {

		System.out.println("Kafka consumer listener received message: " + message);

	}
}

 

 

Spring Cloud Stream


Basic concepts

  • Source: the producing side; synonyms: Producer, Publisher
  • Sink: the receiving side; synonyms: Consumer, Subscriber
  • Processor: acts as a Sink for upstream components and as a Source for downstream components
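In Spring Cloud Stream the Processor interface is in fact declared as extending both Source and Sink. A simplified, self-contained stub of that composition (the real interfaces live in org.springframework.cloud.stream.messaging and also declare channel accessor methods, omitted here):

```java
// Stub of the Source role: declares the output channel name.
interface Source {
    String OUTPUT = "output";
}

// Stub of the Sink role: declares the input channel name.
interface Sink {
    String INPUT = "input";
}

// A Processor is both a Sink (toward upstream) and a Source (toward downstream).
interface Processor extends Source, Sink {
}

public class ProcessorDemo {
    public static void main(String[] args) {
        // Both channel names are visible through the combined interface.
        System.out.println(Processor.INPUT + " -> " + Processor.OUTPUT); // prints input -> output
    }
}
```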

 

Reactive Streams:

  • Publisher
  • Subscriber
  • Processor
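The same three roles exist in the JDK's built-in Reactive Streams API (java.util.concurrent.Flow, available since JDK 9). A small runnable sketch using SubmissionPublisher as the Publisher and an anonymous Subscriber:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class FlowDemo {

    // Publish the given items and collect everything the subscriber receives.
    static List<String> publishAndCollect(List<String> items) {
        List<String> received = new ArrayList<>();
        CountDownLatch latch = new CountDownLatch(items.size());

        // Publisher: the Source side.
        SubmissionPublisher<String> publisher = new SubmissionPublisher<>();

        // Subscriber: the Sink side.
        publisher.subscribe(new Flow.Subscriber<String>() {
            @Override
            public void onSubscribe(Flow.Subscription subscription) {
                subscription.request(Long.MAX_VALUE); // unbounded demand
            }

            @Override
            public void onNext(String item) {
                received.add(item);
                latch.countDown();
            }

            @Override
            public void onError(Throwable throwable) {
            }

            @Override
            public void onComplete() {
            }
        });

        items.forEach(publisher::submit);
        publisher.close();
        try {
            latch.await(); // delivery is asynchronous; wait for all items
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return received;
    }

    public static void main(String[] args) {
        System.out.println(publishAndCollect(List.of("hello", "world"))); // [hello, world]
    }
}
```

A Flow.Processor would sit between the two, subscribing upstream and publishing downstream, mirroring the Processor concept above.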

 

Spring Cloud Stream Binder: Kafka

 

Define a standard message sender

@Component
@EnableBinding(Source.class)
public class MessageProducerBean {

    @Autowired
    @Qualifier(Source.OUTPUT)
    private MessageChannel messageChannel; 
    
    @Autowired
    private Source source;
    
    
    public void send(String message) {
        // Two equivalent ways to send; calling both here publishes the message twice.
        messageChannel.send(MessageBuilder.withPayload(message).build());
        source.output().send(MessageBuilder.withPayload(message).build());
    }

}

 

Inject it into the controller

@RestController
public class KafkaProducerController {

    private final KafkaTemplate<String, String> kafkaTemplate;

    private final MessageProducerBean messageProducerBean;

    private final String topic;

    @Autowired
    public KafkaProducerController(KafkaTemplate<String, String> kafkaTemplate, 
            @Value("${kafka.topic}") String topic,
            MessageProducerBean messageProducerBean) {

        this.kafkaTemplate = kafkaTemplate;
        this.messageProducerBean = messageProducerBean;
        this.topic = topic;
    }

    @PostMapping("/message/send")
    public Boolean sendMessage(@RequestParam(required=false)String message) {
        kafkaTemplate.send(topic, message);
        return true;
    }
    
    @GetMapping("/message/output/send")
    public Boolean outputSend(@RequestParam String message) {
        messageProducerBean.send(message);
        return true;
    }    
    

}

 

Implement the standard Sink listener

@Component
@EnableBinding(Sink.class)
public class MessageConsumerBean {

    @Autowired
    @Qualifier(Sink.INPUT)
    private SubscribableChannel subscribableChannel;
    
    @Autowired
    private Sink sink;

    // Three alternative ways to consume from the same channel are shown below.
    @PostConstruct
    public void init() {
        subscribableChannel.subscribe(new MessageHandler() {
            
            @Override
            public void handleMessage(Message<?> message) throws MessagingException {
                System.out.println("init: "+message.getPayload());
                
            }
        });
    }
    
    @ServiceActivator(inputChannel = Sink.INPUT)
    public void serviceActivator(Object message) {
        System.out.println("serviceActivator: "+message);
    }
    
    @StreamListener(Sink.INPUT)
    public void streamListener(String message) {
        System.out.println("streamListener: "+message);
    }
    
}

Define a custom message sender (custom Source)

public interface MySource {
    
    String MYOUTPUT = "myoutput";

    
    @Output(MySource.MYOUTPUT)
    MessageChannel myoutput();

}
@Component
@EnableBinding(MySource.class)
public class MyMessageProducerBean {

	@Autowired
	@Qualifier(MySource.MYOUTPUT)
	private MessageChannel messageChannel; 
	
	@Autowired
	private MySource mySource;
	
	
	public void send(String message) {
		messageChannel.send(MessageBuilder.withPayload(message).build());
		mySource.myoutput().send(MessageBuilder.withPayload(message).build());
	}

}

 

Define a custom Sink listener

public interface MySink {

    String MYINPUT = "myinput";

    @Input(MySink.MYINPUT)
    SubscribableChannel input();

}

 

@Component
@EnableBinding(MySink.class)
public class MyMessageConsumerBean {

    @Autowired
    @Qualifier(MySink.MYINPUT)
    private SubscribableChannel subscribableChannel;
    
    @Autowired
    private MySink mySink;

    @PostConstruct
    public void init() {
        subscribableChannel.subscribe(new MessageHandler() {
            
            @Override
            public void handleMessage(Message<?> message) throws MessagingException {
                System.out.println("my - Sink: "+message.getPayload());
                
            }
        });
    }
    
    @ServiceActivator(inputChannel = MySink.MYINPUT)
    public void serviceActivator(Object message) {
        System.out.println("my - serviceActivator: "+message);
    }
    
    @StreamListener(MySink.MYINPUT)
    public void streamListener(String message) {
        System.out.println("my - streamListener: "+message);
    }
    
}

Configuration items

kafka.topic.my = mytopic
spring.cloud.stream.bindings.myoutput.destination=${kafka.topic.my}
spring.cloud.stream.bindings.myinput.destination=${kafka.topic.my}
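One related binding property worth knowing is the consumer group: when several application instances share a group, each message is delivered to only one of them. This is standard Spring Cloud Stream configuration, but the group name below is illustrative:

```properties
spring.cloud.stream.bindings.myinput.group=stream-group
```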

 

Spring Cloud Stream Binder: Rabbit


Refactor the Kafka project to remove the hard dependency on Kafka.

 

Source code for the stream-kafka version: https://pan.baidu.com/s/1RX5W2wMj4h_SKDkjlPQHkA (extraction code: lwak)

Source code for the stream-rabbit version: https://pan.baidu.com/s/1AX6asvmATN9-dYrhIIfS7w (extraction code: dsc5)
 
