Kafka Java API (Explained with Working Code)

Producer API

  1. Add the dependency
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.11.0.0</version>
</dependency>
  2. Related classes
    KafkaProducer: the producer object used to send data.
    ProducerConfig: holds the names of the configuration parameters the producer needs.
    ProducerRecord: every message to be sent is wrapped in a ProducerRecord object.
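Which ProducerRecord constructor you use determines how a record is routed to a partition. A minimal sketch (the topic name "topica" follows the examples below; the key "user1" is illustrative):

```java
import org.apache.kafka.clients.producer.ProducerRecord;

public class RecordExamples {
    public static void main(String[] args) {
        // topic + value: partition chosen by the default partitioner
        ProducerRecord<String, String> r1 = new ProducerRecord<>("topica", "hello");
        // topic + key + value: partition derived from the hash of the key
        ProducerRecord<String, String> r2 = new ProducerRecord<>("topica", "user1", "hello");
        // topic + partition + key + value: partition fixed explicitly
        ProducerRecord<String, String> r3 = new ProducerRecord<>("topica", 0, "user1", "hello");

        System.out.println(r2.key() + " -> " + r2.value()); // prints "user1 -> hello"
    }
}
```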

  3. Asynchronous send: Producer without a callback

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class CustomProducer {

    public static void main(String[] args) throws Exception {
        // 1. Set up the configuration
        Map<String, Object> configs = new HashMap<>();
        configs.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
        configs.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configs.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // 2. Create the producer
        KafkaProducer<String, String> producer = new KafkaProducer<>(configs);
        // 3. Send the data
        for (int i = 0; i < 10; i++) {
            producer.send(new ProducerRecord<>("topica", "hello" + i));
        }
        producer.close();
    }
}
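If you need to know the result of each send before continuing, you can block on the Future that send() returns, turning the asynchronous send into a synchronous one. A minimal variant of the producer above, under the same broker-address and topic-name assumptions (it needs a running broker to actually complete):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.serialization.StringSerializer;

public class CustomProducer_Sync {

    public static void main(String[] args) throws Exception {
        Map<String, Object> configs = new HashMap<>();
        configs.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
        configs.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configs.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

        KafkaProducer<String, String> producer = new KafkaProducer<>(configs);
        for (int i = 0; i < 10; i++) {
            // get() blocks until the broker acknowledges the record (or throws on failure)
            RecordMetadata metadata =
                    producer.send(new ProducerRecord<>("topica", "hello" + i)).get();
            System.out.println("synced: partition=" + metadata.partition()
                    + " offset=" + metadata.offset());
        }
        producer.close();
    }
}
```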
  4. Asynchronous send: Producer with a callback
    The callback is invoked asynchronously when the producer receives an ack. It takes two parameters, RecordMetadata and Exception: if Exception is null, the message was sent successfully; if it is non-null, the send failed.

Note: retriable send failures are retried automatically by the producer (governed by the retries setting), so there is no need to retry manually in the callback.
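The retry behavior mentioned above is configured on the producer side. A sketch of the relevant settings; the values shown here are illustrative, not recommendations (in kafka-clients 0.11, retries defaults to 0):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;

public class ReliabilityConfig {
    // Reliability-related producer settings (illustrative values)
    public static Map<String, Object> baseConfigs() {
        Map<String, Object> configs = new HashMap<>();
        configs.put(ProducerConfig.ACKS_CONFIG, "all");            // wait for all in-sync replicas to ack
        configs.put(ProducerConfig.RETRIES_CONFIG, 3);             // retry retriable failures up to 3 times
        configs.put(ProducerConfig.RETRY_BACKOFF_MS_CONFIG, 100L); // pause 100 ms between retries
        return configs;
    }
}
```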

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.serialization.StringSerializer;

public class CustomProducer_CallBack {

    public static void main(String[] args) throws Exception {
        Map<String, Object> configs = new HashMap<>();
        configs.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
        configs.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configs.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

        KafkaProducer<String, String> producer = new KafkaProducer<>(configs);
        for (int i = 0; i < 10; i++) {
            producer.send(new ProducerRecord<>("topica", "hello" + i), new Callback() {
                @Override
                public void onCompletion(RecordMetadata metadata, Exception exception) {
                    if (exception == null) {
                        System.out.println("Send succeeded: " + metadata.partition()); // partition the record landed in
                        System.out.println("Send succeeded: " + metadata.topic());     // topic the record belongs to
                        System.out.println("Send succeeded: " + metadata.offset());    // offset of the record
                    }
                }
            });
        }
        producer.close();
    }
}

Consumer API

  1. Related classes
    KafkaConsumer: the consumer object used to consume data.
    ConsumerConfig: holds the names of the configuration parameters the consumer needs.
    ConsumerRecord: every record received is wrapped in a ConsumerRecord object.

  2. Receiving data with the Consumer

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class CustomConsumer {

    public static void main(String[] args) {
        // 1. Set up the configuration
        Map<String, Object> map = new HashMap<>();
        map.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
        map.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        map.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        map.put(ConsumerConfig.GROUP_ID_CONFIG, "g1");
        // 2. Create the consumer
        KafkaConsumer<String, String> kafkaConsumer = new KafkaConsumer<>(map);
        // Subscribe to the topica topic
        kafkaConsumer.subscribe(Arrays.asList("topica"));

        while (true) {
            // 3. Consume the data
            ConsumerRecords<String, String> consumerRecords = kafkaConsumer.poll(100);
            for (ConsumerRecord<String, String> consumerRecord : consumerRecords) {
                System.out.println(consumerRecord);
            }
        }
    }
}
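The consumer above relies on automatic offset commits (enable.auto.commit defaults to true). If you want offsets committed only after the records have actually been processed, disable auto-commit and call commitSync() yourself. A sketch under the same broker-address, topic, and group-id assumptions as above:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ManualCommitConsumer {

    public static void main(String[] args) {
        Map<String, Object> map = new HashMap<>();
        map.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
        map.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        map.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        map.put(ConsumerConfig.GROUP_ID_CONFIG, "g1");
        map.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false); // take over offset management

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(map);
        consumer.subscribe(Arrays.asList("topica"));

        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(100);
            for (ConsumerRecord<String, String> record : records) {
                System.out.println(record.value());
            }
            // Commit only after the whole batch has been processed
            consumer.commitSync();
        }
    }
}
```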

Reprinted from blog.csdn.net/gym02/article/details/111018144