This Java code shows how to make a Kafka consumer read a topic from the beginning, i.e. the equivalent of the console consumer's --from-beginning option.

Just add these two properties to the consumer configuration:

props.put("auto.offset.reset", "earliest");
props.put("group.id", UUID.randomUUID().toString());

Note that auto.offset.reset=earliest only takes effect when the consumer group has no committed offset for the topic, so pairing it with a brand-new random group.id guarantees the consumer always starts from the first offset.

Complete example:

import java.util.Arrays;
import java.util.Properties;
import java.util.UUID;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class FromBeginningConsumer {
    public static void main(String[] args) {
        // 1. Prepare the consumer configuration
        Properties props = new Properties();
        props.put("bootstrap.servers", "hadoop1:9092");
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        // Read from the earliest offset; the fresh random group.id ensures
        // there is no committed offset that would override this setting
        props.put("auto.offset.reset", "earliest");
        props.put("group.id", UUID.randomUUID().toString());

        // 2. Create the KafkaConsumer
        KafkaConsumer<String, String> kafkaConsumer = new KafkaConsumer<String, String>(props);

        // 3. Subscribe to the data; more than one topic can be passed here
        kafkaConsumer.subscribe(Arrays.asList("yun03"));

        // 4. Get the data
        while (true) {
            ConsumerRecords<String, String> records = kafkaConsumer.poll(100);
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("topic = %s, offset = %d, key = %s, value = %s%n",
                        record.topic(), record.offset(), record.key(), record.value());
            }
        }
    }
}
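
If you would rather keep a fixed group.id, another common approach is to rewind every assigned partition explicitly with seekToBeginning inside a rebalance listener. The following is only a minimal sketch of that idea, not part of the original article; it reuses the broker address hadoop1:9092 and topic yun03 from the example above, so adjust them for your cluster.

import java.util.Collection;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class SeekToBeginningConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "hadoop1:9092");   // same broker as above (assumption)
        props.put("group.id", "test");                     // a fixed group id works with this approach
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        final KafkaConsumer<String, String> consumer = new KafkaConsumer<String, String>(props);

        // Rewind to the first offset of every partition as soon as it is assigned
        consumer.subscribe(Collections.singletonList("yun03"), new ConsumerRebalanceListener() {
            public void onPartitionsRevoked(Collection<TopicPartition> partitions) {
            }
            public void onPartitionsAssigned(Collection<TopicPartition> partitions) {
                consumer.seekToBeginning(partitions);
            }
        });

        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(100);
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("topic = %s, offset = %d, key = %s, value = %s%n",
                        record.topic(), record.offset(), record.key(), record.value());
            }
        }
    }
}

Unlike the random-group.id trick, this version re-reads the topic from offset 0 on every start even though committed offsets exist for the group.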

 
