Step 1 Download and unpack
Download the 0.10.2.0 release from the official website (http://kafka.apache.org/downloads) and unpack it:
> tar -xzf kafka_2.11-0.10.2.0.tgz
> cd kafka_2.11-0.10.2.0
Step 2 Start the servers
Start the ZooKeeper server:
> bin/zookeeper-server-start.sh config/zookeeper.properties
Start the Kafka server:
> bin/kafka-server-start.sh config/server.properties
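Both commands read their settings from the properties files shipped in config/. The defaults are enough for this local quickstart; the most relevant keys (shown with their stock 0.10.2.0 values) are:

```properties
# config/zookeeper.properties
dataDir=/tmp/zookeeper
clientPort=2181

# config/server.properties
broker.id=0
log.dirs=/tmp/kafka-logs
zookeeper.connect=localhost:2181
```

Note that clientPort and zookeeper.connect must agree, and the --zookeeper and --bootstrap-server addresses used in the commands below follow from these defaults.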
Step 3 Create a topic
> bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
List the existing topics to verify it was created:
> bin/kafka-topics.sh --list --zookeeper localhost:2181
Step 4 Send messages to the topic
> bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
This is a message
This is another message
Step 5 Start a consumer to receive the messages
> bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
This is a message
This is another message
Next, a Java demo for writing to a topic, which relies on the Kafka producer (Producer) API.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

//Connection configuration
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("acks", "all");
props.put("retries", 0);
props.put("batch.size", 16384);
props.put("linger.ms", 1);
props.put("buffer.memory", 33554432);
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

//Producer interface and its KafkaProducer implementation
Producer<String, String> producer = new KafkaProducer<>(props);
for (int i = 0; i < 100; i++) {
    //Send a message.
    //ProducerRecord(String topic, K key, V value) creates a record to be sent to Kafka.
    producer.send(new ProducerRecord<String, String>("my-topic", Integer.toString(i), Integer.toString(i)));
}
producer.close();
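The quickstart above created the topic with the kafka-topics.sh script (and sending to a missing topic will auto-create it when the broker's auto.create.topics.enable is on, as it is by default). Topics can also be created programmatically; the sketch below uses the Java AdminClient, which only exists in Kafka clients 0.11+, so it would not work against the 0.10.2 client library used here. It assumes a broker running on localhost:9092.

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class TopicCreator {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        try (AdminClient admin = AdminClient.create(props)) {
            // 1 partition, replication factor 1 -- the same settings as the quickstart topic
            NewTopic topic = new NewTopic("my-topic", 1, (short) 1);
            // Block until the broker confirms the topic was created
            admin.createTopics(Collections.singleton(topic)).all().get();
        }
    }
}
```

On 0.10.2 itself, topic creation from code went through the Scala kafka.admin.AdminUtils class instead.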
Pulling messages back from the server requires a Consumer.
import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("group.id", "test");
props.put("enable.auto.commit", "true");
props.put("auto.commit.interval.ms", "1000");
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Arrays.asList("my-topic"));
while (true) {
    //Poll the broker for new records (timeout in milliseconds)
    ConsumerRecords<String, String> records = consumer.poll(100);
    for (ConsumerRecord<String, String> record : records) {
        System.out.printf("topic = %s, partition = %d, offset = %d, key = %s, value = %s%n",
                record.topic(), record.partition(), record.offset(), record.key(), record.value());
    }
}
Start the consumerTester class from its main method, then start producerTester; the received messages appear in the console.
--- exec-maven-plugin:1.2.1:exec (default-cli) @ KafkaDemo ---
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details
topic = my-topic, partition = 0, offset = 300, key = 0, value = 0
topic = my-topic, partition = 0, offset = 301, key = 1, value = 1
topic = my-topic, partition = 0, offset = 302, key = 2, value = 2
topic = my-topic, partition = 0, offset = 303, key = 3, value = 3
topic = my-topic, partition = 0, offset = 304, key = 4, value = 4
topic = my-topic, partition = 0, offset = 305, key = 5, value = 5
topic = my-topic, partition = 0, offset = 306, key = 6, value = 6
topic = my-topic, partition = 0, offset = 307, key = 7, value = 7
topic = my-topic, partition = 0, offset = 308, key = 8, value = 8
topic = my-topic, partition = 0, offset = 309, key = 9, value = 9
For specific configuration, see the attached maven project.
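For reference, a client demo like the one above typically needs only a single dependency in pom.xml, the kafka-clients artifact, with a version matching the broker used in this walkthrough (the attached project may declare more, e.g. an SLF4J binding to silence the NOP-logger warning shown above):

```xml
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.2.0</version>
</dependency>
```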