Kafka Deployment Guide (Linux)

1. Create the installation directory

mkdir -p /data/kafka

2. Download the package

wget -O /data/kafka/kafka_2.13-2.6.0.tgz  https://apache-mirror.rbc.ru/pub/apache/kafka/2.6.0/kafka_2.13-2.6.0.tgz

The command-line download may fail; in that case, download the package from the official site instead: kafka.apache.org/downloads


3. Extract the archive

cd /data/kafka
tar -xzf kafka_2.13-2.6.0.tgz

4. Enter the Kafka installation directory

cd kafka_2.13-2.6.0

5. Start the Kafka runtime (ZooKeeper first, then the broker)

nohup bin/zookeeper-server-start.sh config/zookeeper.properties > zookeeper.log 2>&1 &
nohup bin/kafka-server-start.sh config/server.properties > kafka.log 2>&1 &
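Before moving on, it is worth confirming that both processes came up. A sketch of one way to check, assuming a JDK (for `jps`) on the PATH and the broker listening on its default port:

```shell
# ZooKeeper runs as QuorumPeerMain, the broker as Kafka
jps | grep -E 'QuorumPeerMain|Kafka'

# The broker should answer metadata requests on its default port
bin/kafka-topics.sh --list --bootstrap-server localhost:9092
```

If either command returns nothing, check `zookeeper.log` and `kafka.log` in the current directory for startup errors.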

6. Create a topic

bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092
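With the topic created, you can exercise it end to end using the console clients that ship with Kafka (a sketch; assumes the broker from step 5 is running on localhost:9092):

```shell
# Type a few lines, one event per line (Ctrl-C to exit)
bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092

# In another terminal, read the events back from the beginning
bin/kafka-console-consumer.sh --topic quickstart-events --from-beginning --bootstrap-server localhost:9092
```

Seeing your lines echoed by the consumer confirms the broker can accept and serve messages.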

7. WordCountDemo

You can test the deployment with the following demo. Replace kafka-broker1 with the IP of the Linux host running Kafka, and note that the input topic TextLinesTopic must exist before the application starts.

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.state.KeyValueStore;

import java.util.Arrays;
import java.util.Properties;

public class WordCountApplication {

    public static void main(final String[] args) throws Exception {
        // Streams configuration; replace kafka-broker1 with your broker's host or IP
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-application");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker1:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Read lines from TextLinesTopic, split them into lowercase words, and count each word
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> textLines = builder.stream("TextLinesTopic");
        KTable<String, Long> wordCounts = textLines
            .flatMapValues(textLine -> Arrays.asList(textLine.toLowerCase().split("\\W+")))
            .groupBy((key, word) -> word)
            .count(Materialized.<String, Long, KeyValueStore<Bytes, byte[]>>as("counts-store"));
        // Publish the running counts to the output topic
        wordCounts.toStream().to("WordsWithCountsTopic", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
    }

}
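The heart of the topology is the per-line transformation: lowercase the line, split on non-word characters, and count each word. That logic can be sanity-checked without a broker; the class name `WordCountLocal` and the `countWords` helper below are illustrative, not part of the Kafka API:

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountLocal {

    // Mirrors the demo's per-line step: lowercase, split on non-word characters, count
    static Map<String, Long> countWords(String line) {
        return Arrays.stream(line.toLowerCase().split("\\W+"))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        // The kind of line a producer might send to TextLinesTopic
        System.out.println(countWords("Hello Kafka hello streams"));
    }
}
```

Running this locally shows the same word counts the Streams application would materialize in its `counts-store` for one input line.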


Author: JacobHuang
Link: https://juejin.cn/post/6941550641185554445

Reposted from blog.csdn.net/daocaokafei/article/details/115188472