Kafka + Flume usage

1. Install and use Kafka
1. Download the Kafka installation package and select the binary download.
2. Unzip

tar -zxvf kafka_2.13-2.6.0.tgz -C ./

After decompression, enter the Kafka directory.
3. Modify the configuration file (under the config directory)

vim server.properties

The broker.id must be unique across the whole cluster


broker.id=0

Allow external connections (advertised.listeners should point to the broker's actual IP)

listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://192.168.xx.xx:9092

ZooKeeper node addresses, separated by commas

zookeeper.connect=localhost:2181
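
For a multi-node ZooKeeper ensemble, the value is a comma-separated list; the hostnames below are placeholders, not from the original setup:

zookeeper.connect=zk1:2181,zk2:2181,zk3:2181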

4. Start ZooKeeper (from the Kafka directory) and run it in the background

bin/zookeeper-server-start.sh -daemon config/zookeeper.properties

After startup, you can see the process QuorumPeerMain

jps
5074 Jps
5032 QuorumPeerMain

5. Start Kafka

bin/kafka-server-start.sh -daemon config/server.properties

After startup, you can see the process Kafka

jps
5446 Kafka
5510 Jps
5032 QuorumPeerMain

----Stop the kafka process

bin/kafka-server-stop.sh
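
The console consumer and producer below both take a topic name. If the topic does not exist yet, it can be created first; a minimal sketch, assuming a single broker and the topic name DjangoLog used later in the Flume section:

----Create a topic

bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic DjangoLog

----List existing topics

bin/kafka-topics.sh --list --bootstrap-server localhost:9092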

----Start the consumer

bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic <topic-name> --from-beginning

----Start the producer

bin/kafka-console-producer.sh --broker-list localhost:9092 --topic <topic-name>

2. Install and use Flume
1. Download the Flume installation package and select the binary download.
2. Unzip

tar -xvf apache-flume-1.9.0-bin.tar

After decompression, enter the Flume directory.
3. Modify the configuration file (under the conf directory)

cp flume-conf.properties.template flume-conf.properties
vim flume-conf.properties
a1.sources = s1
a1.channels = c1
a1.sinks = k1

a1.sources.s1.type = syslogudp
a1.sources.s1.bind = 0.0.0.0
a1.sources.s1.port = 44444

a1.channels.c1.type = memory

a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
# DjangoLog is the topic name
a1.sinks.k1.kafka.topic = DjangoLog
a1.sinks.k1.kafka.bootstrap.servers = 192.168.xx.xx:9092

a1.sources.s1.channels = c1
a1.sinks.k1.channel = c1
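
The memory channel also accepts buffering settings; a small sketch with illustrative values (not part of the original configuration), where capacity is how many events the channel can hold and transactionCapacity is how many events are moved per transaction:

a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100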

4. Start flume

bin/flume-ng agent --conf conf/ --conf-file conf/flume-conf.properties --name a1 &
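
To verify that the agent is running, jps can be used just as in the Kafka steps; the Flume agent appears as Application (its main class is org.apache.flume.node.Application):

jps | grep Application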

5. Test whether Flume can receive messages

telnet 192.168.xx.xxx 44444
Trying 192.168.xx.xxx...
Connected to 192.168.xx.xxx.
Escape character is '^]'.
123
OK
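
Note that telnet uses TCP, while the source configured above is syslogudp; if the connection is refused, a UDP test message can be sent instead, for example with netcat (a sketch reusing the host and port from the source configuration):

echo "123" | nc -u -w1 192.168.xx.xxx 44444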

6. View the messages with the Kafka console consumer

bin/kafka-console-consumer.sh --bootstrap-server localhost:9092  --topic DjangoLog --from-beginning
123

Origin: blog.csdn.net/weixin_44784018/article/details/109488122