Monitoring file changes with Flume + Kafka on Windows 7

Installing Flume

Download the Flume binary package

Unzip it

A JDK is required, with JAVA_HOME configured.
Add a FLUME_HOME environment variable,
and append %FLUME_HOME%\conf;%FLUME_HOME%\bin to Path.

Edit the configuration files

In the conf directory, strip the .template suffix from flume-conf.properties.template, flume-env.ps1.template, and flume-env.sh.template.
In flume-env.sh, set:

export JAVA_HOME=D:/softs/Java/jdk1.8.0_271

In flume-env.ps1, set:

$FLUME_CLASSPATH="D:/softs/apache-flume-1.9.0-bin/lib"

In flume-conf.properties:

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
# a1.sources.r1.type = netcat
# a1.sources.r1.bind = localhost
# a1.sources.r1.port = 44444
a1.sources.r1.type = exec
a1.sources.r1.command = D:/softs/tail/tail.exe -f D:/home/logs/TestWeb.log
a1.sources.r1.fileHeader = true
a1.sources.r1.deserializer.outputCharset=UTF-8

# Describe the sink
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.topic = flume-data
a1.sinks.k1.kafka.bootstrap.servers = 192.168.88.62:9092
a1.sinks.k1.kafka.flumeBatchSize = 20
a1.sinks.k1.kafka.producer.acks = 1
a1.sinks.k1.kafka.producer.linger.ms = 1
a1.sinks.k1.kafka.producer.compression.type = snappy

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
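The memory channel above buffers events between the exec source and the Kafka sink: capacity caps the total number of events held in memory, while transactionCapacity caps how many events a single put or take transaction may move. A rough Python sketch of those two limits (an illustrative model, not Flume's actual implementation):

```python
from collections import deque

class MemoryChannel:
    """Illustrative model of Flume's memory channel limits."""

    def __init__(self, capacity=1000, transaction_capacity=100):
        self.capacity = capacity
        self.transaction_capacity = transaction_capacity
        self.buffer = deque()

    def put(self, events):
        # A single put transaction may not exceed transactionCapacity,
        # and the channel as a whole may not exceed capacity.
        if len(events) > self.transaction_capacity:
            raise ValueError("transaction too large")
        if len(self.buffer) + len(events) > self.capacity:
            raise RuntimeError("channel full")
        self.buffer.extend(events)

    def take(self, max_events):
        # The sink drains at most transactionCapacity events per transaction.
        n = min(max_events, self.transaction_capacity, len(self.buffer))
        return [self.buffer.popleft() for _ in range(n)]
```

If the source produces faster than the sink can drain, the buffer fills up to capacity and further puts fail, which is why the memory channel trades durability for speed.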

A Windows build of the tail utility is required here (tail tool download).
Then change into Flume's bin directory.

Running Flume

flume-ng agent -c ../conf -f ../conf/flume-conf.properties -n a1 -property "flume.root.logger=INFO,console"

Once the agent starts successfully,
the messages can be consumed from the Kafka topic flume-data:

 bin/kafka-console-consumer.sh --bootstrap-server 192.168.88.62:9092 --topic flume-data

You can append a message to the log file by hand:
Type Test Message, then press Enter and save.
The Kafka consumer then shows the message:

[master@localhost kafka_2.12-2.6.0]# bin/kafka-console-consumer.sh --bootstrap-server 192.168.88.62:9092 --topic flume-data
Test Message


Reposted from blog.csdn.net/ASAS1314/article/details/110874238