Storm pipeline: connecting Flume and Kafka

For background on connecting Flume and Kafka, see the reference post "Integrating Flume, Kafka, Storm, and MySQL".
The related resources are there as well: the flume2kafka jar packages and configuration files.
To connect Flume and Kafka, create a .conf file under the flume/conf directory and add the required jar packages to the lib directory.
Steps:
1. Create the .conf file under the flume/conf directory.
(1) Create the flume2kafka.conf file

vi flume2kafka.conf

(2) Add the following content to the flume2kafka.conf file

############################################
#  producer config
############################################

#agent section
producer.sources = s
producer.channels = c
producer.sinks = r

#source section
#set how the source reads data (via an exec command)
producer.sources.s.type = exec
#set the location of the data and the command that reads it (tail -f -n+1 /home/storm/work/access.log)
producer.sources.s.command = tail -f -n+1 /home/storm/work/access.log
producer.sources.s.channels = c

# Each sink's type must be defined
producer.sinks.r.type = org.apache.flume.plugins.KafkaSink
producer.sinks.r.metadata.broker.list=master:9092
producer.sinks.r.partition.key=0
producer.sinks.r.partitioner.class=org.apache.flume.plugins.SinglePartition
producer.sinks.r.serializer.class=kafka.serializer.StringEncoder
producer.sinks.r.request.required.acks=0
producer.sinks.r.max.message.size=1000000
producer.sinks.r.producer.type=sync
producer.sinks.r.custom.encoding=UTF-8
#set the Kafka topic to: flume2kafka
producer.sinks.r.custom.topic.name=flume2kafka

#Specify the channel the sink should use
producer.sinks.r.channel = c

# Each channel's type is defined.
producer.channels.c.type = memory
producer.channels.c.capacity = 1000
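
With this configuration, the agent tails /home/storm/work/access.log with an exec source, buffers events in a memory channel, and forwards each line to the flume2kafka topic through the KafkaSink. Before starting the agent it can help to make sure the tailed file exists and has data; the snippet below is only a sketch that appends a few test lines to the log path used in the config above:

# create the watched file and append some test lines for the exec source to pick up
mkdir -p /home/storm/work
for i in 1 2 3; do echo "test message $i $(date +%s)" >> /home/storm/work/access.log; done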

2. Add the required jar packages to the lib directory (a copy command is sketched after the list)

kafka_2.9.2-0.8.0-beta1.jar
metrics-annotation-2.2.0.jar
scala-compiler-2.9.2.jar
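
These jars come from the flume2kafka resource package linked above. Copying them into Flume's lib directory is all the agent needs to load the KafkaSink class. A minimal sketch, assuming the jars were unpacked into a local directory named flume2kafka-libs and that FLUME_HOME points at the Flume installation (both names are placeholders):

# copy the Kafka client jars and the KafkaSink plugin jar onto Flume's classpath
cp flume2kafka-libs/*.jar $FLUME_HOME/lib/
# confirm Flume can see them
ls $FLUME_HOME/lib | grep -E 'kafka|metrics|scala'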

3. Start the Flume agent

bin/flume-ng agent -n producer -f conf/flume2kafka.conf  -Dflume.root.logger=INFO,console >>logs/flume.log 2>&1 &
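
Since the command backgrounds the agent and redirects its output into logs/flume.log, check that log to confirm the agent and the KafkaSink started cleanly (the file name follows the redirection above):

# look for the agent start-up messages and any KafkaSink errors
tail -n 50 logs/flume.log
# or confirm the agent process is still running
ps -ef | grep flume | grep -v grep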

4. Start Kafka and a Kafka consumer (to check whether data is coming through)
(1) Start Kafka

sbin/start-kafka.sh
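
Note: start-kafka.sh appears to be a site-specific wrapper script. On a stock Kafka 0.8 installation the equivalent is the bundled server start script (ZooKeeper must already be running on master:2181, since both the broker and the consumer below depend on it):

bin/kafka-server-start.sh config/server.properties &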

(2) Start the consumer

bin/kafka-console-consumer.sh --zookeeper master:2181 --topic flume2kafka --from-beginning
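
Once the consumer is attached to the flume2kafka topic, an end-to-end check is to append a line to the tailed log and watch it arrive in the console; this sketch reuses the paths and topic from the configuration above:

# write a line into the file watched by the Flume exec source
echo "end-to-end test $(date)" >> /home/storm/work/access.log
# the line should appear in the console consumer within a few seconds;
# the topic can also be exercised directly, bypassing Flume:
bin/kafka-console-producer.sh --broker-list master:9092 --topic flume2kafka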

Reposted from blog.csdn.net/levi_moon/article/details/51637938