Flume: connecting multiple agents (configuration files)

-------

Agent 1 reads data from a tail command and sends it to an avro port.

On another node, a second agent is configured with an avro source to relay the data and forward it to external storage.
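The overall flow across the two agents, sketched from the configs below:

exec source (tail -F) -> memory channel -> avro sink
        ==> avro RPC to shizhan02:4141 ==>
avro source (0.0.0.0:4141) -> memory channel -> logger sink (or hdfs for real storage)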

tail-avro.conf

##################
# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1


# Describe/configure the source
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /home/hadoop/log/test.log


# Describe the sink
# The avro sink is the sending side (an avro client): it does not bind to this
# machine but connects to the avro service on the shizhan02 host and pushes events there.
a1.sinks.k1.type = avro
a1.sinks.k1.hostname = shizhan02
a1.sinks.k1.port = 4141
a1.sinks.k1.batch-size = 2


# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100


# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
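
Start the receiving agent (next section) first so this avro sink has a server to connect to, then launch the sender. A command sketch, assuming the file above is saved as conf/tail-avro.conf:

bin/flume-ng agent -c conf -f conf/tail-avro.conf -n a1 -Dflume.root.logger=INFO,console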


-------

Receive data from the avro port and sink it to the logger.

bin/flume-ng agent -c conf -f conf/avro-logger.conf -n a1 -Dflume.root.logger=INFO,console
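
Once the receiving agent is up, a quick way to confirm the avro source is listening on its port (a generic check, not from the original post):

netstat -nltp | grep 4141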
#########


Collection configuration file, avro-logger.conf:


# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1


# Describe/configure the source
# The avro source is the receiving service; it binds to this machine
a1.sources.r1.type = avro
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 4141


# Describe the sink
a1.sinks.k1.type = logger


# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100


# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
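
The intro mentions forwarding the relayed data to external storage; in that setup the logger sink above would typically be replaced by an hdfs sink. A minimal sketch of that sink section (the namenode address, path, and roll settings are illustrative assumptions, not from the original post):

# Describe the sink (hdfs variant)
a1.sinks.k1.type = hdfs
# assumed namenode address and target directory
a1.sinks.k1.hdfs.path = hdfs://shizhan01:9000/flume/events/%y-%m-%d/
a1.sinks.k1.hdfs.filePrefix = events-
a1.sinks.k1.hdfs.fileType = DataStream
# roll a new file every 60 s or 128 MB, never by event count
a1.sinks.k1.hdfs.rollInterval = 60
a1.sinks.k1.hdfs.rollSize = 134217728
a1.sinks.k1.hdfs.rollCount = 0
# use the agent's local time for the %y-%m-%d escape sequences
a1.sinks.k1.hdfs.useLocalTimeStamp = true
a1.sinks.k1.channel = c1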




Send data:
$ bin/flume-ng avro-client -H localhost -p 4141 -F /usr/logs/log.10
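
To exercise the full tail -> avro -> logger chain instead, append lines to the file the exec source tails on the sender host (a sketch; the path matches tail-avro.conf above):

while true; do echo "flume test $(date)" >> /home/hadoop/log/test.log; sleep 1; done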

Reposted from blog.csdn.net/peng_0129/article/details/80793440