Sending log4j logs to Flume: filtering with an interceptor and storing them in HDFS, partitioned by date

Copyright notice: this is an original article by the blogger, released under the CC 4.0 BY-SA license. Please include the original source link and this notice when reposting.
Original link: https://blog.csdn.net/weixin_41876523/article/details/90916085

1. Modify the log4j configuration: the sending side must first add the appender dependency

<dependency>
    <groupId>org.apache.flume.flume-ng-clients</groupId>
    <artifactId>flume-ng-log4jappender</artifactId>
    <version>1.6.0</version>
</dependency>

Then add the Flume appender to log4j.properties:

log4j.rootLogger = INFO,stdout,flume
log4j.appender.flume=org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname=192.168.190.131
log4j.appender.flume.Port=5555
log4j.appender.flume.UnsafeMode=true
log4j.appender.flume.layout=org.apache.log4j.PatternLayout
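With the dependency and the properties above in place, any class that logs through log4j ships its events to the Flume agent. A minimal sketch (the class name and messages are hypothetical; it assumes log4j 1.x and the flume-ng-log4jappender jar are on the classpath, and a Flume agent listening on 192.168.190.131:5555):

```java
import org.apache.log4j.Logger;

public class FlumeLogDemo {
    private static final Logger LOG = Logger.getLogger(FlumeLogDemo.class);

    public static void main(String[] args) {
        // Each call becomes one Avro event delivered to the "flume" appender.
        // Prefixing the message with "TYPE|" is an assumed convention here:
        // the regex interceptor below copies the text before the first '|'
        // into the log4jEventType header used by the HDFS sink path.
        LOG.info("INFO|application started");
        LOG.error("ERROR|something went wrong");
    }
}
```

Because UnsafeMode=true is set, the appender does not throw when the agent is unreachable, so the application keeps running even if Flume is down.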

2. Configure the Flume agent conf file

a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = avro
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 5555

a1.sources.r1.interceptors = i1
a1.sources.r1.interceptors.i1.type = regex_extractor                                                                                                              
a1.sources.r1.interceptors.i1.regex = (^[^|]*|$)
a1.sources.r1.interceptors.i1.serializers = s1
a1.sources.r1.interceptors.i1.serializers.s1.name = log4jEventType

# Describe the sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = /log4j/flume/%{log4jEventType}/%Y%m
a1.sinks.k1.hdfs.fileType=DataStream
a1.sinks.k1.hdfs.filePrefix = log4j-
a1.sinks.k1.hdfs.rollInterval = 86400
a1.sinks.k1.hdfs.rollSize = 0
a1.sinks.k1.hdfs.rollCount = 0
a1.sinks.k1.hdfs.useLocalTimeStamp = true

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100
 
# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
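Once the conf file is saved (the filename log4j-agent.conf below is just an example), the agent can be started with the standard flume-ng launcher:

```shell
# Start agent a1 in the foreground, logging to the console
bin/flume-ng agent \
  --conf conf \
  --conf-file conf/log4j-agent.conf \
  --name a1 \
  -Dflume.root.logger=INFO,console
```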

Pitfalls you may run into

The pattern in a1.sources.r1.interceptors.i1.regex = (^[^|]*|$) must be wrapped in parentheses: regex_extractor only exposes capture groups, so without the parentheses there is no group for serializer s1 to map into the log4jEventType header.
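To see what group 1 of that pattern actually captures, here is a plain java.util.regex check (class name hypothetical) that mirrors what regex_extractor hands to serializer s1:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RegexExtractorDemo {
    // Same pattern as the interceptor; group 1 becomes the header value.
    private static final Pattern P = Pattern.compile("(^[^|]*|$)");

    public static String extract(String body) {
        Matcher m = P.matcher(body);
        return m.find() ? m.group(1) : "";
    }

    public static void main(String[] args) {
        // Everything before the first '|' is captured, so this event
        // would land under /log4j/flume/INFO/...
        System.out.println(extract("INFO|user logged in"));  // INFO
    }
}
```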

If the path in a1.sinks.k1.hdfs.path = /log4j/flume/%{log4jEventType}/%Y%m contains date escape sequences such as %Y%m, the sink needs a timestamp to resolve them: either set a1.sinks.k1.hdfs.useLocalTimeStamp = true, or add a timestamp header to each event yourself.
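The %Y%m escapes are resolved by the sink from that timestamp, millisecond-precision epoch time. A small stdlib sketch of the equivalent formatting (the timestamp value is an arbitrary example):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class MonthDirDemo {
    // Mirrors the HDFS sink's %Y%m escape: four-digit year + two-digit month.
    public static String monthDir(long timestampMillis) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyyMM");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        return fmt.format(new Date(timestampMillis));
    }

    public static void main(String[] args) {
        // 1559692800000 ms = 2019-06-05 00:00:00 UTC
        System.out.println(monthDir(1559692800000L));  // 201906
    }
}
```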
