Offline data analysis: Kafka + Flume + HDFS

       Once data has been collected into Kafka, it can be analyzed either offline or in real time. Some workloads, such as building user profiles, are better suited to offline analysis. Offline analysis requires landing the data from Kafka into HDFS; here Flume is used to move data from Kafka into HDFS. The Flume agent configuration file (kafkaToHdfs.conf):

# ------------------- Define the data flow ----------------------
# Component names: source, channel, sink
a1.sources = s1
a1.channels = c1
a1.sinks = k1
#-------- KafkaSource settings -----------------
a1.sources.s1.type = org.apache.flume.source.kafka.KafkaSource
a1.sources.s1.channels = c1
a1.sources.s1.batchSize = 5000
a1.sources.s1.kafka.bootstrap.servers = mini02:9092,mini03:9092,mini04:9092
a1.sources.s1.kafka.topics = gsNgixTopic01
a1.sources.s1.kafka.consumer.group.id = flumetest02
#--------- HDFS sink settings ------------------
a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = hdfs://mini02:9000/wc/%{topic}/%y-%m-%d
a1.sinks.k1.hdfs.rollSize = 0
a1.sinks.k1.hdfs.rollCount = 0
a1.sinks.k1.hdfs.rollInterval = 5
a1.sinks.k1.hdfs.threadsPoolSize = 30
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.writeFormat = Text
#------- Memory channel settings -------------------------
a1.channels.c1.type = memory
a1.channels.c1.capacity = 100000
a1.channels.c1.transactionCapacity = 10000
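The hdfs.path in the config above mixes a header escape (%{topic}, filled from the topic header that KafkaSource attaches to each event) with date escapes (%y-%m-%d), so output is partitioned per topic and per day. A minimal Python sketch of how such a pattern resolves, covering only the escapes used in this config (the helper function is illustrative, not a Flume API):

```python
from datetime import datetime

def resolve_hdfs_path(pattern: str, headers: dict, ts: datetime) -> str:
    """Resolve the subset of Flume HDFS-sink escapes used above:
    %{header} substitutions plus the %y, %m, %d date escapes."""
    path = pattern
    # %{name} is taken from the event headers (KafkaSource sets "topic")
    for name, value in headers.items():
        path = path.replace("%{" + name + "}", value)
    # Date escapes are driven by the event timestamp
    path = path.replace("%y", ts.strftime("%y"))
    path = path.replace("%m", ts.strftime("%m"))
    path = path.replace("%d", ts.strftime("%d"))
    return path

pattern = "hdfs://mini02:9000/wc/%{topic}/%y-%m-%d"
print(resolve_hdfs_path(pattern, {"topic": "gsNgixTopic01"}, datetime(2018, 4, 15)))
# -> hdfs://mini02:9000/wc/gsNgixTopic01/18-04-15
```

With rollSize and rollCount set to 0, only rollInterval applies, so the sink closes and starts a new file in that directory every 5 seconds.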

# Note the a1.sources.s1.kafka.consumer.group.id setting: agents that share a group.id split the topic's partitions between them
# Start the Flume agent
./bin/flume-ng agent -n a1 -c conf/ -f conf/tempConf/kafkaToHdfs.conf
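The consumer.group.id setting called out above matters when scaling out: consumers that share a group.id split the topic's partitions among themselves, while consumers in different groups each receive every partition. A minimal Python sketch of that partition-splitting idea (an illustration only, not Kafka's actual assignor; the round-robin strategy here is an assumption):

```python
def assign_partitions(partitions, consumers):
    """Split a topic's partitions among the consumers of ONE group,
    round-robin style. Consumers in different groups would each be
    handed the full partition list instead."""
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(partitions):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

# Two Flume agents sharing group.id "flumetest02": each reads half
# of the six partitions, so every record is written to HDFS once.
print(assign_partitions(list(range(6)), ["agent-a", "agent-b"]))
# -> {'agent-a': [0, 2, 4], 'agent-b': [1, 3, 5]}
```

So running a second agent with the same group.id spreads the load without duplicating data in HDFS; giving it a different group.id would instead write a second full copy.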



Reposted from blog.csdn.net/hefrankeleyn/article/details/79954125