Writing from Flume to Kafka

1. On the Flume node, create fowardkafka.conf, the configuration file for writing to Kafka

[root@slave4] /usr/local/flume$ vim conf/fowardkafka.conf

Flume2KafkaAgent.sources=mysource
Flume2KafkaAgent.channels=mychannel
Flume2KafkaAgent.sinks=mysink

Flume2KafkaAgent.sources.mysource.type=spooldir
Flume2KafkaAgent.sources.mysource.channels=mychannel
Flume2KafkaAgent.sources.mysource.spoolDir=/root/flume-data

Flume2KafkaAgent.sinks.mysink.channel=mychannel
Flume2KafkaAgent.sinks.mysink.type=org.apache.flume.sink.kafka.KafkaSink
# Kafka broker list (a trailing comment on the property line itself would be parsed as part of the value)
Flume2KafkaAgent.sinks.mysink.kafka.bootstrap.servers=192.168.255.121:9092,192.168.255.122:9092,192.168.255.123:9092
Flume2KafkaAgent.sinks.mysink.kafka.topic=flume-data
Flume2KafkaAgent.sinks.mysink.flumeBatchSize=20
Flume2KafkaAgent.sinks.mysink.kafka.producer.acks=1

Flume2KafkaAgent.channels.mychannel.type=memory
Flume2KafkaAgent.channels.mychannel.capacity=30000
Flume2KafkaAgent.channels.mychannel.transactionCapacity=100
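
Before starting the agent, make sure the flume-data topic exists on the Kafka cluster, unless auto.create.topics.enable is turned on for the brokers. A minimal sketch, run from the Kafka installation directory used in step 4 and using the same ZooKeeper address; the partition and replication-factor values are illustrative:

[root@slave1] /usr/local/kafka$ bin/kafka-topics.sh --create --zookeeper 192.168.255.121:2181 --replication-factor 2 --partitions 3 --topic flume-data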

2. Start the Flume agent (it acts as the Kafka producer)

[root@slave4] /usr/local/flume$ bin/flume-ng agent -c conf -f conf/fowardkafka.conf -n Flume2KafkaAgent -Dflume.root.logger=INFO,console
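
This runs the agent in the foreground and logs to the console, which is convenient for a first test. To keep the agent running after you close the shell, one option is to background it with nohup (a sketch; the log file name is arbitrary):

[root@slave4] /usr/local/flume$ nohup bin/flume-ng agent -c conf -f conf/fowardkafka.conf -n Flume2KafkaAgent > flume-agent.log 2>&1 &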

3. Simulate log generation

[root@slave4] ~$ mkdir flume-data
[root@slave4] ~$ vim createData.sh 
#!/bin/bash
# Append one line per second; Flume's spooldir source picks the file up from /root/flume-data.
for i in {1..1000}
do
    sleep 1
    echo "exec$i" >> /root/flume-data/test.log
done

[root@slave4] ~$ chmod +x createData.sh
[root@slave4] ~$ ./createData.sh
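
One caveat: the spooldir source treats files in spoolDir as complete and immutable, so appending to test.log after Flume has already picked it up can make the agent throw an exception. A more robust variant of the generator, sketched below, writes the file outside the spool directory and only moves it in once it is finished (the file names are illustrative):

#!/bin/bash
# Build the whole file outside spoolDir first, then rename it in
# (same filesystem, so the move is atomic); the spooldir source
# then only ever sees a finished file.
for i in {1..1000}
do
    echo "exec$i" >> /root/test.log
done
mv /root/test.log /root/flume-data/test-$(date +%s).log

Once Flume has ingested a file, it renames it with a .COMPLETED suffix by default, which you can verify with ls /root/flume-data.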

4. On a Kafka node, start a console consumer to read the messages Flume produced

[root@slave1] /usr/local/kafka$ bin/kafka-console-consumer.sh --zookeeper 192.168.255.121:2181 --topic flume-data --from-beginning
Using the ConsoleConsumer with old consumer is deprecated and will be removed in a future major release. Consider using the new consumer by passing [bootstrap-server] instead of [zookeeper].
exec1
exec2
exec3
exec4
exec5
exec6
exec7
exec8
exec9
exec10
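
As the deprecation warning above suggests, newer Kafka releases expect the console consumer to talk to the brokers directly instead of going through ZooKeeper. The equivalent command with the new consumer, using one of the broker addresses from the sink configuration:

[root@slave1] /usr/local/kafka$ bin/kafka-console-consumer.sh --bootstrap-server 192.168.255.121:9092 --topic flume-data --from-beginning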

Source: blog.csdn.net/fanren224/article/details/84063894