Flume + Flume + Kafka Messaging

Flume collects monitoring data on one machine and forwards it to a Flume agent on another machine, which sends it on to Kafka for consumption.

Environment: Flume is installed on the slave; Flume + Kafka are installed on the master (two virtual machines are used here; three or more would also work).

master IP: 192.168.83.128    slave IP: 192.168.83.129

Flume monitors the file test.log for changes; the collected changes are sent to the master and then passed to Kafka for consumption.

1. On the slave, configure example.conf in Flume's conf directory (create the file if it does not exist).

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = exec
# Monitor changes to the file test.log
a1.sources.r1.command = tail -F /home/qq/pp/data/test.log
a1.sources.r1.channels = c1

# Describe the sink
## The avro sink sends the data on to another agent
a1.sinks.k1.type = avro
## Sink to port 44444 on the master machine
## (use the master's IP; 192.168.83.128 in this environment)
a1.sinks.k1.hostname = 192.168.83.128
a1.sinks.k1.port = 44444
a1.sinks.k1.batch-size = 2

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
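The exec source above simply runs `tail -F` and turns each new line into an event. That behaviour can be exercised without Flume; the sketch below is a simplified `tail -f` in Python (the `follow` helper is illustrative, not part of Flume, and unlike `tail -F` it does not handle file rotation):

```python
import os
import time

def follow(path, poll_interval=0.1, max_polls=20):
    """Yield lines appended to `path` after this call, like `tail -f`.

    Flume's exec source uses `tail -F`, which additionally re-opens the
    file if it is rotated; this simplified sketch does not handle that.
    Stops after `max_polls` consecutive empty polls so it can terminate.
    """
    f = open(path, "r")
    f.seek(0, os.SEEK_END)  # skip existing content; only new lines matter

    def _lines():
        polls = 0
        while polls < max_polls:
            line = f.readline()
            if line:
                polls = 0
                yield line.rstrip("\n")
            else:
                polls += 1
                time.sleep(poll_interval)
        f.close()

    return _lines()
```

Each yielded line corresponds to one Flume event handed to the channel.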

2. On the master, configure example.conf in flume/conf.

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
## The avro source acts as a service that receives the data
a1.sources.r1.type = avro
a1.sources.r1.channels = c1
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 44444

# Describe the sink
# a1.sinks.k1.type = logger
# The sink is configured to use Kafka for data consumption
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink 
a1.sinks.k1.topic = flume_kafka
a1.sinks.k1.brokerList = 192.168.83.128:9092,192.168.83.129:9092
a1.sinks.k1.requiredAcks = 1
a1.sinks.k1.batchSize = 20

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
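Both agents use a memory channel with capacity = 1000 and transactionCapacity = 100: up to 1000 events are buffered in RAM, and the sink drains them in transactions of at most 100 events. A toy model of those semantics (an illustration only, not Flume's actual implementation):

```python
from collections import deque

class MemoryChannel:
    """Toy model of Flume's memory channel: a bounded in-memory buffer
    drained in transactions of bounded size."""

    def __init__(self, capacity=1000, transaction_capacity=100):
        self.capacity = capacity
        self.transaction_capacity = transaction_capacity
        self._events = deque()

    def put(self, event):
        # A full channel rejects the source's put (back-pressure).
        if len(self._events) >= self.capacity:
            raise RuntimeError(
                "channel full: capacity %d reached" % self.capacity)
        self._events.append(event)

    def take_batch(self):
        # The sink takes at most transaction_capacity events per transaction.
        batch = []
        while self._events and len(batch) < self.transaction_capacity:
            batch.append(self._events.popleft())
        return batch
```

If the sinks cannot drain events as fast as the sources produce them, the channel fills up and puts start failing, so capacity should be sized for the expected burst rate.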

 

3. Write strings to the monitored file (a loop writes them continuously, so there is no need to modify test.log by hand).

[root@s1 ~]# cd /home/qq/pp/data
[root@s1 data]# while true
> do
> echo "toms" >> test.log
> sleep 1
> done
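If a shell loop is inconvenient, the same writer can be a small Python script (the `write_lines` helper is illustrative; path and text match the example above, and a `count` parameter is added so it can also stop after a fixed number of lines):

```python
import time

def write_lines(path, text="toms", count=None, interval=1.0):
    """Append `text` to `path` once per `interval` seconds.

    With count=None this loops forever, like the shell loop above;
    pass a number to stop after that many lines.
    """
    written = 0
    with open(path, "a") as f:
        while count is None or written < count:
            f.write(text + "\n")
            f.flush()  # make the line visible to `tail -F` immediately
            written += 1
            time.sleep(interval)
```

Calling `write_lines("/home/qq/pp/data/test.log")` reproduces the shell loop.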

4. Check that the loop above is writing

#cd /home/qq/pp/data
#tail -f test.log

5. Start Flume on the master (the message receiver)

Enter the Flume installation directory and run:

bin/flume-ng agent -c conf -f conf/example.conf -n a1 -Dflume.root.logger=INFO,console

The console will now print some log output.

6. Start Flume on the slave

Enter the Flume installation directory and run:

bin/flume-ng agent -c conf -f conf/example.conf -n a1 -Dflume.root.logger=INFO,console

7. On the master, enter the Kafka installation directory

    1) Start ZooKeeper

      bin/zookeeper-server-start.sh -daemon config/zookeeper.properties

    2) Start the Kafka service

      bin/kafka-server-start.sh -daemon config/server.properties 

    3) Create a topic

bin/kafka-topics.sh --create --topic flume_kafka --zookeeper 192.168.83.129:2181,192.168.83.128:2181 --partitions 2 --replication-factor 1

    4) Create a consumer

bin/kafka-console-consumer.sh --bootstrap-server 192.168.83.128:9092,192.168.83.129:9092 --topic flume_kafka --from-beginning

    5) The consumer window will then print the messages as they are written.


Later, Storm will be added on top of this to actually consume the data, so stay tuned.




Origin www.cnblogs.com/51python/p/10963699.html