Collecting log4j logs into HDFS with Flume


1.    Architecture used for data collection from the web server cluster

(The architecture diagram from the original post is not reproduced here: a Flume agent on each web server forwards events over Avro to a consolidation agent, which writes them to HDFS.)

2.    Start a Flume agent on each web server (Flume 1.3.1: http://flume.apache.org/download.html). The startup command is: ./bin/flume-ng agent --conf-file ./conf/flume.conf --name a1 -Dflume.root.logger=INFO,console

The flume.conf file is as follows:

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1


# Describe/configure the source
a1.sources.r1.type = avro
a1.sources.r1.bind = localhost
a1.sources.r1.port = 41414

# Describe the sink: forward events to the consolidation server over avro
# (a sink can have only one type; the leftover "logger" type line from the
# template has been removed)
a1.sinks.k1.type = avro
a1.sinks.k1.hostname = consolidationServerHost
a1.sinks.k1.port = 41414


# Use a channel which buffers events in file
a1.channels.c1.type = file
a1.channels.c1.checkpointDir = /mnt/flume/checkpoint
a1.channels.c1.dataDirs = /mnt/flume/data

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
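The file channel persists buffered events to disk, so an agent restart does not lose data that has not yet reached the consolidation server. If the defaults prove too small for the site's log volume, the channel's capacity and per-transaction batch size can be raised; the property names below are standard file channel settings, but the values are illustrative only:

```properties
# illustrative values -- size these to the expected event volume
a1.channels.c1.capacity = 1000000
a1.channels.c1.transactionCapacity = 10000
```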

3.    Start a Flume agent on the log consolidation server to merge the logs sent over by the Flume agents on the web servers. Its flume.conf file is as follows:

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1


# Describe/configure the source
a1.sources.r1.type = avro
# bind to all interfaces so the remote web server agents can connect
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 41414
a1.sources.r1.interceptors = i1
a1.sources.r1.interceptors.i1.type = timestamp

#hdfs sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://hostname:9000/path/to/log/dir/%Y-%m-%d/%H
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.filePrefix = appName-
a1.sinks.k1.hdfs.rollInterval = 3600


# Use a channel which buffers events in file
a1.channels.c1.type = file
a1.channels.c1.checkpointDir = /mnt/flume/checkpoint
a1.channels.c1.dataDirs = /mnt/flume/data

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
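Note that rollInterval is not the only roll trigger: the HDFS sink also rolls files by size and by event count, and both default to small values (1024 bytes and 10 events), which would still produce many tiny files despite the hourly interval above. If only time-based rolling is wanted, the other two triggers can be disabled explicitly:

```properties
# disable size- and count-based rolling; roll on the hourly interval only
a1.sinks.k1.hdfs.rollSize = 0
a1.sinks.k1.hdfs.rollCount = 0
```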


4.    Add the following to the log4j.properties file on each web server:

log4j.rootLogger=INFO,flume

log4j.appender.flume=org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname=localhost
log4j.appender.flume.Port=41414

5.    Add the following jars from the Flume distribution to the app's lib directory on each web server:
avro-1.7.2.jar
avro-ipc-1.7.2.jar
flume-ng-log4jappender-1.3.1.jar
flume-ng-sdk-1.3.1.jar
jackson-core-asl-1.9.3.jar
jackson-mapper-asl-1.9.3.jar
log4j-1.2.16.jar
netty-3.4.0.Final.jar
slf4j-api-1.6.1.jar
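If the web app is built with Maven, the client-side jars can be pulled in as a single dependency instead of being copied by hand; the coordinates below are for the 1.3.1 log4j appender artifact, whose transitive dependencies should cover the avro, jackson, netty and slf4j jars listed above:

```xml
<dependency>
  <groupId>org.apache.flume.flume-ng-clients</groupId>
  <artifactId>flume-ng-log4jappender</artifactId>
  <version>1.3.1</version>
</dependency>
```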

6.    The test program is as follows:

package flume.log4j.test;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.Charset;
import org.apache.log4j.Logger;

public class LogTestApp {

  public static void main(String[] args) throws IOException {
    Logger logger = Logger.getLogger(LogTestApp.class);
    BufferedReader in = new BufferedReader(
        new InputStreamReader(System.in, Charset.forName("UTF-8")));
    String line;

    System.out.println("Initializing Flume log4j appender test.");
    System.out.println("Each line entered will be sent to Flume.");

    // send this line to Flume
    logger.info("LogTestApp initialized");

    while ((line = in.readLine()) != null) {
      System.out.println("Sending to log4j: " + line);
      logger.info(line);
    }
  }
}


Reposted from llystar.iteye.com/blog/1879578