Log4j log collection (Filebeat + Logstash + Elasticsearch)

1. Overview


I've been learning Logstash recently, and there is a lot to cover. Beyond the basics, most of the work is learning the plugins. The official site lists the following plugins for the latest version, 6.3:
- Input plugins:

beats,cloudwatch,couchdb_changes,dead_letter_queue,elasticsearch,exec,file,ganglia,gelf,generator,github,google_pubsub,graphite,heartbeat,http,http_poller,imap,irc,jdbc,jms,jmx,kafka,kinesis,log4j,lumberjack,meetup,pipe,puppet_facter,rabbitmq,redis,relp,rss,s3,salesforce,snmptrap,sqlite,sqs,stdin,stomp,syslog,tcp,twitter,udp,unix,varnishlog,websocket,wmi,xmpp

- Output plugins:

boundary,circonus,cloudwatch,csv,datadog,datadog_metrics,elasticsearch,email,exec,file,ganglia,gelf,google_bigquery,graphite,graphtastic,http,influxdb,irc,juggernaut,kafka,librato,loggly,lumberjack,metriccatcher,mongodb,nagios,nagios_nsca,opentsdb,pagerduty,pipe,rabbitmq,redis,redmine,riak,riemann,s3,sns,solr_http,sqs,statsd,stdout,stomp,syslog,tcp,timber,udp,webhdfs,websocket,xmpp,zabbix

- Filter plugins:

aggregate,alter,cidr,cipher,clone,csv,date,de_dot,dissect,dns,drop,elapsed,elasticsearch,environment,extractnumbers,fingerprint,geoip,grok,i18n,jdbc_static,jdbc_streaming,json,json_encode,kv,metricize,metrics,mutate,prune,range,ruby,sleep,split,syslog_pri,throttle,tld,translate,truncate,urldecode,useragent,uuid,xml

- Codec plugins:

avro,cef,cloudfront,collectd,dots,edn,edn_lines,es_bulk,fluent,graphite,gzip_lines,json,json_lines,line,msgpack,multiline,netflow,nmap,plain,protobuf,rubydebug

A special note: I tested the original demo (Logstash's log4j input) and never received any logs. It turns out Logstash has deprecated that feature, which wasted a lot of time; in newer versions, Filebeat is used to pick up the log4j output instead!
So the new plan is: install Filebeat on the client to collect the logs and ship them to Logstash, then store them in Elasticsearch, and finally display them in Kibana.

2. Log4j basics


Log4j is a reliable, fast, and flexible logging framework (API) written in Java, released under the Apache Software License. It has been ported to C, C++, C#, Perl, Python, Ruby, and Eiffel.

Log4j is highly configurable at runtime through external configuration files. It records logs by priority level and can direct the output to many destinations, such as databases, files, the console, and Unix syslog.

Log4j consists of three main components (a minimal configuration sketch follows this list):
- loggers: capture log messages.
- appenders: publish log messages to different destinations.
- layouts: format log messages in various styles.
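
The three parts map directly onto lines of a log4j.properties file; for instance (the appender name and pattern here are illustrative only):

# logger: the root logger records DEBUG and above and feeds one appender, "stdout"
log4j.rootLogger=DEBUG, stdout
# appender: decides where events go (here, the console)
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
# layout: decides how each event is formatted
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} [%p] %m%n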

3. Create the Java project


Let's learn how to configure log4j through an actual project. (project directory structure screenshot omitted)

Add the log4j dependency, version 1.2.17, to pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.logs</groupId>
    <artifactId>log4idemo1</artifactId>
    <version>1.0-SNAPSHOT</version>
    <!-- https://mvnrepository.com/artifact/log4j/log4j -->
    <dependencies>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>
    </dependencies>

</project>

Create log4j.properties under the resources directory with the following content:

### settings ###
log4j.rootLogger = debug,stdout,D,E,logstash

### print log messages to the console ###
log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target = System.out
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern = [%-5p] %d{yyyy-MM-dd HH:mm:ss,SSS} method:%l%n%m%n

### write DEBUG-and-above messages to debug.log ###
log4j.appender.D = org.apache.log4j.DailyRollingFileAppender
log4j.appender.D.File = /Users/bee/Documents/elk/log4j/debug.log
log4j.appender.D.Append = true
log4j.appender.D.Threshold = DEBUG 
log4j.appender.D.layout = org.apache.log4j.PatternLayout
log4j.appender.D.layout.ConversionPattern = %-d{yyyy-MM-dd HH:mm:ss}  [ %t:%r ] - [ %p ]  %m%n

### write ERROR-and-above messages to error.log ###
log4j.appender.E = org.apache.log4j.DailyRollingFileAppender
log4j.appender.E.File =/Users/bee/Documents/elk/log4j/error.log 
log4j.appender.E.Append = true
log4j.appender.E.Threshold = ERROR 
log4j.appender.E.layout = org.apache.log4j.PatternLayout
log4j.appender.E.layout.ConversionPattern = %-d{yyyy-MM-dd HH:mm:ss}  [ %t:%r ] - [ %p ]  %m%n

# send logs to logstash over a socket (see the note below)
log4j.appender.logstash=org.apache.log4j.net.SocketAppender
log4j.appender.logstash.RemoteHost=127.0.0.1
log4j.appender.logstash.port=4560
log4j.appender.logstash.ReconnectionDelay=60000
log4j.appender.logstash.LocationInfo=true

The configuration sends every log event to four destinations:

  • the console
  • a file, for DEBUG-and-above messages
  • a file, for ERROR-and-above messages
  • logstash, via a SocketAppender

Two remarks: on the test server the two file paths were changed to /user/zhujie/Documents/elk/log4j/, matching the Filebeat configuration below; and because Logstash has deprecated its log4j input, the fourth output no longer connects to anything (you will see a broken-pipe warning in the run output later), so only the file outputs actually get collected.

Add Log4jTest.java under the java directory:

import org.apache.log4j.Logger;
/**
 * Created by bee on 17/3/6.
 */
public class Log4jTest {

    public static final Logger logger = Logger.getLogger(Log4jTest.class);

    public static void main(String[] args) {
        logger.debug("This is a debug message!");
        logger.info("This is info message!");
        logger.warn("This is a warn message!");
        logger.error("This is error message!");

        try {
            System.out.println(5 / 0);
        } catch (Exception e) {
            logger.error(e);
        }
    }
}
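
One practical note: section 5 runs the demo with java -jar log4jdemo1.jar, but the pom.xml above does not produce a runnable jar on its own. A minimal sketch using the maven-shade-plugin to bundle log4j and set the main class (the plugin version is an assumption); add it inside <project> in pom.xml:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.4</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals><goal>shade</goal></goals>
                    <configuration>
                        <transformers>
                            <!-- write Main-Class: Log4jTest into the jar manifest -->
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <mainClass>Log4jTest</mainClass>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>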

4. Installing and using Filebeat
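
Filebeat installation and basic usage were covered in a previous post; briefly, it is just a matter of downloading and unpacking the tarball. A sketch for 6.2.2 on Linux (the URL follows Elastic's artifact-repository naming; /opt/elk matches the path used below):

cd /opt/elk
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.2.2-linux-x86_64.tar.gz
tar xzvf filebeat-6.2.2-linux-x86_64.tar.gz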

5. Start the project and collect the logs

1. Configuration


As described in the previous section, after installing Filebeat some configuration is still needed to fit this project.

Edit filebeat.yml. You can check the effective (non-comment, non-blank) settings with:

grep -v "#" /opt/elk/filebeat-6.2.2-linux-x86_64/filebeat.yml | grep -v "^$"

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /user/zhujie/Documents/elk/log4j/debug.log
    - /user/zhujie/Documents/elk/log4j/error.log 
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
setup.kibana:
output.logstash:
  hosts: ["10.25.0.221:4560"]

The main things to get right are the log file paths and the Logstash host and port.
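
When it is Filebeat's turn to start (see the startup order below), a typical foreground invocation is the following; the -d "publish" debug selector is optional, but it is what produces the "Publish event" lines shown later:

cd /opt/elk/filebeat-6.2.2-linux-x86_64
./filebeat -e -c filebeat.yml -d "publish"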

2. Start the project


Mind the startup order: start Elasticsearch first; then start Logstash with the conf file prepared earlier (remember to use your own host address); then start Filebeat. Once every component is up, just run the Java code.
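
The Logstash conf file itself is not reproduced in this post; a minimal sketch consistent with the Filebeat settings above would look like the following (the Elasticsearch address and index name are assumptions):

input {
  beats {
    port => 4560                          # must match output.logstash in filebeat.yml
  }
}
output {
  elasticsearch {
    hosts => ["10.25.0.221:9200"]         # assumed Elasticsearch address
    index => "log4j-%{+YYYY.MM.dd}"       # assumed index name
  }
  stdout {
    codec => rubydebug                    # prints events like the output shown below
  }
}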

Here is what each component showed after I ran the code:
- Run the Java code

[root@hadoop01 zhujie]# java -jar log4jdemo1.jar 
[DEBUG] 2018-07-27 00:06:00,853 method:Log4jTest.main(Log4jTest.java:9)
This is a debug message!
log4j:WARN Detected problem with connection: java.net.SocketException: Broken pipe (Write failed)
[INFO ] 2018-07-27 00:06:00,928 method:Log4jTest.main(Log4jTest.java:10)
This is info message!
[WARN ] 2018-07-27 00:06:00,928 method:Log4jTest.main(Log4jTest.java:11)
This is a warn message!
[ERROR] 2018-07-27 00:06:00,928 method:Log4jTest.main(Log4jTest.java:12)
This is error message!
[ERROR] 2018-07-27 00:06:00,929 method:Log4jTest.main(Log4jTest.java:17)
java.lang.ArithmeticException: / by zero
[root@hadoop01 zhujie]#
  • Filebeat output



2018-07-27T00:25:13.356+0800    INFO    [monitoring]    log/log.go:124  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":140,"time":142},"total":{"ticks":300,"time":311,"value":300},"user":{"ticks":160,"time":169}},"info":{"ephemeral_id":"0e0a2bf4-5015-4cda-b25a-e8a0ce97a8ad","uptime":{"ms":1200124}},"memstats":{"gc_next":4194304,"memory_alloc":1532352,"memory_total":14280232}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":0}}},"registrar":{"states":{"current":9}},"system":{"load":{"1":0,"15":0.05,"5":0.01,"norm":{"1":0,"15":0.05,"5":0.01}}}}}}
2018-07-27T00:25:13.479+0800    INFO    log/harvester.go:216    Harvester started for file: /user/zhujie/Documents/elk/log4j/error.log
2018-07-27T00:25:13.480+0800    INFO    log/harvester.go:216    Harvester started for file: /user/zhujie/Documents/elk/log4j/debug.log
2018-07-27T00:25:13.480+0800    DEBUG   [publish]       pipeline/processor.go:275       Publish event: {
  "@timestamp": "2018-07-26T16:25:13.480Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.2.2"
  },
  "message": "2018-07-27 00:25:11  [ main:0 ] - [ DEBUG ]  This is a debug message!",
  "prospector": {
    "type": "log"
  },
  "beat": {
    "name": "hadoop01",
    "hostname": "hadoop01",
    "version": "6.2.2"
  },
  "source": "/user/zhujie/Documents/elk/log4j/debug.log",
  "offset": 432
}
2018-07-27T00:25:13.480+0800    DEBUG   [publish]       pipeline/processor.go:275       Publish event: {
  "@timestamp": "2018-07-26T16:25:13.480Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.2.2"
  },
  "prospector": {
    "type": "log"
  },
  "beat": {
    "name": "hadoop01",
    "hostname": "hadoop01",
    "version": "6.2.2"
  },
  "message": "2018-07-27 00:25:11  [ main:23 ] - [ INFO ]  This is info message!",
  "source": "/user/zhujie/Documents/elk/log4j/debug.log",
  "offset": 499
}
2018-07-27T00:25:13.480+0800    DEBUG   [publish]       pipeline/processor.go:275       Publish event: {
  "@timestamp": "2018-07-26T16:25:13.480Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.2.2"
  },
  "source": "/user/zhujie/Documents/elk/log4j/debug.log",
  "offset": 568,
  "message": "2018-07-27 00:25:11  [ main:24 ] - [ WARN ]  This is a warn message!",
  "prospector": {
    "type": "log"
  },
  "beat": {
    "name": "hadoop01",
    "hostname": "hadoop01",
    "version": "6.2.2"
  }
}
2018-07-27T00:25:13.480+0800    DEBUG   [publish]       pipeline/processor.go:275       Publish event: {
  "@timestamp": "2018-07-26T16:25:13.480Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.2.2"
  },
  "source": "/user/zhujie/Documents/elk/log4j/debug.log",
  "offset": 637,
  "message": "2018-07-27 00:25:11  [ main:24 ] - [ ERROR ]  This is error message!",
  "prospector": {
    "type": "log"
  },
  "beat": {
    "name": "hadoop01",
    "hostname": "hadoop01",
    "version": "6.2.2"
  }
}
2018-07-27T00:25:13.480+0800    DEBUG   [publish]       pipeline/processor.go:275       Publish event: {
  "@timestamp": "2018-07-26T16:25:13.480Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.2.2"
  },
  "source": "/user/zhujie/Documents/elk/log4j/debug.log",
  "offset": 724,
  "message": "2018-07-27 00:25:11  [ main:25 ] - [ ERROR ]  java.lang.ArithmeticException: / by zero",
  "prospector": {
    "type": "log"
  },
  "beat": {
    "name": "hadoop01",
    "hostname": "hadoop01",
    "version": "6.2.2"
  }
}
2018-07-27T00:25:13.480+0800    DEBUG   [publish]       pipeline/processor.go:275       Publish event: {
  "@timestamp": "2018-07-26T16:25:13.480Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.2.2"
  },
  "offset": 225,
  "message": "2018-07-27 00:25:11  [ main:24 ] - [ ERROR ]  This is error message!",
  "prospector": {
    "type": "log"
  },
  "beat": {
    "name": "hadoop01",
    "hostname": "hadoop01",
    "version": "6.2.2"
  },
  "source": "/user/zhujie/Documents/elk/log4j/error.log"
}
2018-07-27T00:25:13.480+0800    DEBUG   [publish]       pipeline/processor.go:275       Publish event: {
  "@timestamp": "2018-07-26T16:25:13.480Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.2.2"
  },
  "source": "/user/zhujie/Documents/elk/log4j/error.log",
  "offset": 312,
  "message": "2018-07-27 00:25:11  [ main:25 ] - [ ERROR ]  java.lang.ArithmeticException: / by zero",
  "prospector": {
    "type": "log"
  },
  "beat": {
    "name": "hadoop01",
    "hostname": "hadoop01",
    "version": "6.2.2"
  }
}
  • Events received by Logstash

{
       "message" => "2018-07-27 00:27:24 [ main:21 ] - [ ERROR ] This is error message!",
        "offset" => 537,
    "prospector" => {
        "type" => "log"
    },
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
      "@version" => "1",
          "beat" => {
         "version" => "6.2.2",
        "hostname" => "hadoop01",
            "name" => "hadoop01"
    },
          "host" => "hadoop01",
    "@timestamp" => 2018-07-26T16:27:33.501Z,
        "source" => "/user/zhujie/Documents/elk/log4j/error.log"
}
{
       "message" => "2018-07-27 00:27:24 [ main:22 ] - [ ERROR ] java.lang.ArithmeticException: / by zero",
        "offset" => 624,
    "prospector" => {
        "type" => "log"
    },
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
      "@version" => "1",
          "beat" => {
         "version" => "6.2.2",
        "hostname" => "hadoop01",
            "name" => "hadoop01"
    },
          "host" => "hadoop01",
    "@timestamp" => 2018-07-26T16:27:33.501Z,
        "source" => "/user/zhujie/Documents/elk/log4j/error.log"
}
{
       "message" => "2018-07-27 00:27:24 [ main:0 ] - [ DEBUG ] This is a debug message!",
        "offset" => 1156,
    "prospector" => {
        "type" => "log"
    },
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
      "@version" => "1",
          "beat" => {
         "version" => "6.2.2",
        "hostname" => "hadoop01",
            "name" => "hadoop01"
    },
          "host" => "hadoop01",
    "@timestamp" => 2018-07-26T16:27:33.501Z,
        "source" => "/user/zhujie/Documents/elk/log4j/debug.log"
}
{
       "message" => "2018-07-27 00:27:24 [ main:21 ] - [ INFO ] This is info message!",
        "offset" => 1223,
    "prospector" => {
        "type" => "log"
    },
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
      "@version" => "1",
          "beat" => {
         "version" => "6.2.2",
        "hostname" => "hadoop01",
            "name" => "hadoop01"
    },
          "host" => "hadoop01",
    "@timestamp" => 2018-07-26T16:27:33.501Z,
        "source" => "/user/zhujie/Documents/elk/log4j/debug.log"
}
{
       "message" => "2018-07-27 00:27:24 [ main:21 ] - [ WARN ] This is a warn message!",
        "offset" => 1292,
    "prospector" => {
        "type" => "log"
    },
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
      "@version" => "1",
          "beat" => {
         "version" => "6.2.2",
        "hostname" => "hadoop01",
            "name" => "hadoop01"
    },
          "host" => "hadoop01",
    "@timestamp" => 2018-07-26T16:27:33.501Z,
        "source" => "/user/zhujie/Documents/elk/log4j/debug.log"
}
{
       "message" => "2018-07-27 00:27:24 [ main:21 ] - [ ERROR ] This is error message!",
        "offset" => 1361,
    "prospector" => {
        "type" => "log"
    },
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
      "@version" => "1",
          "beat" => {
         "version" => "6.2.2",
        "hostname" => "hadoop01",
            "name" => "hadoop01"
    },
          "host" => "hadoop01",
    "@timestamp" => 2018-07-26T16:27:33.501Z,
        "source" => "/user/zhujie/Documents/elk/log4j/debug.log"
}
{
       "message" => "2018-07-27 00:27:24 [ main:22 ] - [ ERROR ] java.lang.ArithmeticException: / by zero",
        "offset" => 1448,
    "prospector" => {
        "type" => "log"
    },
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
      "@version" => "1",
          "beat" => {
         "version" => "6.2.2",
        "hostname" => "hadoop01",
            "name" => "hadoop01"
    },
          "host" => "hadoop01",
    "@timestamp" => 2018-07-26T16:27:33.501Z,
        "source" => "/user/zhujie/Documents/elk/log4j/debug.log"
}
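
Before moving on to Kibana, you can confirm that the events were actually indexed; a quick check against Elasticsearch (address assumed, as above):

curl "http://10.25.0.221:9200/_cat/indices?v"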
  • Run Kibana to view the final result, starting it with:

[root@hadoop02 kibana-6.2.2-linux-x86_64]# ./bin/kibana
  log   [16:30:26.398] [info][status][plugin:kibana@6.2.2] Status changed from uninitialized to green - Ready
  log   [16:30:26.592] [info][status][plugin:elasticsearch@6.2.2] Status changed from uninitialized to yellow - Waiting for Elasticsearch
  log   [16:30:26.603] [info][status][plugin:console@6.2.2] Status changed from uninitialized to green - Ready
  log   [16:30:27.814] [info][status][plugin:metrics@6.2.2] Status changed from uninitialized to green - Ready
  log   [16:30:27.844] [info][status][plugin:timelion@6.2.2] Status changed from uninitialized to green - Ready
  log   [16:30:27.907] [info][listening] Server running at http://10.25.0.221:5601
  log   [16:30:28.934] [info][status][plugin:elasticsearch@6.2.2] Status changed from yellow to green - Ready

The query page in Kibana (after creating an index pattern for the new index) looks like this: (screenshot omitted)

At this point, collecting the log4j logs is done; how to actually parse the log4j messages will have to wait for another discussion.


Reposted from blog.csdn.net/qq_34646817/article/details/81232104