ELK Learning Notes (Part 1) [Original]

Recently I was choosing a solution for our logging system and settled on ELK for application log monitoring.
ELK is a collection of three products from the Elastic company: Elasticsearch (ES), Logstash, and Kibana. The current ELK version is 5.0.0, which requires JDK 1.8.
Logstash works in three stages: an input collects the data, an optional filter processes it in the middle, and an output ships it out.
Inputs can read from the console (stdin), from log files, or by listening on a port.
Outputs can write to the console (stdout), Redis, Kafka, Elasticsearch (ES), and more.
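As a rough illustration of the three stages, here is a minimal, hypothetical pipeline (the file path and the local Elasticsearch address are assumptions for illustration only):

     input {
        file { path => "/var/tmp/app.log" }              # collect: read a log file
     }
     filter {
        mutate { add_field => { "env" => "dev" } }       # process: any transformation goes here
     }
     output {
        elasticsearch { hosts => ["127.0.0.1:9200"] }    # ship: index into a local ES instance
        stdout { codec => rubydebug }                    # also print events to the console
     }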
Below are my Logstash study notes:
1. Download logstash
2. Download JDK 1.8
3. Install
 
     rpm -ivh jdk-8u111-linux-x64.rpm
     rpm -ivh logstash-5.0.0.rpm
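To confirm that both packages are installed, you can check them afterwards (an optional sanity check; the exact output depends on your packages):

     rpm -qa | grep -E 'jdk|logstash'
     java -version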
  

4. Modify the JDK environment variables
  vi /usr/share/logstash/bin/logstash
 
     # Add at the top of the file: JDK environment variables
     export JAVA_HOME=/usr/java/jdk1.8.0_111
     export PATH=$JAVA_HOME/bin:$PATH
     export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
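You can confirm that the JDK path used for JAVA_HOME is valid (an optional check; adjust the path if your JDK version differs):

     /usr/java/jdk1.8.0_111/bin/java -version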
  

5. Add a logstash configuration file
  vi /usr/share/logstash/bin/logstash.conf
 
     input{
        stdin{}
     }
     output{
        stdout{codec=>rubydebug}
     }
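Before starting, Logstash can validate the configuration and exit without running the pipeline (the --config.test_and_exit flag, short form -t):

   ./logstash -f logstash.conf --config.test_and_exit --path.settings=/etc/logstash/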
  

6. Start logstash (run it from /usr/share/logstash/bin, where logstash.conf was created)
  
   ./logstash -f logstash.conf --path.settings=/etc/logstash/
   

7. Type aabc on the console; Logstash echoes the parsed event back to the console.
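Roughly, the rubydebug output for that input should look like the following (host and @timestamp are placeholders and will differ on your machine):

   {
          "message" => "aabc",
         "@version" => "1",
       "@timestamp" => "2016-11-20T08:00:00.000Z",
             "host" => "localhost"
   }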
  
8. Use the file input to import log files, parse them, and print the parsed fields.
input {
	file {
		path => ["/var/tmp/*.log", "/var/tmp/message"]
		start_position => "beginning"
		# lines that do NOT start with "[" are appended to the previous line,
		# so a multi-line entry (e.g. a stack trace) becomes a single event
		codec => multiline {
			pattern => "^\["
			negate => true
			what => "previous"
		}
	}
}

filter {
	# 1) strip the brackets around the project name, then split the line
	#    on " - " into a header part (tmp1) and the message content
	mutate {
		gsub => [
			"message", "\[", " ",
			"message", "]:", " "
		]
		split => ["message", " - "]
		add_field => {
			"tmp1"    => "%{[message][0]}"
			"content" => "%{[message][1]}"
		}
		remove_field => ["message", "@timestamp", "@version"]
	}

	# 2) split the header on whitespace: project, level, date, time, class:method
	mutate {
		split => ["tmp1", " "]
		add_field => {
			"project"   => "%{[tmp1][0]}"
			"level"     => "%{[tmp1][1]}"
			"timestamp" => "%{[tmp1][2]}T%{[tmp1][3]}Z"
			"tmp2"      => "%{[tmp1][4]}"
		}
		remove_field => ["tmp1"]
	}

	# 3) split "class:method" into the class and method fields
	mutate {
		split => ["tmp2", ":"]
		add_field => {
			"class"  => "%{[tmp2][0]}"
			"method" => "%{[tmp2][1]}"
		}
		remove_field => ["tmp2"]
	}
}

output {
	stdout { codec => rubydebug }
}

Log file:
[DDC-SUBSCRIBE]: WARN  2016-07-06 13:50:32,162 PersonalGoodsDeclareMessageListener:consumeMessage - 接收到消息:key=10059835419
[DDC-SUBSCRIBE]: WARN  2016-07-06 13:50:32,166 PersonalGoodsDeclareMessageListener:consumeMessage - 接收到消息:key=10059755919
[DDC-SUBSCRIBE]: WARN  2016-07-06 13:50:32,168 PersonalGoodsDeclareMessageListener:consumeMessage - 接收到消息:key=10059842019
[DDC-SUBSCRIBE]: WARN  2016-07-06 13:50:32,169 PersonalGoodsDeclareMessageListener:consumeMessage - 接收到消息:key=10060209919
[DDC-SUBSCRIBE]: WARN  2016-07-06 13:50:32,169 PersonalGoodsDeclareMessageListener:consumeMessage - 接收到消息:key=10059764469
[DDC-SUBSCRIBE]: WARN  2016-07-06 13:50:32,169 PersonalGoodsDeclareMessageListener:consumeMessage - 接收到消息:key=10059743019
[DDC-SUBSCRIBE]: WARN  2016-07-06 13:50:32,309 PersonalGoodsDeclareMessageListener:consumeMessage - 接收到消息:key=10059964669
[DDC-SUBSCRIBE]: WARN  2016-07-06 13:50:32,318 PersonalGoodsDeclareMessageListener:consumeMessage - 接收到消息:key=10060158219
[DDC-SUBSCRIBE]: ERROR 2016-07-06 13:50:32,616 DefaultDaoImpl:create - 数据插入出错.Table:JKF_P_GOODS_DECLAR, key:10059756769
[DDC-SUBSCRIBE]: ERROR 2016-07-06 13:50:32,619 DefaultDaoImpl:create - 数据插入出错.Table:JKF_P_GOODS_DECLAR, key:10060229469
[DDC-SUBSCRIBE]: ERROR 2016-07-06 13:50:32,616 DefaultDaoImpl:create - 数据插入出错.Table:JKF_P_GOODS_DECLAR, key:10059743019

Output:
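As a rough, assumed sketch, the parsed event for the first WARN line above should carry fields along these lines (field order, plus extra fields such as path, host, and tags, depend on the environment):

   {
          "path" => "/var/tmp/message",
          "host" => "localhost",
       "project" => "DDC-SUBSCRIBE",
         "level" => "WARN",
     "timestamp" => "2016-07-06T13:50:32,162Z",
         "class" => "PersonalGoodsDeclareMessageListener",
        "method" => "consumeMessage",
       "content" => "接收到消息:key=10059835419"
   }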



Problem 1:
When installing logstash, an error is reported:
/usr/share/logstash/vendor/jruby/bin/jruby: line 388: /usr/bin/java: No such file or directory

Solution:
This likely happens because the JDK was unpacked from a tarball rather than installed from the rpm package, so /usr/bin/java does not exist. A symbolic link fixes it:
ln -s /usr/local/jdk1.8.0_121/bin/java /usr/bin/java
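After creating the link, logstash should be able to find java again; a quick optional check:

/usr/bin/java -version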

Reposted from zhenggm.iteye.com/blog/2336814