ELK study notes (1)

Recently I was evaluating log systems, and in the end I chose ELK for application log monitoring.
ELK is a suite of three products from Elastic, referring to Elasticsearch (ES), Logstash, and Kibana. The current ELK version is 5.0.0, which requires JDK 1.8.
Logstash collects data through inputs, processes it with filters in the middle, and then sends it to outputs.
An input can be console (stdin) input, log file import, or listening on a port.
An output can go to the console, redis, kafka, ES, and so on.
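For example, a minimal sketch of an output to ES might look like this (the host and index name here are placeholders, not from my setup):

     output{
        elasticsearch{
            # hosts and index are placeholders for your environment
            hosts => ["localhost:9200"]
            index => "app-log-%{+YYYY.MM.dd}"
        }
     }
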
The following are my study notes on Logstash:
1. Download logstash
2. Download jdk1.8
3. Install
 
     rpm -ivh jdk-8u111-linux-x64.rpm
     rpm -ivh logstash-5.0.0.rpm
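
     # confirm both packages installed correctly (version output will vary)
     rpm -q logstash
     java -version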
  

4. Set the JDK environment variables
  vi /usr/share/logstash/bin/logstash
 
     # Add the JDK environment variables at the top of the file
     export JAVA_HOME=/usr/java/jdk1.8.0_111
     export PATH=$JAVA_HOME/bin:$PATH
     export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
  

5. Add a logstash configuration file
  vi /usr/share/logstash/bin/logstash.conf
 
     input{
        stdin{}
     }
     output{
        stdout{codec=>rubydebug}
     }
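
  A pipeline this small can also be passed directly on the command line with the -e flag, instead of a config file:

     ./logstash -e 'input{stdin{}} output{stdout{codec=>rubydebug}}' --path.settings=/etc/logstash/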
  

6. Start logstash
  
   ./logstash -f logstash.conf --path.settings=/etc/logstash/
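
   # optionally check the config file syntax first with -t (--config.test_and_exit)
   ./logstash -f logstash.conf -t --path.settings=/etc/logstash/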
   

7. Type aabc in the console, and Logstash echoes it back as a structured event
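
  The rubydebug codec pretty-prints the event as a hash, roughly like this (the timestamp and host values are illustrative):

     {
         "@timestamp" => 2016-11-15T08:23:53.911Z,
           "@version" => "1",
               "host" => "localhost.localdomain",
            "message" => "aabc"
     }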
  
8. Use the file input to import log files and parse them into fields
input {
    file {
        # files to watch; read from the beginning of each file
        path => ["/var/tmp/*.log", "/var/tmp/message"]
        start_position => "beginning"
        # any line that does NOT start with "[" is appended to the
        # previous line, so multi-line entries become a single event
        codec => multiline {
            pattern => "^\["
            negate => true
            what => "previous"
        }
    }
}

filter {
    # Note: mutate applies its operations in a fixed internal order
    # (gsub runs before split), and add_field/remove_field are applied
    # last, so the new fields pick up the cleaned, split values.
    mutate {
        # split "header - content" and strip the [ ]: decoration
        split => ["message", " - "]
        add_field => {
            "tmp1"    => "%{[message][0]}"
            "content" => "%{[message][1]}"
        }
        gsub => [
            "message", "\[", " ",
            "message", "]:", " "
        ]
        remove_field => [ "message", "@timestamp", "@version" ]
    }

    mutate {
        # split the header on whitespace:
        # project, level, date, time, class:method
        split => ["tmp1", " "]
        add_field => {
            "project"   => "%{[tmp1][0]}"
            "level"     => "%{[tmp1][1]}"
            "timestamp" => "%{[tmp1][2]}T%{[tmp1][3]}Z"
            "tmp2"      => "%{[tmp1][4]}"
        }
        remove_field => [ "tmp1" ]
    }

    mutate {
        # split "Class:method" into separate fields
        split => ["tmp2", ":"]
        add_field => {
            "class"  => "%{[tmp2][0]}"
            "method" => "%{[tmp2][1]}"
        }
        remove_field => [ "tmp2" ]
    }
}

output {
    stdout { codec => rubydebug }
}
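
A note on the multiline codec in this input: any line that does not start with "[" is appended to the previous event, which keeps multi-line entries such as stack traces together. For example, a hypothetical pair of lines like

    [DDC-SUBSCRIBE]: ERROR 2016-07-06 13:50:32,616 DefaultDaoImpl:create - data insertion error
        at com.example.DefaultDaoImpl.create(DefaultDaoImpl.java:42)

would be merged into one event (tagged "multiline") before the filters run.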

Sample log file:
[DDC-SUBSCRIBE]: WARN 2016-07-06 13:50:32,162 PersonalGoodsDeclareMessageListener:consumeMessage - message received:key=10059835419
[DDC-SUBSCRIBE]: WARN 2016-07-06 13:50:32,166 PersonalGoodsDeclareMessageListener:consumeMessage - message received:key=10059755919
[DDC-SUBSCRIBE]: WARN 2016-07-06 13:50:32,168 PersonalGoodsDeclareMessageListener:consumeMessage - message received:key=10059842019
[DDC-SUBSCRIBE]: WARN 2016-07-06 13:50:32,169 PersonalGoodsDeclareMessageListener:consumeMessage - message received:key=10060209919
[DDC-SUBSCRIBE]: WARN 2016-07-06 13:50:32,169 PersonalGoodsDeclareMessageListener:consumeMessage - message received:key=10059764469
[DDC-SUBSCRIBE]: WARN 2016-07-06 13:50:32,169 PersonalGoodsDeclareMessageListener:consumeMessage - message received:key=10059743019
[DDC-SUBSCRIBE]: WARN 2016-07-06 13:50:32,309 PersonalGoodsDeclareMessageListener:consumeMessage - message received:key=10059964669
[DDC-SUBSCRIBE]: WARN 2016-07-06 13:50:32,318 PersonalGoodsDeclareMessageListener:consumeMessage - message received:key=10060158219
[DDC-SUBSCRIBE]: ERROR 2016-07-06 13:50:32,616 DefaultDaoImpl:create - data insertion error.Table:JKF_P_GOODS_DECLAR, key:10059756769
[DDC-SUBSCRIBE]: ERROR 2016-07-06 13:50:32,619 DefaultDaoImpl:create - data insertion error.Table:JKF_P_GOODS_DECLAR, key:10060229469
[DDC-SUBSCRIBE]: ERROR 2016-07-06 13:50:32,616 DefaultDaoImpl:create - data insertion error.Table:JKF_P_GOODS_DECLAR, key:10059743019

Output content:
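
For the first log line above, the parsed event looks roughly like this (the host and path values depend on the environment):

     {
            "path" => "/var/tmp/message",
            "host" => "localhost.localdomain",
         "content" => "message received:key=10059835419",
         "project" => "DDC-SUBSCRIBE",
           "level" => "WARN",
       "timestamp" => "2016-07-06T13:50:32,162Z",
           "class" => "PersonalGoodsDeclareMessageListener",
          "method" => "consumeMessage"
     }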



Question 1:
When installing logstash, an error is reported:
/usr/share/logstash/vendor/jruby/bin/jruby: line 388: /usr/bin/java: No such file or directory

Solution:
The likely cause is that the JDK was installed by unpacking a tarball rather than from the rpm package, so /usr/bin/java does not exist. This can be fixed with a symlink:
ln -s /usr/local/jdk1.8.0_121/bin/java /usr/bin/java
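
After creating the symlink, confirm that logstash can now find java:

ls -l /usr/bin/java
java -version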
