Collecting nginx Logs with Logstash: Parsing Log Lines with the grok Filter Plugin

grok is a Logstash filter plugin that parses text log lines against patterns and splits them into named fields.
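
The basic building block is written as %{SYNTAX:SEMANTIC}, where SYNTAX names a predefined (or custom) regular expression and SEMANTIC is the field name the match is stored in. As a minimal illustration (the log line below is made up, not taken from this setup), the line

55.3.244.1 GET /index.html

can be parsed with the pattern

%{IP:client} %{WORD:method} %{URIPATHPARAM:request}

which yields the fields client, method, and request.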

  • The nginx log format configuration:
log_format  main  '$remote_addr - $remote_user [$time_local] "$request" $status $body_bytes_sent "$http_referer" "$http_user_agent" "$http_x_forwarded_for"';
  • The grok patterns for Logstash, added to the /usr/local/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns/grok-patterns file (a sample log line and the fields it yields are sketched after this list):

WZ ([^ ]*)
NGINXACCESS %{IP:remote_ip} - %{WZ:remote_user} \[%{HTTPDATE:timestamp}\] "%{WORD:method} %{WZ:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:status} %{NUMBER:bytes} %{QS:referer} %{QS:agent} %{QS:xforward}
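
As a rough sketch (the client IP, path, and user agent below are made up for illustration), a line written by the log_format above might look like

192.168.1.100 - - [05/Jun/2018:10:24:00 +0800] "GET /index.html HTTP/1.1" 200 612 "-" "Mozilla/5.0" "-"

and the NGINXACCESS pattern would split it into the fields remote_ip, remote_user, timestamp, method, request, httpversion, status, bytes, referer, agent, and xforward. If you would rather not edit the bundled grok-patterns file, the grok filter also accepts a patterns_dir option pointing at a directory of custom pattern files, for example patterns_dir => ["/usr/local/logstash-6.2.4/patterns"] (a hypothetical path).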

The Logstash configuration:

input {
  file {
    path => "/usr/local/nginx/logs/access.log"
    type => "nginx"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{NGINXACCESS}" }
  }
}

output {
  if [type] == "nginx" {
    elasticsearch {
      hosts => ["172.17.102.202:9200"]
      index => "nginx"
    }
  }
}
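
Assuming the configuration above is saved as /usr/local/logstash-6.2.4/config/nginx.conf (a hypothetical path), a syntax check and a foreground run would look roughly like this:

/usr/local/logstash-6.2.4/bin/logstash -f /usr/local/logstash-6.2.4/config/nginx.conf --config.test_and_exit
/usr/local/logstash-6.2.4/bin/logstash -f /usr/local/logstash-6.2.4/config/nginx.conf

Once events start flowing, the parsed documents should appear in Elasticsearch under the nginx index, e.g. via curl 'http://172.17.102.202:9200/nginx/_search?pretty'.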

Reposted from www.cnblogs.com/wjoyxt/p/9172370.html