ELK collects Nginx logs

This post shows how to parse Nginx access logs with Logstash grok regular expressions and write the extracted fields to Elasticsearch.


1. Nginx log format configuration

  log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '

                      '$status $body_bytes_sent "$http_referer" '

                      '"$http_user_agent" "$http_x_forwarded_for"';

2. Sample Nginx log entries

192.168.20.7 - - [18/Nov/2020:21:42:26 +0800] "GET / HTTP/1.1" 200 612 "-" "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.198 Safari/537.36" "-"

192.168.20.7 - - [18/Nov/2020:21:42:26 +0800] "GET /favicon.ico HTTP/1.1" 404 555 "http://192.168.20.41/" "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.198 Safari/537.36" "-"

3. Grok built-in pattern syntax: %{BUILTIN_PATTERN:field_name}

%{IP:remote_addr} - (%{WORD:remote_user}|-) \[%{HTTPDATE:time_local}\] "%{WORD:method} %{NOTSPACE:request} HTTP/%{NUMBER}" %{NUMBER:status} %{NUMBER:body_bytes_sent} %{QS:http_referer} %{QS:http_user_agent} %{QS:http_x_forwarded_for}
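Grok patterns compile down to ordinary regular expressions. As a sketch of what the extraction does, the snippet below parses the sample log line with a hand-written Python regex whose named groups mirror the grok field names (the regex itself is an illustration, not the pattern Logstash generates):

```python
import re

# Named groups correspond to the grok field names used above.
NGINX_RE = re.compile(
    r'(?P<remote_addr>\S+) - (?P<remote_user>\S+) '
    r'\[(?P<time_local>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<request>\S+) HTTP/(?P<http_version>[\d.]+)" '
    r'(?P<status>\d+) (?P<body_bytes_sent>\d+) '
    r'"(?P<http_referer>[^"]*)" "(?P<http_user_agent>[^"]*)" '
    r'"(?P<http_x_forwarded_for>[^"]*)"'
)

line = ('192.168.20.7 - - [18/Nov/2020:21:42:26 +0800] "GET / HTTP/1.1" 200 612 '
        '"-" "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 '
        '(KHTML, like Gecko) Chrome/86.0.4240.198 Safari/537.36" "-"')

# Extract each segment of the log line into a field dictionary.
fields = NGINX_RE.match(line).groupdict()
print(fields['remote_addr'], fields['status'], fields['request'])
```

Each quoted segment of the log line becomes a named field, which is exactly what Logstash does before handing the event to Elasticsearch.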

4. Logstash grok-parses Nginx logs and writes them to ES

1. Using Kibana's built-in Grok Debugger requires familiarity with regular expressions


2. Use the grok extraction syntax


3. Extract the nginx log segment by segment


4. Configure Logstash


5. Default Logstash configuration for collecting nginx logs


[root@master nginx]# more /etc/logstash/conf.d/logstash.conf 
input {
  file {
    path => "/var/log/nginx/access.log"
  }
}
output {
  elasticsearch {
    hosts => ["http://192.168.20.41:9200", "http://192.168.20.42:9200"]
    user => "elastic"
    password => "hahashen"
    index => "sjgnginx-%{+YYYY.MM.dd}"
  }
}
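The default configuration above ships each log line to Elasticsearch unparsed, as a single message field. A minimal sketch of a filter block that applies a grok pattern matching the log format from section 1 (field names follow section 3; this fragment has not been validated against a specific Logstash version):

```
filter {
  grok {
    match => { "message" => '%{IP:remote_addr} - (%{WORD:remote_user}|-) \[%{HTTPDATE:time_local}\] "%{WORD:method} %{NOTSPACE:request} HTTP/%{NUMBER}" %{NUMBER:status} %{NUMBER:body_bytes_sent} %{QS:http_referer} %{QS:http_user_agent} %{QS:http_x_forwarded_for}' }
  }
}
```

Placed between the input and output blocks, this splits each event into named fields before it is indexed, so the fields can be searched and aggregated individually in Kibana.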


6. Create an index pattern in Kibana

Enter the index pattern, then click "Next step".

7. View the nginx logs in Kibana


8. Finally, the nginx log is split into fields by the grok pattern









Origin blog.51cto.com/15127516/2658346