EFK in Practice, Part 2: Log Integration


Foreword

In an EFK setup we deploy a Filebeat client on the application server. Filebeat collects the logs and forwards them to Logstash, Logstash parses the logs and ships them to Elasticsearch, and finally we view the logs in Kibana.

The previous article, EFK in Practice, Part 1: Building the Base Environment, set up a working EFK environment. In this article we connect the data flow between the three components through a real case and work through some common problems that come up along the way.

First, let's look at the actual business log:

2020-01-09 10:03:26,719 INFO ========GetCostCenter Start===============
2020-01-09 10:03:44,267 WARN 成本中心编码少于10位!{"deptId":"D000004345","companyCode":"01"}
2020-01-09 10:22:37,193 ERROR java.lang.IllegalStateException: SessionImpl[abcpI7fK-WYnW4nzXrv7w,]: can't call getAttribute() when session is no longer valid.
	at com.caucho.server.session.SessionImpl.getAttribute(SessionImpl.java:283)
	at weaver.filter.PFixFilter.doFilter(PFixFilter.java:73)
	at com.caucho.server.dispatch.FilterFilterChain.doFilter(FilterFilterChain.java:87)
	at weaver.filter.MonitorXFixIPFilter.doFilter(MonitorXFixIPFilter.java:30)
	at weaver.filter.MonitorForbiddenUrlFilter.doFilter(MonitorForbiddenUrlFilter.java:133)

The log format consists of three parts: timestamp, log level, and log details.
Our main task is to get these logs into EFK in that structure.

Filebeat installation and configuration

  • Download Filebeat 7.5.1
  • Upload the downloaded archive to the server and extract it:
    tar -zxvf filebeat-7.5.1-linux-x86_64.tar.gz
  • Edit filebeat.yml:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /app/weaver/Resin/log/xxx.log

This section configures the log input and specifies the path of the log files to collect.

output.logstash:
  # The Logstash hosts
  hosts: ["172.31.0.207:5044"]

This section configures the log output and points Filebeat at the Logstash address. (A combined filebeat.yml sketch follows after this list.)

  • Start Filebeat:
    ./filebeat -e -c filebeat.yml
    To keep it running in the background, start it with nohup ./filebeat -e -c filebeat.yml & instead.
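
Putting the two sections together, a minimal filebeat.yml for this setup looks roughly like this (only the settings discussed above; the real file ships with many more commented-out defaults):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /app/weaver/Resin/log/xxx.log

output.logstash:
  # the Logstash host configured in the next section
  hosts: ["172.31.0.207:5044"]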

Logstash configuration

The Logstash configuration is divided into three sections: input, filter, and output.
input specifies the input; here it opens a port to receive logs from Filebeat.
filter specifies how to filter and parse the log content.
output specifies the output, and can point directly at the Elasticsearch address.

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://172.31.0.127:9200"]
    index => "myindex-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "xxxxxx"
  }
}

After updating the Logstash configuration, restart Logstash:
docker-compose -f elk.yml restart logstash

With the two configurations above in place, the application writes its logs to the log file and Filebeat ships them to Logstash. Viewing the result in Kibana:
(screenshot: raw log entries in Kibana)

The logs reveal two problems:

  • The multi-line stack trace of an error shows up in Kibana as several separate entries, which makes the data messy to read. The exception stack needs to be merged into a single log entry.
  • The logs need to be parsed and split into the "time / log level / log details" format for display.

Improving the configuration

  • Merge multiple lines in Filebeat
    By default Filebeat ships one event per line, but a single log entry can clearly span multiple lines, so we need to merge those lines into one event. To do that we must find a pattern in the logs: here, every log entry starts with a timestamp. So we add the following multiline settings to the log input in filebeat.inputs (the combined input section is shown at the end of this section):
  # a new entry starts with a date
  multiline.pattern: ^\d{4}-\d{1,2}-\d{1,2}
  # enable multi-line merging
  multiline.negate: true
  # append non-matching lines to the previous entry
  multiline.match: after
  • Parse the logs in Logstash
    To split each log into the "time / log level / log details" format, we add a filter section to the Logstash configuration:
filter {
	grok{
		match => {
			"message" => "(?<date>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}),\d{3} %{LOGLEVEL:loglevel} (?<des>.*)"
		}
	}
}

This uses grok syntax to parse the log: the regular expression in the match clause extracts the fields from each log line. The expression can be debugged with the Grok Debugger tool in Kibana.
(screenshot: Grok Debugger in Kibana)
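
Applied to the WARN line from the sample log above, the pattern yields roughly the following fields (the milliseconds after the comma are matched but not captured):

message  => 2020-01-09 10:03:44,267 WARN 成本中心编码少于10位!{"deptId":"D000004345","companyCode":"01"}
date     => 2020-01-09 10:03:44
loglevel => WARN
des      => 成本中心编码少于10位!{"deptId":"D000004345","companyCode":"01"}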

Once the configuration is in place, reopen the Kibana Discover page to view the logs: exactly what we expected. Perfect!
(screenshot: parsed logs in Kibana Discover)
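
For reference, after these changes the Filebeat input section from earlier looks roughly like this, with the multiline settings sitting under the same log input:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /app/weaver/Resin/log/xxx.log
  multiline.pattern: ^\d{4}-\d{1,2}-\d{1,2}
  multiline.negate: true
  multiline.match: after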

Common problems

Garbled characters in Kibana

This is usually caused by the encoding of the client log file. You can check the encoding with the file command on the server; if it reports something like ISO-8859, the text will almost certainly be garbled in Kibana. The fix is to declare the file's encoding in the Filebeat input so the logs are decoded correctly before shipping.
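A quick check might look like this (the exact output wording varies by system; the result shown is only an illustration):

file xxx.log
xxx.log: ISO-8859 text, with very long lines

For the Chinese logs in this example the source encoding is GB2312, so we add it to the input section: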

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /app/weaver/Resin/log/xxx.log
  encoding: GB2312

Kibana field extraction error

(screenshot: error shown when opening the Kibana Discover panel)

As shown above, this error appears when opening the Kibana Discover panel. The fix is simply to delete the .kibana_1 index in Elasticsearch and then reload Kibana.
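
A minimal sketch of the delete call, assuming the Elasticsearch address and credentials used in the Logstash output above (adjust them to your environment; note that this also removes saved Kibana objects such as index patterns):

curl -X DELETE "http://172.31.0.127:9200/.kibana_1" -u elastic:xxxxxx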

Viewing surrounding documents

When searching a log for a keyword in the terminal, we usually also look at the surrounding context to troubleshoot, for example with cat xxx.log | grep -C50 keyword. So how do we get the same functionality in Kibana?

Search for the keyword in Kibana, locate the specific log entry, click the arrow on its left to expand it, and then click "View surrounding documents".

Dynamic index

The log platform may need to integrate multiple business systems, in which case each system needs its own index.

  • Tag the logs with a marker field in Filebeat (a sketch for adding more systems follows after this list)
- type: log
  ......
  fields:
    logType: oabusiness
  • Generate the index name from that marker in Logstash
input {
  beats {
    port => 5044
  }
}
filter {
  if [fields][logType] == "oabusiness" {
		grok{
			match => {
				"message" => "(?<date>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}),\d{3} %{LOGLEVEL:loglevel} (?<des>.*)"
			}
		}
	}
}
output {
	elasticsearch {
		hosts => ["http://172.31.0.207:9200"]
		index => "%{[fields][logType]}-%{+YYYY.MM.dd}"
		user => "elastic"
		password => "elastic"
	}
}
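
To plug in a second business system, the pattern is the same: give its Filebeat input its own logType value and its logs will land in their own index. A minimal sketch (the crmbusiness marker and its log path are hypothetical, for illustration only):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /app/weaver/Resin/log/xxx.log
  fields:
    logType: oabusiness
- type: log
  enabled: true
  paths:
    - /app/crm/logs/xxx.log      # hypothetical path for the second system
  fields:
    logType: crmbusiness         # hypothetical marker; logs go to index crmbusiness-yyyy.MM.dd

If the second system's log format differs, add another branch to the Logstash filter keyed on its logType.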

Well folks, that's all for this issue. If you've read this far you are a top student, and the next promotion and raise are yours!
If this article helped you, please scan the QR code below to follow, then share it and give it a "like" to build a good habit! See you next time!

(image: QR code)



Origin blog.csdn.net/jianzhang11/article/details/103971826