log analysis system

Reference blogs: http://udn.yyuap.com/thread-54591-1-1.html ; https://www.cnblogs.com/yanbinliu/p/6208626.html ; http://blog.csdn.net/wyqlxy/article/details/52622867

In Internet projects, good log monitoring and analysis help ensure the stable operation of the business. In general, however, logs are scattered across the production servers, and developers cannot log in to those servers. In this situation a centralized log collection system is required: it monitors keywords in the logs and raises an alarm when an exception occurs, helping developers find and view the relevant logs.


ELK is a system that implements this function; the name is short for Elasticsearch, Logstash and Kibana. Recently I planned to use it to manage the various logs generated by our data platform, so here I record the steps to set it up in a test environment and the problems encountered.

The general framework: the log data flow is as follows. The application writes its logs to local files; the FileBeat agent deployed on each server collects the logs and sends them to Logstash; Logstash processes the logs and passes the resulting JSON objects to Elasticsearch, which stores and indexes them; finally, Kibana provides a web interface for viewing and searching the logs. Because Elasticsearch is based on Lucene, Kibana supports the Lucene query syntax.

[Architecture diagram: FileBeat -> Logstash -> Elasticsearch -> Kibana]

If the volume of log data is particularly large, Logstash can become a bottleneck. In that case a message queue can be used as a buffer. Moreover, once a log event has entered Logstash it can no longer be read by other stream-processing programs, so Kafka is a better choice here: Kafka persists messages locally, and a stream-processing application can read messages starting from any offset. The data flow after adding Kafka is as follows:

[Architecture diagram: FileBeat -> Kafka -> Logstash -> Elasticsearch -> Kibana]
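As a sketch of the Kafka-buffered variant, Logstash can consume from Kafka via its kafka input plugin. The Zookeeper address and topic name below are assumptions for illustration (Logstash 2.x's kafka input connects through Zookeeper):

```
input {
  kafka {
    # assumption: Zookeeper runs locally on the default port
    zk_connect => "localhost:2181"
    # assumption: the topic the log shippers publish to
    topic_id   => "app-logs"
  }
}
```

The filter and output sections stay the same as in the direct FileBeat-to-Logstash setup; only the input changes.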

Installation process:

1. Software version

Logstash version: 1.5.4 (abandoned because it does not support the beats input plugin), 2.2.1 (the version currently used)

elasticsearch version: 1.7.1

Kibana version: 4.1.1

filebeat version: 5.5.1

java version: jdk-8u152

2. Installation steps

Step 1. Download the Java component and install it

Download the latest version of the JDK, unzip it and run the installer. Note: be sure to use Java 8 or above, otherwise ELK will not work properly.

Step 2. Add the JAVA_HOME environment variable

Right-click "This PC" -> Properties -> Advanced system settings -> Environment Variables, create a new system variable named JAVA_HOME with the value C:\Java\jdk1.8.0_152, as shown in the following figure.

Step 3. Install and configure nginx and configure reverse proxy for kibana

First download nginx from http://nginx.org/download/nginx-1.9.4.zip. Unzip nginx-1.9.4.zip to f:\elk and rename the directory to nginx. Edit the file f:\elk\nginx\conf\nginx.conf and add the following server block:

server {
    listen 80;
    server_name localhost;
    location / {
        proxy_set_header Host $host;
        proxy_pass http://localhost:5601;
    }
}

This works around the problem that port 5601 is blocked by the firewall, which prevents external users from accessing Kibana.

Step 4. Install elasticsearch

Unzip the elasticsearch-1.7.1.zip archive to F:\elk\elasticsearch

Open a command prompt and enter the following commands:

pushd f:\elk\elasticsearch\bin
service install

This produces output confirming that the Windows service was installed.

Then run "service manager" and the service manager window appears.

Change the "Startup type" from Manual to Automatic, then click "Start"; elasticsearch starts running immediately. Enter http://localhost:9200 in the browser; if a JSON response appears, the elasticsearch installation succeeded.
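For reference, a healthy Elasticsearch 1.x node answers the root URL with a small JSON document along these lines (the name field is randomly generated on each start, and the exact fields in the version block will differ):

```
{
  "status" : 200,
  "name" : "Some-Node-Name",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "1.7.1",
    "lucene_version" : "4.10.4"
  },
  "tagline" : "You Know, for Search"
}
```

If the browser shows a connection error instead, check that the service is running in the service manager.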

Step 5. Install the head plugin

Unzip the contents of elasticsearch-head-master.zip into the elasticsearch\plugins folder, rename elasticsearch-head-master to head, then enter http://localhost:9200/_plugin/head/ in the browser to view the data in Elasticsearch.

Step 6. Install logstash

Unzip the logstash-2.2.1.zip archive to F:\elk\logstash

Create a new logstash.conf in f:\elk\logstash\bin with the following content (the grok capture names such as timestamp, thread, level, class, method and message can be chosen freely; they become field names in the indexed event):

input {
  beats {
    port => "5544"
    codec => json {
      charset => "UTF-8"
    }
  }
}
filter {
  if [type] == "info" {
    grok {
      match => { "message" => "(?<timestamp>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2},\d{3})\s+\[(?<thread>.*?)\]\s+(?<level>\w+)\s+(?<class>\S+)\s+\[(?<method>\S+)\]\s+: (?<message>.*)\s*" }
      overwrite => ["message"]
    }
  }
  if [type] == "error" {
    grok {
      match => { "message" => "(?<timestamp>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2},\d{3})\s+\[(?<thread>.*?)\]\s+(?<level>\w+)\s+(?<class>\S+)\s+\[(?<method>\S+)\]\s+- (?<message>.*)\s*" }
      overwrite => ["message"]
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test-%{+YYYY-MM}"
  }
  stdout { codec => rubydebug }
}
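To see what the grok pattern for "info" events extracts, here is a small Python sketch of the same regex applied to one of the sample log lines from the test data below. Python uses (?P&lt;name&gt;...) where grok's Oniguruma syntax uses (?&lt;name&gt;...), and the field names are the illustrative ones used in the config above:

```python
import re

# Python equivalent of the grok pattern for type == "info".
# Field names (timestamp, thread, level, class, method, message) are
# the illustrative names chosen in logstash.conf, not mandated by grok.
INFO_PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2},\d{3})\s+"
    r"\[(?P<thread>.*?)\]\s+"
    r"(?P<level>\w+)\s+"
    r"(?P<cls>\S+)\s+"          # "class" is a Python keyword, so "cls" here
    r"\[(?P<method>\S+)\]\s+: (?P<message>.*)"
)

# Sample line taken from the attached test data
line = ("2017-12-07 13:00:24,330 [service_FlightInfoDeptDateCalculater0] "
        "INFO DispatchAssist.ACARSMonitorNew.FlightInfoDeptDateCalculater "
        "[Save] : A total of 22 pieces of data need to be updated")

m = INFO_PATTERN.match(line)
print(m.group("level"))   # INFO
print(m.group("method"))  # Save
```

The "error" pattern is identical except that the message is separated by "- " instead of ": ".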

Step 7. Create a Logstash startup batch file. Create a new run.bat in the folder f:\elk\logstash\bin with the following content:

logstash.bat agent -f logstash.conf

The purpose of this batch file is to work around a problem on Windows where logstash hangs after running for a while.

Step 8. Install Logstash as a Windows service. First download nssm from https://nssm.cc/release/nssm-2.24.zip. Unzip nssm-2.24.zip, copy nssm-2.24\win64\nssm.exe from the unzipped directory to f:\elk\logstash\bin, then run pushd f:\elk\logstash\bin at the command line and execute nssm install logstash. The nssm installation dialog appears.

Please fill in the following information: Path: f:\elk\logstash\bin\run.bat Startup directory: f:\elk\logstash\bin The interface is as follows:

Click the "Details" tab and fill in the following: Display name: logstash Startup type: Automatic The interface is as follows:

Next, click the "Dependencies" tab and enter the following under "This service depends on the following system components": elasticsearch-service-x86 The interface is as follows:

The reason for adding the dependency is that the output of logstash is configured to use Elasticsearch; if Elasticsearch is not running, logstash cannot work properly. Finally, click the "Install service" button. When the confirmation dialog appears, the service has been installed successfully.
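The same settings can also be made from the command line instead of the GUI. A sketch using nssm's set command, with the paths, display name and dependency as configured above (run from f:\elk\logstash\bin; Windows only, so treat this as illustrative):

```
nssm install logstash f:\elk\logstash\bin\run.bat
nssm set logstash AppDirectory f:\elk\logstash\bin
nssm set logstash DisplayName logstash
nssm set logstash Start SERVICE_AUTO_START
nssm set logstash DependOnService elasticsearch-service-x86
```

This is convenient when the setup needs to be scripted for several machines.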

Step 9. Install Kibana as a Windows service. Copy the nssm downloaded in Step 8 to the folder f:\elk\kibana\bin. Then run pushd f:\elk\kibana\bin at the command line and execute nssm install kibana. When the installation dialog appears, fill in the following information: Path: f:\elk\kibana\bin\kibana.bat Startup directory: f:\elk\kibana\bin

The interface is as follows:

Similar to step 8, click the "Details" tab, and fill in the following content: Display name: kibana Startup type: Automatic The interface is as follows:

Next, click the "Dependencies" tab and enter the following under "This service depends on the following system components": elasticsearch-service-x86 logstash The interface is as follows:

Finally, click the install service button to execute the installation process. The following interface appears, indicating that the service has been installed successfully.

You can modify the running port of kibana in "f:\elk\kibana\config\kibana.yml".
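As a sketch of the relevant settings (assuming Kibana 4.1's flat key names; the host value is an example for listening on all interfaces):

```
# f:\elk\kibana\config\kibana.yml
port: 5601
host: "0.0.0.0"
elasticsearch_url: "http://localhost:9200"
```

If you change the port here, remember to update the proxy_pass target in the nginx configuration from Step 3 to match.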

Step 10. Install FileBeat

(1), installation

1. Unzip the filebeat-5.5.1-windows-x86_64.zip archive (you can also download it yourself) to the C:\filebeat folder

2. Run PowerShell as administrator (do not use cmd.exe here) and enter the following commands in the console to install:

cd C:\filebeat

.\install-service-filebeat.ps1

3. If an execution policy error occurs during installation, the execution policy needs to be changed with the following command:

Set-ExecutionPolicy RemoteSigned

Enter Y to confirm, then re-run the installation command:

.\install-service-filebeat.ps1

At this point the installation has succeeded; you can see that the filebeat service exists but is stopped.

4. In the Start menu, choose Run and enter services.msc to open the local services panel, find the filebeat service, and start it.

(2) Configuration

1. Open the C:\filebeat folder, find the filebeat.yml configuration file, and open it

2. Replace the contents of the configuration file with the following, adjusting it as needed (note that since the file is in YAML format, indentation within the same block must be aligned, otherwise the configuration file cannot be read)
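A minimal sketch of such a filebeat.yml for FileBeat 5.x, assuming two log directories (the paths are placeholders to adjust). document_type sets the "type" field that the Logstash filter in Step 6 branches on, and the output port matches the beats input on 5544:

```
filebeat.prospectors:
- input_type: log
  paths:
    - D:\applogs\info\*.TXT    # assumption: directory of info logs
  document_type: info
- input_type: log
  paths:
    - D:\applogs\error\*.TXT   # assumption: directory of error logs
  document_type: error

output.logstash:
  hosts: ["localhost:5544"]
```

Only one output section may be active at a time; make sure the default output.elasticsearch section is removed or commented out.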

(3) Viewing the data

Restart the service, write data into the log directory specified in the configuration file, then open http://localhost:9200/_plugin/head/ or http://localhost:5601 to view the new log data.
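Since Kibana accepts Lucene query syntax, searches over the parsed events might look like the following (assuming the grok patterns define fields such as level, thread and message, as in the Step 6 configuration; the values are examples):

```
type:error
level:INFO AND thread:service_ACARSMonitorNew0
message:"22 pieces of data"
```

These can be entered directly into the Kibana search bar once an index pattern matching test-* has been configured.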

Attached:

Test data (2017120713.TXT)

2017-12-07 13:00:24,330 [service_FlightInfoDeptDateCalculater0] INFO DispatchAssist.ACARSMonitorNew.FlightInfoDeptDateCalculater [Save] : A total of 22 pieces of data need to be updated
2017-12-07 13:00:25,220 [service_ACARSMonitorNew0] INFO DispatchAssist.ACARSMonitorNew ] : M11 starts parsing.......QU SHAITMU .BJSXCXA 070500 M11 FI MU2412/AN B-1018 DT BJS LHW 070500 M42A
- POS CAS 288,LAT N 38.388,LON E105.944,ALT 25580,FOB 15960,UTC 050023
2017-12-07 13:30:41,718 [service_ACARSMonitorNew0] INFO Ceair.Operations.WindowsServiceClient.Program [SingleThreadTask] : next execution time :2017-12-07 13:30:41
2017-12-07 13:30:41,718 [service_ACARSMonitorNew0] INFO Ceair.Operations.WindowsServiceClient.Program [SingleThreadTask]: Sleep: 30000
