Using ELK to build a Docker containerized application log center



Overview

Once an application has been containerized, the next thing to consider is how to collect the logs it prints inside the Docker container for operations and maintenance analysis; a typical example is log collection for Spring Boot applications. This article explains how to use an ELK log center to collect the logs generated by containerized applications, and to query and analyze those logs visually. The architecture is shown in the following figure:

Architecture diagram


Image preparation

  • ElasticSearch image
  • Logstash image
  • Kibana image
  • Nginx image (as a containerized application that produces logs)

Start the Rsyslog service on the Linux host

Modify the Rsyslog service configuration file:

vim /etc/rsyslog.conf

Enable the following three parameters:

# Load the TCP syslog reception module
$ModLoad imtcp
# Listen for TCP syslog messages on port 514
$InputTCPServerRun 514

# Forward all messages (every facility.severity) to local port 4560 over TCP
*.* @@localhost:4560

The intent is simple: have Rsyslog load the imtcp module and listen for TCP syslog messages on port 514, then forward everything it collects to local port 4560, where Logstash will be listening.
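As an aside, the `@@host:port` form in the forwarding rule means TCP (a single `@` would mean UDP). The parts of that rule can be sketched as:

```shell
# Build the rsyslog forwarding rule from its parts (a sketch for illustration)
SELECTOR='*.*'      # facility.severity selector: match all messages
TARGET='localhost'  # host where Logstash runs
PORT=4560           # port of the Logstash syslog input
RULE="${SELECTOR} @@${TARGET}:${PORT}"  # '@@' selects TCP transport
echo "$RULE"        # → *.* @@localhost:4560
```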

Then restart the Rsyslog service:

systemctl restart rsyslog

Check that Rsyslog is now listening:

netstat -tnl



Deploy the ElasticSearch service

docker run -d -p 9200:9200 \
  -v ~/elasticsearch/data:/usr/share/elasticsearch/data \
  --name elasticsearch elasticsearch



Deploy the Logstash service

Create the configuration file ~/logstash/logstash.conf as follows:

input {
  syslog {
    # Receive messages forwarded by the local Rsyslog service
    type => "rsyslog"
    port => 4560
  }
}

output {
  # Index every collected event into ElasticSearch
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
  }
}

With this configuration, Logstash receives the application log data forwarded by the local Rsyslog service on port 4560 and writes it into the ElasticSearch database.

After the configuration is complete, you can start the Logstash container with the following command:

docker run -d -p 4560:4560 \
-v ~/logstash/logstash.conf:/etc/logstash.conf \
--link elasticsearch:elasticsearch \
--name logstash logstash \
logstash -f /etc/logstash.conf



Deploy the Kibana service

docker run -d -p 5601:5601 \
--link elasticsearch:elasticsearch \
-e ELASTICSEARCH_URL=http://elasticsearch:9200 \
--name kibana kibana



Start an Nginx container to produce logs

docker run -d -p 90:80 \
  --log-driver syslog \
  --log-opt syslog-address=tcp://localhost:514 \
  --log-opt tag="nginx" \
  --name nginx nginx

Here the Nginx application logs produced in the Docker container are sent, via the syslog log driver, to the local Rsyslog service on port 514, which in turn forwards them to Logstash for collection.
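Concretely, what travels through this pipeline are syslog-formatted lines. The following is a minimal sketch of the RFC 3164-style line the Docker syslog driver sends for one Nginx access-log entry; the facility/severity values, the PID, and the message text here are assumptions for illustration:

```shell
# Sketch of an RFC 3164 syslog line as emitted by Docker's syslog log driver.
# Assumptions: facility daemon (3), severity info (6); PID and message invented.
FACILITY=3
SEVERITY=6
PRI=$(( FACILITY * 8 + SEVERITY ))        # <PRI> = facility * 8 + severity = 30
TAG="nginx"                               # set via --log-opt tag="nginx"
MSG='172.17.0.1 - - "GET / HTTP/1.1" 200' # a typical Nginx access-log line
LINE="<${PRI}>$(date '+%b %d %H:%M:%S') $(hostname) ${TAG}[123]: ${MSG}"
echo "$LINE"
```

Logstash's syslog input parses these fields (priority, timestamp, host, program tag) out of each line, which is why the program field can be used as a query later in Kibana.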

At this point the log center is built, with a total of four containers working together:

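As an alternative to the individual docker run commands, the four containers could be declared in a single docker-compose.yml. This is a sketch only, reusing the same images, ports, and paths as the commands above (Compose file format v2 assumed, since the setup relies on container links):

```yaml
version: "2"
services:
  elasticsearch:
    image: elasticsearch
    ports: ["9200:9200"]
    volumes:
      - ~/elasticsearch/data:/usr/share/elasticsearch/data
  logstash:
    image: logstash
    command: logstash -f /etc/logstash.conf
    ports: ["4560:4560"]
    volumes:
      - ~/logstash/logstash.conf:/etc/logstash.conf
    links:
      - elasticsearch
  kibana:
    image: kibana
    ports: ["5601:5601"]
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
    links:
      - elasticsearch
  nginx:
    image: nginx
    ports: ["90:80"]
    logging:
      driver: syslog
      options:
        syslog-address: "tcp://localhost:514"
        tag: "nginx"
```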

Experimental verification

  • Open localhost:90 in a browser to reach the Nginx welcome page, and refresh it a few times so that GET request logs are generated in the background

  • Open the Kibana visualization interface at localhost:5601


  • Collect Nginx application logs


  • Query application logs

Enter program=nginx in the query box to query a specific application's logs



Postscript


CodeSheep
