Building an EFK (Elasticsearch + Fluentd + Kibana) log analysis system

My EFK deployment is organized as follows, across a total of 39 VMs. Fluentd tails the log file of each service (marked as APP in the figure); to extract fields from the log lines, you can test your regular expressions on http://fluentular.herokuapp.com/. The parsed records are sent to Elasticsearch and then displayed with Kibana. Each EFK component runs in its own Docker container.
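
For reference, the built-in nginx parser used in the fluent.conf below is roughly equivalent to the following regexp parser; this is the kind of expression you would paste into fluentular to check against a sample log line (the field names are fluentd's defaults for the nginx format, not anything specific to this setup):

<parse>
  @type regexp
  # nginx access-log pattern; each named capture becomes a field of the record
  expression /^(?<remote>[^ ]*) (?<host>[^ ]*) (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^\"]*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$/
  time_format %d/%b/%Y:%H:%M:%S %z
</parse>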


The setup steps are as follows.

1 Setting up Fluentd

1 Create docker-compose.yml with two volume mounts: one for the fluentd configuration file and one for the directory containing the logs.

version: '2'
services:
  fluentd:
    build: .
    expose:
      - 24224
    ports:
      - "24224:24224"
    volumes:
      - /data/conf/fluent.conf:/fluentd/etc/fluent.conf
      - /data/logs/nginx:/data/logs/nginx
    restart: always
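
The host-side paths in the volumes section have to exist before the container starts; a minimal preparation, assuming the same /data layout as above:

mkdir -p /data/conf /data/logs/nginx
# then place the fluent.conf from step 3 at /data/conf/fluent.conf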

2 Create a Dockerfile that installs the elasticsearch and forest plug-ins

FROM fluent/fluentd:v1.12.0-debian-1.0
USER root
# Install the output plugins: elasticsearch ships records to ES, forest templates outputs per tag
RUN ["gem", "install", "fluent-plugin-elasticsearch", "--no-document", "--version", "4.3.3"]
RUN ["gem", "install", "fluent-plugin-forest", "--no-document"]
# The official image normally switches back to "USER fluent" here; root is kept so the
# tail plugin can read the host-mounted log files and write its pos file
USER root
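
To make sure the image builds and both gems are actually installed, a quick sanity check (the image name fluentd_fluentd is the one docker-compose produces here, as seen in the docker ps output later; adjust if yours differs):

docker-compose build
docker run --rm --entrypoint gem fluentd_fluentd list | grep fluent-plugin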

3 Configure fluent.conf

<source>
  @type tail
  path /data/logs/nginx/https-access.log
  pos_file /data/logs/nginx/https-access.log.pos
  # parse each line with the built-in nginx access-log format
  <parse>
    @type nginx
  </parse>
  # "*" derives the tag from the file path, with "/" replaced by "."
  tag *
</source>

<match *.**>
  @type forest
  subtype copy
  <template>
    <store>
      @type elasticsearch
      host XX.XX.XX.XX
      port 9200
    </store>
  </template>
</match>
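
With the store above, the elasticsearch plugin writes everything to its default fluentd index. If you would rather have one index per tailed file, here is a sketch of how the template could be extended; logstash_format and logstash_prefix are standard options of the elasticsearch plugin, and __TAG__ is the placeholder that forest replaces with the matched tag:

<match *.**>
  @type forest
  subtype copy
  <template>
    <store>
      @type elasticsearch
      host XX.XX.XX.XX
      port 9200
      # daily index named after the tag, i.e. <tag>-YYYY.MM.DD instead of the single fluentd index
      logstash_format true
      logstash_prefix __TAG__
    </store>
  </template>
</match>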

2 Setting up Elasticsearch and Kibana

Configure docker-compose.yml

version: '2'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.10.2
    environment:
      - "discovery.type=single-node"
    expose:
      - 9200
    ports:
      - "9200:9200"
  kibana:
    image: kibana:7.10.1
    links:
      - "elasticsearch"
    ports:
      - "80:5601"

3 Start EFK

Run docker-compose up -d in the fluentd and EK directories respectively.

 docker ps
CONTAINER ID        IMAGE                                                  COMMAND                  CREATED             STATUS              PORTS                                NAMES
4941a0198a4f        fluentd_fluentd                                        "tini -- /bin/entr..."   4 days ago          Up 4 days           5140/tcp, 0.0.0.0:24224->24224/tcp   fluentd_fluentd_1
a33dee4a2bdb        kibana:7.10.1                                          "/usr/local/bin/du..."   7 days ago          Up 6 days           0.0.0.0:80->5601/tcp                 efk_kibana_1
dd05f58e2cbe        docker.elastic.co/elasticsearch/elasticsearch:7.10.2   "/tini -- /usr/loc..."   7 days ago          Up 6 days           0.0.0.0:9200->9200/tcp, 9300/tcp     efk_elasticsearch_1
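
Besides docker ps, you can confirm that log records are actually reaching Elasticsearch by listing its indices and peeking at a few documents (run on the Elasticsearch host; with the minimal store configuration above the index is named fluentd by default):

curl 'http://localhost:9200/_cat/indices?v'
curl 'http://localhost:9200/fluentd/_search?size=3&pretty'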

4 Access the server where Kibana is located and create an index pattern

You can then query the logs from Kibana's Discover page.
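
The index pattern can be created in the Kibana UI under Stack Management → Index Patterns, or scripted against Kibana's saved objects API; a sketch, assuming the default fluentd index name and Kibana exposed on port 80 as mapped above (omit timeFieldName if your records carry no @timestamp field):

curl -X POST 'http://<kibana-host>/api/saved_objects/index-pattern/fluentd' \
  -H 'kbn-xsrf: true' -H 'Content-Type: application/json' \
  -d '{"attributes":{"title":"fluentd*","timeFieldName":"@timestamp"}}'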

Origin blog.csdn.net/baidu_31405631/article/details/114132231