docker-compose EFK: viewing docker containers and logs

In the previous post, "docker-compose ELK + Filebeat: viewing docker containers and logs", we demonstrated how to create docker containers with docker-compose and collect all of the docker logs into ELK, using Filebeat to read the docker container log files.

Video source: [elasticsearch 3] How to install EFK Stack using Docker with Fluentd

Reference Code Address: https://github.com/justmeandopensource/elk/tree/master/docker-efk

Now let's use docker-compose with EFK to read the docker container logs.

The content of docker-compose.yml is as follows:

version: '2.2'

services:

  fluentd:
    build: ./fluentd
    container_name: fluentd
    volumes:
      - ./fluentd/conf:/fluentd/etc
    ports:
      - "24224:24224"
      - "24224:24224/udp"

  # Elasticsearch requires your vm.max_map_count set to 262144
  # Default will be 65530
  # sysctl -w vm.max_map_count=262144
  # Add this to /etc/sysctl.conf for making it permanent
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.5.4
    container_name: elasticsearch
    environment:
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - esdata1:/usr/share/elasticsearch/data
    ports:
      - 9200:9200

  kibana:
    image: docker.elastic.co/kibana/kibana:6.5.4
    container_name: kibana
    environment:
      ELASTICSEARCH_URL: "http://elasticsearch:9200"
    ports:
      - 5601:5601
    depends_on:
      - elasticsearch

volumes:
  esdata1:
    driver: local
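
With the three services defined, the stack can be brought up and checked with a couple of commands. This is a minimal sketch assuming the compose file above sits in the current directory; the service and container names come from it.

# Build the custom fluentd image and start fluentd, elasticsearch and kibana in the background
docker-compose up -d --build

# The three containers should show up as running
docker-compose ps

# Elasticsearch answers on port 9200 once it has finished starting
curl http://localhost:9200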

fluentd => Dockerfile

FROM fluent/fluentd
RUN ["gem", "install", "fluent-plugin-elasticsearch", "--no-rdoc", "--no-ri"]
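
This Dockerfile only adds the fluent-plugin-elasticsearch gem on top of the official fluent/fluentd image, so fluentd can write to Elasticsearch. If you want to double-check that the plugin was installed, something like the following should work once the stack is running (the container name fluentd comes from the compose file):

# List the installed gem inside the running fluentd container
docker exec fluentd gem list fluent-plugin-elasticsearch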

fluentd => fluent.conf

<source>
  @type forward
  port 24224
</source>

# Store Data in Elasticsearch
<match *.**>
  @type copy
  <store>
    @type elasticsearch
    host elasticsearch
    port 9200
    include_tag_key true
    tag_key @log_name
    logstash_format true
    flush_interval 10s
  </store>
</match>
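
The forward source above listens on port 24224, which is exactly what docker's fluentd logging driver speaks. As a quick test, any container's stdout can be shipped into the stack; the tag docker.test below is arbitrary and only for illustration:

# Run a throwaway container whose output goes to fluentd instead of the local json-file log
docker run --rm \
  --log-driver=fluentd \
  --log-opt fluentd-address=localhost:24224 \
  --log-opt tag=docker.test \
  alpine echo "hello efk"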

misc => clients-td-agent.conf

<source>
  @type syslog
  @id input_syslog
  port 42185
  tag centosvm01.system
</source>

<match *.**>
  @type forward
  @id forward_syslog
  <server>
    host <fluentd-ip-address>
  </server>
</match>
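
This config is meant for a separate client machine running td-agent: it listens for syslog on UDP port 42185 and forwards everything to the central fluentd (replace <fluentd-ip-address> with the host running the compose stack). One way to feed it on a client that uses rsyslog (an assumption about the client setup, not shown above) is:

# Forward all local syslog messages to td-agent on UDP port 42185
echo '*.* @127.0.0.1:42185' | sudo tee -a /etc/rsyslog.conf
sudo systemctl restart rsyslog

# Restart td-agent after putting clients-td-agent.conf in place
sudo systemctl restart td-agent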

The overall operation is very simple. Open [HostIP:5601] and you can see that Kibana is up. You can create an index pattern there, but this time the index name differs from the previous ELK setup: it is now logstash-*, and the logs can be read from it as well.
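
If the logstash-* index pattern does not show up right away, you can check from the command line whether fluentd has already created an index in Elasticsearch; with logstash_format true in fluent.conf the indices are named logstash-YYYY.MM.DD:

# List indices; a logstash-* entry means fluentd is writing data
curl 'http://localhost:9200/_cat/indices?v'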

Source Address: https://github.com/ChenWes/docker-efk
