ELK + Kafka log collection and analysis platform

  1. Introduction to ELK and Kafka

    ELK consists of Logstash (collection + analysis), Elasticsearch (search + storage), and Kibana (visual display). Its main purpose is the centralized management and querying of the huge volumes of log data produced by distributed systems, which makes monitoring and troubleshooting much easier and is especially convenient for viewing the logs of microservices;

    • Logstash receives log data from application systems, performs filtering, analysis, formatting into a unified structure, and other processing, then writes the result to Elasticsearch. Logstash supports many kinds of log input: reading from a Kafka message queue, reading log files from directories on disk, reading from a Redis queue, and so on;

    • Elasticsearch stores the log data. It is a distributed search engine that is highly scalable, highly reliable, and easy to manage; it can be used for full-text search, structured search, and analytics, and the three can be combined;

    • Kibana presents the log data stored in Elasticsearch: data visualization, reports, and real-time dashboards;

            ELK by itself makes a simple logging system, but Logstash consumes a relatively large amount of system resources (CPU and memory usage are high while it runs), and without a message queue as a buffer there is a risk of data loss, so it is only suitable for environments with small data volumes;

            Apache Kafka is a piece of messaging middleware: a distributed messaging system based on publish/subscribe. It provides a unified, high-throughput, low-latency platform for processing real-time data, and offers a distributed, partitioned, redundantly replicated, persistent log service;

            Unlike Redis, which is often used as a lightweight message queue, Kafka keeps its queue on disk, so buffered messages are not a concern. Moreover, when Redis is used as a cluster, each application is tied to its own Redis instance, which causes a degree of data skew and can in turn lead to data loss. Replacing Redis queues with Kafka, even in environments with small data volumes, therefore greatly improves efficiency and cost-effectiveness;

  2. Building the ELK environment with Docker

    Setting up the Docker environment is covered in: Docker Quick Start and Installation

    ELK Docker Compose on GitHub: https://github.com/deviantony/docker-elk
    (0) The local environment is CentOS 7; other environments are similar. Install the Docker environment (skip this step if it is already installed);
    (1) Search GitHub for docker-elk (download link above) and clone or download it to the Linux machine; due to machine limitations, this article only installs the single-node version (the clone command is sketched below);
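    For reference, a sketch of fetching the repository with git (downloading the ZIP, which unpacks to docker-elk-master as referenced later in this article, works just as well):

    # Clone the docker-elk repository (or download and unzip it from the link above)
    git clone https://github.com/deviantony/docker-elk.git
    cd docker-elk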

     (2) Walk through docker-compose.yml;
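    If you want a quick shell-level summary of what the compose file defines (it should list the elasticsearch, logstash, and kibana services whose ports appear in step (5) below):

    # List the services defined in docker-compose.yml
    docker-compose config --services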

     

     (3) Enter the directory containing docker-compose.yml and use the following commands to download and start the containers (the first download takes quite a while);

    # Launch the containers in the foreground so the console output is easy to debug;
    # recommended the first time you download and start
    docker-compose up
    # Run the containers in the background
    docker-compose up -d
    # For other commands, see docker-compose --help
    # To shut down safely:
    # docker-compose down -v

      Note: if you get -bash: docker-compose: command not found, resolve it as follows:

    # 1. Check whether pip is already installed
    pip -V
    # 2. If pip is installed, skip this step; if you get -bash: pip: command not found, install it:
    yum -y install epel-release
    yum -y install python-pip
    pip install --upgrade pip
    # 3. Install docker-compose
    pip install docker-compose
    # 4. Check whether the installation succeeded
    docker-compose --version
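    Once docker-compose is available and the stack is coming up, the container status can be checked from another shell:

    # List the containers of this compose project and their current state
    docker-compose ps
    # Follow the startup logs of a single service, e.g. Elasticsearch
    docker-compose logs -f elasticsearch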

     (4) A green "done" means the download is complete and the containers have started. Next, open the required ports; I am using Alibaba Cloud here, so the ports have to be opened in the cloud instance's security group rules;

     

     (5) Understand the role of each port, then access the Elasticsearch console and the Kibana console;

    5000: Logstash TCP input port for log collection
    9200: Elasticsearch HTTP console port
    9300: Elasticsearch TCP transport port for cluster communication (heartbeat)
    5601: Kibana console port
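    A quick way to verify these ports from the shell, assuming you are on the Docker host and the default elastic/changeme credentials are unchanged (replace localhost with the server's public IP when testing a cloud instance from outside):

    # Elasticsearch HTTP port: should return cluster information as JSON
    curl -u elastic:changeme http://localhost:9200
    # Kibana status API: should return status JSON once Kibana has finished starting
    curl -u elastic:changeme http://localhost:5601/api/status
    # Logstash TCP input port: should report a successful connection
    nc -zv localhost 5000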

    Elasticsearch console (default username: elastic, password: changeme; change them in docker-compose.yml). The following interface indicates that the service has started successfully:

     Kibana console (default username: elastic, password: changeme; change them in kibana.yml under docker-elk-master/kibana/config). The following interface indicates that the service has started successfully:
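    As a rough end-to-end check at this point (a sketch, assuming the default docker-elk pipeline still has the TCP input on port 5000 and its Elasticsearch output in place):

    # Send a test log line to the Logstash TCP input (using bash's /dev/tcp)
    echo "hello elk" > /dev/tcp/localhost/5000
    # A moment later, search all Elasticsearch indices for it
    curl -u elastic:changeme "http://localhost:9200/_search?q=hello&pretty"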

  3. Building Kafka with Docker

    Kafka Docker Compose on GitHub: https://github.com/wurstmeister/kafka-docker

    (0) The local environment is CentOS 7; other environments are similar. Install the Docker environment (skip this step if it is already installed);
    (1) Search GitHub for docker-kafka (download link above) and clone or download it to the Linux machine; due to machine limitations, only the single-node version is installed (the clone command is sketched below);
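    As with the ELK repository, a sketch of fetching it with git:

    # Clone the kafka-docker repository (or download and unzip it from the link above)
    git clone https://github.com/wurstmeister/kafka-docker.git
    cd kafka-docker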

     

     (2) docker-compose.yml builds a multi-node cluster, which we will not demonstrate here, so we use the single-node file docker-compose-single-broker.yml (see the note below on the advertised host name);
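     A hedged note on this file: the single-broker compose file in this repository normally contains a KAFKA_ADVERTISED_HOST_NAME setting, which usually has to be changed to an address that clients such as Logstash can actually reach (for example the Docker host's IP) before starting:

    # Locate the advertised host name setting, then change it with any editor
    grep -n "KAFKA_ADVERTISED_HOST_NAME" docker-compose-single-broker.yml
    vi docker-compose-single-broker.yml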

     

     (3) Enter the directory containing the compose files and use the following commands to download and start the containers (the first download takes quite a while);

    # Recommended the first time you download and start
    docker-compose -f docker-compose-single-broker.yml up
    # Start in the background
    docker-compose -f docker-compose-single-broker.yml up -d

     (4) A green "done" means the download is complete and the containers have started; check the running status;

    docker-compose ps
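    Once the broker is up, it can be smoke-tested with the Kafka console tools inside the container. This is only a sketch: the container name is an assumption (adjust it to whatever ps shows), the scripts are assumed to be on the container's PATH, older Kafka versions need --zookeeper zookeeper:2181 instead of --bootstrap-server, and newer ones use --bootstrap-server for the producer instead of --broker-list.

    # Find the Kafka container name (kafka-docker_kafka_1 is assumed below)
    docker-compose -f docker-compose-single-broker.yml ps
    # Create the "test" topic that Logstash will subscribe to later
    docker exec -it kafka-docker_kafka_1 kafka-topics.sh --create --topic test \
      --partitions 1 --replication-factor 1 --bootstrap-server localhost:9092
    # Produce a test message to the topic
    docker exec -it kafka-docker_kafka_1 bash -c \
      'echo "hello kafka" | kafka-console-producer.sh --broker-list localhost:9092 --topic test'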

  4. Connecting Kafka to Logstash

    (1) Modify the logstash.conf configuration (under docker-elk-master/logstash/pipeline);

    input {
      #tcp {
      #  port => 5000
      #}
      kafka {
        id => "my_plugin_id"
        bootstrap_servers => "localhost:9092"
        topics => ["test"]
        auto_offset_reset => "latest"
      }
    }

     (2) Restart Logstash;

    docker-compose restart logstash
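     To check the whole chain (a sketch under two assumptions: bootstrap_servers above points at an address the Logstash container can actually reach, which is often the host's IP rather than localhost, and the pipeline's existing Elasticsearch output section has been kept), produce a message to the topic and then look for it in Elasticsearch or Kibana:

    # Produce a message to the "test" topic (adjust the container name as needed)
    docker exec -it kafka-docker_kafka_1 bash -c \
      'echo "hello from kafka to elk" | kafka-console-producer.sh --broker-list localhost:9092 --topic test'
    # Then search for it in Elasticsearch, or view it in Kibana
    curl -u elastic:changeme "http://localhost:9200/_search?q=kafka&pretty"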

     

     




Origin www.cnblogs.com/yimusidian/p/12450515.html