Linux log collection hands-on experiment commands (step-by-step beginner guide)

♥️Author: Xiao Liu on CSDN

♥️Personal homepage: Xiao Liu's homepage

♥️I share cloud computing and network operations classroom notes every day. Hard work may not always pay off, but there is always something to gain. Keep going! Let's work hard together for a better life!

♥️The most beautiful bloom comes under the setting sun; a tree may grow a thousand feet tall, yet its fallen leaves return to the roots. Life is not easy, and true affection is what matters in this world.

Table of contents

Foreword

Modify the host name: efk

Elasticsearch installation host: 192.168.1.104

1. Install elasticsearch

2. Modify the configuration file

3. Create a data directory and modify permissions

4. Allocate locked memory

5. Install the Google Chrome plug-in locally

Install kibana on the es host

Install filebeat on the es host

Install nginx and httpd-tools on the es host

1. Configure the yum source and install nginx and httpd-tools

2. Start nginx

3. Use the ab stress test tool to test access

4. View the filebeat index and data in the ES browser plugin

5. Add index to kibana

Foreword

This chapter is a step-by-step, hands-on experiment on Linux log collection.

 

Elasticsearch: search engine database that stores the logs
Filebeat: log collection
Kibana: log display

Modify the host name: efk

    hostnamectl set-hostname efk
    bash

Elasticsearch installation host: 192.168.1.104

1. Install elasticsearch:

    Prerequisite: JDK 1.8.0
    Copy elasticsearch-6.6.0.rpm to the virtual machine
    rpm -ivh elasticsearch-6.6.0.rpm
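Before moving on, a quick sanity check that the prerequisite JDK and the package are in place (a minimal sketch, not part of the original steps):

    # The JDK 1.8 prerequisite should report a 1.8.x version
    java -version
    # The elasticsearch package should appear if the rpm install succeeded
    rpm -qa | grep elasticsearch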

2. Modify the configuration file:

vim /etc/elasticsearch/elasticsearch.yml
node.name: node-1                          # node name
path.data: /data/elasticsearch             # data directory
path.logs: /var/log/elasticsearch          # log directory
network.host: 192.168.8.129,127.0.0.1      # addresses to listen on
http.port: 9200                            # HTTP port
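To review only the active settings after editing, a small check using grep to strip comments and blank lines:

    # Show the non-comment, non-blank lines of the config
    grep -Ev '^#|^$' /etc/elasticsearch/elasticsearch.yml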

3. Create a data directory and modify permissions

mkdir -p /data/elasticsearch
chown -R elasticsearch:elasticsearch /data/elasticsearch/
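A quick optional check that the directory exists and is owned by the elasticsearch user:

    # The owner and group should both be elasticsearch
    ls -ld /data/elasticsearch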

4. Allocate locked memory:

vim /etc/elasticsearch/jvm.options
-Xms1g    # minimum heap size to allocate
-Xmx1g    # maximum heap size to allocate; the official recommendation is half of physical memory, up to at most 32 GB


systemctl daemon-reload
systemctl restart elasticsearch
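After the restart, it is worth confirming that Elasticsearch came up and is listening on port 9200 (a sketch assuming the network.host addresses configured above):

    # Check the service state
    systemctl status elasticsearch
    # Port 9200 should be listening
    ss -lntp | grep 9200
    # A basic HTTP request should return cluster and version info
    curl http://192.168.8.129:9200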

5. Install the Google Chrome plug-in locally

More Tools -- Extensions -- Enable Developer Mode -- Load the plugin package
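If the browser plugin is unavailable, the same information can be pulled from the command line with the _cat APIs (an alternative check, assuming the ES address configured above):

    # Cluster health overview
    curl 'http://192.168.8.129:9200/_cluster/health?pretty'
    # List all indices
    curl 'http://192.168.8.129:9200/_cat/indices?v'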

######################################################################
Install kibana on the es host
(1) Install kibana
rpm -ivh kibana-6.6.0-x86_64.rpm

(2) Modify the configuration file
vim /etc/kibana/kibana.yml
Modify:
server.port: 5601
server.host: "192.168.8.129"
server.name: "db01"        # the hostname of this host
elasticsearch.hosts: [ "http://192.168.1.104:9200" ]    # the address of the ES server, so Kibana can fetch log data
Save and exit

(3) Start kibana
systemctl start kibana
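A quick check that Kibana started and is serving its web interface on the configured port (a sketch assuming the server.host and server.port set above):

    # Check the service state
    systemctl status kibana
    # Port 5601 should be listening (Kibana may take a minute to start)
    ss -lntp | grep 5601
    # Then open http://192.168.8.129:5601 in a browser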

######################################################################
Install filebeat on the es host
1. Install filebeat
cd /data/soft
rpm -ivh filebeat-6.6.0-x86_64.rpm

2. Modify the configuration file
vim /etc/filebeat/filebeat.yml
Modify:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/nginx/access.log

output.elasticsearch:
  hosts: ["192.168.8.129:9200"]

Save and exit

3. Start filebeat
systemctl start filebeat
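Filebeat ships with built-in test subcommands that are useful here (a minimal check, available in Filebeat 6.x):

    # Validate the configuration file syntax
    filebeat test config
    # Confirm Filebeat can reach the Elasticsearch output
    filebeat test output
    # Check the service state
    systemctl status filebeat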

######################################################################
Install nginx and httpd-tools on the es host
1. Configure the yum source and install nginx and httpd-tools
yum -y install epel-release
yum -y install nginx httpd-tools

2. Start nginx

systemctl start nginx

3. Use the ab stress test tool to test access

ab -n 100 -c 20 http://192.168.8.129/
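After the ab run, the nginx access log should contain the generated requests; this is the file Filebeat is shipping to Elasticsearch (a quick check against the path set in filebeat.yml above):

    # Count the requests recorded by nginx
    wc -l /var/log/nginx/access.log
    # Or watch new entries arrive in real time
    tail -f /var/log/nginx/access.log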

4. View the filebeat index and data in the ES browser plugin
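If the browser plugin is not handy, the index can also be confirmed with curl (an alternative check; Filebeat 6.6 names its daily index filebeat-6.6.0-YYYY.MM.dd by default):

    # A filebeat index should appear with a growing document count
    curl 'http://192.168.8.129:9200/_cat/indices?v' | grep filebeat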

5. Add index to kibana

    Management -- Create index pattern
    Discover -- upper right corner -- select Today
    

♥️Your follows are the driving force behind my writing

♥️A like is the greatest recognition for me

♥️This is Xiao Liu. I strive to make every article good. Thank you, everyone!
