ELK installation steps

First, prepare the installation environment

1. JDK 1.8 or above
2. elasticsearch-7.7.0
3. logstash-7.7.0
4. kibana-7.7.0
5. CentOS Linux release 7.5.1804 (Core)

I prepared three hosts; I had already built a LAMP stack with Docker, and now I will collect its log information.
192.168.116.128:logstash
192.168.116.129:elasticsearch
192.168.116.130:kibana

Turn off the firewall and SELinux on all three hosts, or add the necessary firewall rules yourself; see the commands below.
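A minimal way to do this on CentOS 7 (assuming firewalld is the active firewall; if you prefer to keep it running, open ports 9200, 9600 and 5601 instead):

[root@localhost ~]# systemctl stop firewalld && systemctl disable firewalld
[root@localhost ~]# setenforce 0
[root@localhost ~]# sed -i 's/^SELINUX=enforcing/SELINUX=disabled/' /etc/selinux/config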

Second, install Elasticsearch

1. The JDK must be installed on all three hosts. I demonstrate on one host here; the JDK is already installed.

[root@localhost src]# rpm -ivh jdk-8u131-linux-x64_.rpm 
Preparing...                          ################################# [100%]
	package jdk1.8.0_131-2000:1.8.0_131-fcs.x86_64 is already installed
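To confirm the JDK is usable from the shell, a quick version check; for this rpm it should report 1.8.0_131 (exact output depends on the build you installed):

[root@localhost src]# java -version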

2. Download Elasticsearch 7.7.0.
Either download it directly on the server, or download it locally and upload it; I downloaded it locally and uploaded it to the server.

[root@localhost src]# ll elasticsearch-7.7.0-linux-x86_64.tar.gz 
-rw-r--r--. 1 root root 314430566 May 19 21:33 elasticsearch-7.7.0-linux-x86_64.tar.gz

3. Extract it to /usr/local and rename it

[root@master-node ~]# tar xf elasticsearch-7.7.0-linux-x86_64.tar.gz -C /usr/local
[root@master-node ~]# cd /usr/local
[root@master-node local]# mv elasticsearch-7.7.0 elasticsearch

4. Create an ordinary user elk to run elasticsearch

[root@master-node /]# groupadd elk
[root@master-node /]# useradd -g elk -m elk
[root@master-node local]# chown -R elk.elk /usr/local/elasticsearch/
[root@master-node local]# ll /usr/local/elasticsearch/
total 436
drwxr-xr-x  3 elk elk   4096 Oct 11 22:21 bin
drwxr-xr-x  2 elk elk    148 Sep 26 21:38 config
drwxr-xr-x  3 elk elk   4096 Sep 26 21:38 lib
-rw-r--r--  1 elk elk  13675 Sep 26 21:30 LICENSE.txt
drwxr-xr-x  2 elk elk      6 Sep 26 21:38 logs
drwxr-xr-x 27 elk elk   4096 Sep 26 21:38 modules
-rw-r--r--  1 elk elk 401465 Sep 26 21:38 NOTICE.txt
drwxr-xr-x  2 elk elk      6 Sep 26 21:38 plugins
-rw-r--r--  1 elk elk   8511 Sep 26 21:30 README.textile

5. Create a data directory for Elasticsearch and give the elk user ownership of it

[root@master-node ~]# mkdir -p /data/elasticsearch
[root@master-node ~]# chown -R elk.elk /data/elasticsearch

6. Modify the Elasticsearch configuration file

[root@localhost src]# cd /usr/local/elasticsearch/config/
[root@localhost config]# cat elasticsearch.yml | grep ^[^#]
cluster.name: ELK-Cluster
node.name: node-1
path.data: /data/elasticsearch
path.logs: /usr/local/elasticsearch/logs
network.host: 192.168.116.129
http.port: 9200
cluster.initial_master_nodes: ["node-1"]
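Since this walkthrough runs only one Elasticsearch node, an optional simplification in 7.x is single-node discovery; if you use it, drop the cluster.initial_master_nodes line rather than combining the two (this is an alternative, not what the configuration above uses):

discovery.type: single-node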

7. Modify the relevant kernel parameters

Increase vm.max_map_count here; set it a bit higher according to the error message reported at startup.

[root@localhost config]# echo "vm.max_map_count=262144" >> /etc/sysctl.conf
[root@localhost config]# sysctl -p
[root@localhost config]#  vim /etc/security/limits.conf
* soft nproc 65536
* hard nproc 65536
* soft nofile 65536
* hard nofile 65536
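After editing, you can confirm the values took effect; the limits.conf changes only apply to new login sessions, so check from a fresh shell for the elk user:

[root@localhost config]# sysctl vm.max_map_count
vm.max_map_count = 262144
[root@localhost config]# su - elk -c 'ulimit -n'
65536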

8. Switch to the elk user and start Elasticsearch in the background

[root@master-node config]# su - elk
[elk@master-node ~]$ cd /usr/local/elasticsearch/
[elk@master-node elasticsearch]$ ./bin/elasticsearch -d
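If the process exits or the curl check below fails, the startup errors land in a log file named after cluster.name (given the path.logs setting above):

[elk@master-node elasticsearch]$ tail -n 50 /usr/local/elasticsearch/logs/ELK-Cluster.log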

9. Check the status of Elasticsearch; output like the following means it is running normally.

[root@localhost config]# curl http://192.168.116.129:9200
{
  "name" : "node-1",
  "cluster_name" : "ELK-Cluster",
  "cluster_uuid" : "UWxJP8whTXuvr7Vdn1Hl0A",
  "version" : {
    "number" : "7.7.0",
    "build_flavor" : "default",
    "build_type" : "tar",
    "build_hash" : "81a1e9eda8e6183f5237786246f6dced26a10eaf",
    "build_date" : "2020-05-12T02:01:37.602180Z",
    "build_snapshot" : false,
    "lucene_version" : "8.5.1",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}
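For a quick look at cluster health as well (a single node usually reports green until indices with replicas exist, after which it turns yellow):

[root@localhost config]# curl http://192.168.116.129:9200/_cluster/health?pretty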

Third, install Logstash

Note: Install on the host where you want to collect logs

1. Unzip it to the /usr/local directory and rename it to logstash

[root@master-node ~]# tar xf logstash-7.7.0.tar.gz -C /usr/local/
[root@master-node ~]# cd /usr/local/
[root@master-node local]# mv logstash-7.7.0 logstash

2. Create the pipeline configuration file

[root@localhost ~]# cd /usr/local/logstash/config/
[root@localhost config]# cat apache.conf 
input {
    file {
        path => "/data/docker/httpd/logs/other_vhosts_access.log"
        type => "apache-log"
        start_position => "beginning"
    }
}

output {
    elasticsearch {
        hosts => "192.168.116.129:9200"
        index => "apache_log-%{+YYYY.MM.dd}"
    }
}
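Before starting it for real, the pipeline syntax can be validated with Logstash's built-in config test:

[root@localhost ~]# /usr/local/logstash/bin/logstash -f /usr/local/logstash/config/apache.conf --config.test_and_exit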

3. Run Logstash with this configuration file

[root@localhost ~]# /usr/local/logstash/bin/logstash -f /usr/local/logstash/config/apache.conf &
[root@localhost ~]# netstat -nlpt | grep 9600
tcp6       0      0 127.0.0.1:9600          :::*                    LISTEN      39739/java
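Once some log lines have been shipped, the corresponding index (named apache_log-<date> per the output section) should show up on the Elasticsearch host:

[root@localhost ~]# curl http://192.168.116.129:9200/_cat/indices?v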

Fourth, install Kibana

1. Unzip to /usr/local and rename it to kibana

[root@master-node ~]# tar xf kibana-7.7.0-linux-x86_64.tar.gz -C /usr/local
[root@master-node ~]# cd /usr/local/
[root@master-node local]# mv kibana-7.7.0-linux-x86_64 kibana

2. Modify the configuration file

[root@localhost ~]# cd /usr/local/kibana/config/
[root@localhost config]# cat kibana.yml | grep ^[^#]
server.port: 5601
server.host: "192.168.116.130"
elasticsearch.hosts: ["http://192.168.116.129:9200"]
logging.dest: /var/log/kibana.log
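Before starting Kibana, it is worth confirming that this host can reach Elasticsearch (the same curl check as in the Elasticsearch section, just run from 192.168.116.130):

[root@localhost config]# curl http://192.168.116.129:9200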

3. Create the /var/log/kibana.log file and make it writable

[root@master-node config]#  touch /var/log/kibana.log
[root@master-node config]# chmod 777 /var/log/kibana.log

4. Enter the bin directory under the installation directory and start Kibana

[root@master-node kibana]# cd bin/
[root@master-node bin]# ./kibana --allow-root &

Startup takes a while; be patient.
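Once it is up, Kibana listens on port 5601 and can be opened in a browser at http://192.168.116.130:5601; a quick check that the port is listening:

[root@master-node bin]# netstat -nlpt | grep 5601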

Origin blog.csdn.net/lq_hello/article/details/107236634