Building an ELK log analysis system on Ubuntu (Elasticsearch + Logstash + Kibana)

Install the JDK environment in advance. Download the JDK 8 installation package from: http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html

1. Install the dependency package jdk8:

# sudo mkdir /usr/lib/jvm
# tar xvzf jdk-8u91-linux-x64.tar.gz -C /usr/lib/jvm/
# vim ~/.bashrc

Append the following at the bottom of the file:

export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_91
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH

Then execute:

# source ~/.bashrc

Run java -version and java; the installation is complete when both produce output.

Download the ELK installation packages from: https://www.elastic.co/downloads/

2. Install Logstash:

# tar xvzf logstash-2.3.3.tar.gz
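The environment variables appended to ~/.bashrc can be sanity-checked with a short shell snippet before relying on them; this is just a sketch, and the paths assume the jdk1.8.0_91 install directory used above:

```shell
# Recreate the ~/.bashrc additions and confirm they compose as intended.
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_91
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH

# The JDK's bin directory should now be on PATH, so java resolves from there.
echo "$PATH" | grep -q "${JAVA_HOME}/bin" && echo "JAVA_HOME bin on PATH"
```

If the check prints nothing, the exports were not sourced into the current shell; re-run source ~/.bashrc.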

Create the logstash-test.conf configuration file in the logstash-2.3.3 directory with the following content:

# cat logstash-test.conf
 
input { stdin { } }
output {
   stdout { codec=> rubydebug }
}

Logstash uses input and output blocks to configure where logs are collected from and where they are sent. In this example, input defines an input called "stdin" and output defines an output called "stdout". Whatever characters we type, Logstash returns them in a structured format: the output is defined as "stdout", and the codec parameter specifies Logstash's output format (here, rubydebug).

Use the following command to start:

# ./bin/logstash agent -f logstash-test.conf
After it starts, whatever you type is echoed back on the console. For example, enter "hello"; if Logstash displays the corresponding JSON-formatted event, the test is successful.
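For reference, a rubydebug event for "hello" looks roughly like the following; the @timestamp and host values here are illustrative and will differ on your machine:

```
{
       "message" => "hello",
      "@version" => "1",
    "@timestamp" => "2016-06-12T08:30:00.000Z",
          "host" => "ubuntu"
}
```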



3. Install Elasticsearch:


# tar xvzf elasticsearch-2.3.3.tar.gz
Modify the configuration file to allow remote access:



# cd elasticsearch-2.3.3/config
# vim elasticsearch.yml

Change the network line to:

network.host: 0.0.0.0
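A minimal elasticsearch.yml sketch with this change in place; cluster.name and node.name are illustrative values, while all three keys are standard Elasticsearch 2.x settings:

```
cluster.name: elk-demo    # illustrative cluster name, any string works
node.name: node-1         # illustrative node name
network.host: 0.0.0.0     # listen on all interfaces so remote hosts can connect
```

Note that binding to 0.0.0.0 makes Elasticsearch reachable from any host that can route to the machine; restrict it to a trusted interface or network in production.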


Start elasticsearch:


# ./bin/elasticsearch -d    # -d runs it in the background


Visit http://<elasticsearch-ip>:9200
If JSON data is returned, the installation succeeded.
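You can also check from the command line with curl; a healthy Elasticsearch 2.3.3 node answers with JSON similar to the following (name and cluster_name will differ on your machine):

```
# curl http://localhost:9200
{
  "name" : "node-1",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "2.3.3",
    ...
  },
  "tagline" : "You Know, for Search"
}
```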


Install the Elasticsearch head plugin:


# cd elasticsearch-2.3.3
# ./bin/plugin install mobz/elasticsearch-head


Access http://<elasticsearch-ip>:9200/_plugin/head

If the page loads, the plugin is installed successfully.




Test whether Elasticsearch and Logstash can be linked:

Create a test file logstash-es-simple.conf in the logstash-2.3.3 installation directory that uses Elasticsearch as the backend. It defines both stdout and elasticsearch as outputs; such "multiple outputs" ensure that the results are displayed on screen and also written to Elasticsearch. The content is as follows:


# cat logstash-es-simple.conf
 
input { stdin { } }
output {
   elasticsearch { hosts => "localhost" }
   stdout { codec=> rubydebug }
}
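As a sketch, the elasticsearch output also accepts an index parameter; the value below is the Logstash 2.x default index pattern, shown explicitly so you know which daily indices to look for in Elasticsearch:

```
output {
   elasticsearch {
      hosts => "localhost"
      index => "logstash-%{+YYYY.MM.dd}"   # default daily index pattern
   }
   stdout { codec => rubydebug }
}
```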
 
hosts is the address of the Elasticsearch host; here both services run on the same machine. Start Logstash:

# ./bin/logstash agent -f logstash-es-simple.conf

Type a few lines, then open http://<elasticsearch-ip>:9200/_search?pretty in a browser. If the events you typed appear in the search results, the link is working.

4. Install kibana:

# tar xvzf kibana-4.5.1-linux-x64.tar.gz

Start it:

# cd kibana-4.5.1-linux-x64
# ./bin/kibana

Access http://<ip>:5601. After creating an index pattern, you can click the "Discover" tab to search and browse the data in Elasticsearch. By default it searches the last 15 minutes of data; the time range can be customized.
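If Elasticsearch is not running on the same host as Kibana, point Kibana at it in config/kibana.yml; elasticsearch.url is a standard Kibana 4.x setting, and the address below is illustrative:

```
# config/kibana.yml
elasticsearch.url: "http://localhost:9200"   # address of your Elasticsearch node
```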
