Setting up an ELK environment on Windows 10

When microservice-oriented development spreads logs across multiple systems, tracing a problem across system boundaries becomes very troublesome, so a dedicated tool is needed to collect and categorize logs in one place and make errors easy to find. This post introduces an open-source tool set for this: ELK.

ELK is made up of three open-source tools: Elasticsearch, Logstash, and Kibana. Official downloads: https://www.elastic.co/cn/downloads/

 

Preparing the environment

Client: Windows 10

Java environment: JDK 1.8 (must be 1.7 or later!)

Component Information:

    logstash: log collection and filtering tool. Download: https://artifacts.elastic.co/downloads/logstash/logstash-7.3.0.zip

    kibana: provides search, visualization, and interaction with the data stored in Elasticsearch indices. Download: https://artifacts.elastic.co/downloads/kibana/kibana-7.3.0-windows-x86_64.zip

    elasticsearch: open-source distributed search engine. Download: https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-7.3.0-windows-x86_64.zip

    elasticsearch-head: a web front-end plugin for operating and managing an Elasticsearch cluster. Download: https://github.com/mobz/elasticsearch-head/releases

Step 1

Download the components and unzip them to a local disk (make sure the extraction path contains no spaces, otherwise later installation steps may fail).


Step 2

Install JDK 1.8 and configure the environment variables (set JAVA_HOME and add the JDK's bin directory to PATH).
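To confirm that the JDK is visible to PowerShell, the installed version can be checked (a quick sanity check; the exact output depends on your build):

    java -version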

 

Step 3

Install Logstash: create a new logstash.conf configuration file in the bin folder of the Logstash directory, then start Logstash from PowerShell with this file (a start command is sketched after the configuration below).

 

The contents of the file are as follows:

input {
  file {
    path => "D:/ELK/logs/*"              # log storage directory
    start_position => "beginning"
    codec => plain { charset => "GBK" }
  }
}

filter {
  mutate {
    add_field => [ "[fields][path]", "%{[path]}" ]
    add_field => [ "[message]", "%{[message]}" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://127.0.0.1:9200"]
    index => "logstash-%{type}-%{+YYYY.MM.dd}"
    template_overwrite => true
  }
}
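With the file saved, Logstash can be started from PowerShell in the Logstash bin folder, for example (the extraction path below is only an example; adjust it to wherever Logstash was unzipped):

    cd D:\ELK\logstash-7.3.0\bin        # example path
    .\logstash.bat -f .\logstash.conf   # start Logstash with the configuration above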

 

Step 4

Install Elasticsearch. To simplify operation, add the elasticsearch-7.3.0/bin path to the environment variables so that the elasticsearch command can be entered from anywhere in PowerShell. Start it with the default configuration (the default transport port is 9300, and the port that accepts HTTP requests is 9200), then open http://localhost:9200/ in a browser; a JSON response with the node and cluster information shows that the installation and startup were successful.
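For example, started and checked from PowerShell (assuming elasticsearch-7.3.0/bin has been added to PATH as described above):

    elasticsearch                            # starts a node with the default configuration
    # in a second PowerShell window, query the HTTP port:
    Invoke-RestMethod http://localhost:9200/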

 

Step 5

Install elasticsearch-head. Node.js needs to be installed first (https://nodejs.org/en/download/); then run the following commands in order from the extracted elasticsearch-head directory:

npm install -g grunt-cli    # install the grunt command-line build tool
npm install                 # install the dependencies
grunt server                # start the service
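By default grunt server serves the head UI on http://localhost:9100. Because the UI calls Elasticsearch directly from the browser, it is often also necessary (this extra step is an assumption about your setup, not taken from the original post) to enable CORS in elasticsearch-7.3.0/config/elasticsearch.yml and restart Elasticsearch:

    # elasticsearch.yml: allow the head UI (a different origin) to call the REST API
    http.cors.enabled: true
    http.cors.allow-origin: "*"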

 

Step 6

Install Kibana. To simplify operation, add the kibana-7.3.0-windows-x86_64/bin path to the environment variables so that Kibana can be started by entering the kibana command from anywhere in PowerShell.

Note: when connecting to a remote Elasticsearch instance, the elasticsearch.hosts setting in the config/kibana.yml file needs to be changed.
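A minimal sketch of that setting (the host address below is a made-up placeholder):

    # config/kibana.yml
    elasticsearch.hosts: ["http://192.168.1.100:9200"]

For the all-local setup in this post the default value already points at http://localhost:9200, so no change is needed; Kibana is then started by running kibana (kibana.bat) in PowerShell.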

After a successful start, open http://localhost:5601/ in the browser and the Kibana interface appears.

Then go back to the log directory specified in the logstash.conf from Step 3 and create a new .txt log file with a few lines of test content, for example:
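A sketch of writing such a test file from PowerShell into the watched directory (the file name and message are hypothetical):

    # write a sample line into the directory watched by Logstash (D:\ELK\logs)
    Add-Content -Path "D:\ELK\logs\test.txt" -Value "2019-08-08 10:00:00 INFO test log message for ELK"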


After the log file is saved, visit http://localhost:5601/app/kibana#/management/kibana/index_pattern in the browser and a new index pattern for the logs can be created (for example, logstash-* matches the indices written by the Step 3 configuration).


Once the index pattern has been created, the log data can be viewed at http://localhost:5601/app/kibana#/discover.

 


Origin www.cnblogs.com/dlianghui/p/elk.html