Building ELK locally on Windows 7 to collect application logs

Introduction to ELK

ELK is an abbreviation for Elasticsearch + Logstash + Kibana.

Elasticsearch is a distributed search and analytics engine that supports full-text search, structured search, and analytics, and can combine all three. It is built on Lucene and is now one of the most widely used open source search engines.

Logstash is essentially a pipeline with real-time data transport capabilities: it carries data from the pipeline's input to its output, and lets you insert filters in between as needed. Logstash ships with many powerful filters to cover a wide range of use cases.

Kibana is an open source analytics and visualization platform designed to work with Elasticsearch. You can use Kibana to search, view, and interact with data stored in Elasticsearch indices, and present it with a variety of charts, tables, and maps, making advanced data analysis easy to visualize.

Downloading ELK

You can download the packages from the official download center: https://elasticsearch.cn/download/ (version 6.2.2 is used here; download the zip version of Logstash).
JDK 1.8 must be installed in advance.

Unpacking and starting ELK

Unzip all three archives.

Start Elasticsearch: run elasticsearch.bat in the ..elasticsearch-6.2.2\bin\ directory, then open http://localhost:9200/ in a browser to confirm it is up.

Start Kibana: run kibana.bat in the ..kibana-6.2.2-windows-x86_64\bin\ directory (no configuration changes needed), then visit http://localhost:5601/ in a browser.

Unzip Logstash and create a file named logstash.conf in its config directory:

input {
  tcp {
    # run in server mode
    mode => "server"
    # set host and port to suit your environment; port 4560 here must match
    # the destination in the logback.xml appender below
    host => "localhost"
    port => 4560
    # decode each line as a JSON object
    codec => json_lines
  }
}
filter {
  # add filters here as needed
}
output {
  elasticsearch {
    action => "index"
    # Elasticsearch address; use an array for multiple hosts
    hosts  => "localhost:9200"
    # index name used for filtering in Kibana; the project name works well
    index  => "applog"
  }
}

Open a cmd window in the bin directory and run:

logstash.bat -f ../config/logstash.conf

If an error is reported:

Solution: open logstash.bat in the logstash-6.2.2\bin directory and wrap %CLASSPATH% on line 52 in double quotes ("%CLASSPATH%").

Restart Logstash.

If no error is reported, the startup succeeded, and the ELK log collection stack is in place.
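Before wiring up the application, the tcp input can be smoke-tested with a short standalone class (a hypothetical helper, not part of the project): it builds a one-line JSON event of the shape the json_lines codec expects and, when a host:port argument is given, writes it to the socket.

```java
import java.io.PrintWriter;
import java.net.Socket;
import java.time.Instant;

public class LogstashSmokeTest {

    // Build a minimal one-line JSON event; the json_lines codec expects
    // exactly one JSON object per newline-terminated line.
    static String buildEvent(String message) {
        return String.format(
                "{\"@timestamp\":\"%s\",\"level\":\"INFO\",\"message\":\"%s\"}",
                Instant.now(), message);
    }

    public static void main(String[] args) throws Exception {
        String event = buildEvent("hello from the smoke test");
        System.out.println(event);

        // Only attempt the network call when a host:port argument is given,
        // e.g. "localhost:4560" (the tcp input port from logstash.conf).
        if (args.length == 1) {
            String[] hostPort = args[0].split(":");
            try (Socket socket = new Socket(hostPort[0], Integer.parseInt(hostPort[1]));
                 PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
                out.println(event); // println supplies the terminating newline
            }
        }
    }
}
```

Run it with `java LogstashSmokeTest localhost:4560` while Logstash is up; the event should land in the applog index.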

Create a Spring Boot project and add the following to logback.xml:


    <appender name="stash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:4560</destination>
        <!-- the encoder is required; several implementations are available -->
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder" />
    </appender>

    <root level="TRACE">
        <appender-ref ref="stash" />
    </root>

Note: the destination must match the host and port configured for the tcp input in logstash.conf.
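For reference, a complete minimal logback.xml under these assumptions might look like the sketch below (the console appender, its pattern, and the INFO root level are additions for illustration, not part of the original setup):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <appender name="stash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:4560</destination>
        <!-- the encoder is required; several implementations are available -->
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder" />
    </appender>

    <root level="INFO">
        <appender-ref ref="console" />
        <appender-ref ref="stash" />
    </root>
</configuration>
```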

Add the dependency to pom.xml:

<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>5.1</version>
</dependency>

 


Then add a controller method that logs the request, for example:

    @ApiOperation(value = "View the report list", notes = "Get the report list by user code; reportStatus does not need a value")
    @PostMapping(value = "/list")
    public Wrapper reportList(@RequestBody ReportInfoBaseListDto listDto) {
        log.info("Viewing report list, parameters: ==>>" + listDto);

        try {
            List<Map<String, Object>> list = reportInfoService.queryListByUserInstitutionCode(listDto);

            PageInfo pageInfo = new PageInfo(list);
            return WrapMapper.wrap(Wrapper.SUCCESS_CODE, Wrapper.SUCCESS_MESSAGE, pageInfo);
        } catch (Exception e) {
            log.error("Failed to query the report list", e);
            return WrapMapper.wrap(Wrapper.ERROR_CODE, e.getMessage());
        }
    }
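With the LogstashEncoder, each log.info call is shipped to Logstash as a single JSON event, roughly like the following (field values, including the logger and thread names, are illustrative):

```json
{
  "@timestamp": "2020-04-01T12:00:00.000+08:00",
  "@version": "1",
  "message": "Viewing report list, parameters: ==>>ReportInfoBaseListDto(...)",
  "logger_name": "com.example.controller.ReportController",
  "thread_name": "http-nio-8080-exec-1",
  "level": "INFO",
  "level_value": 20000
}
```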

Here the controller uses Lombok's @Slf4j annotation to provide the log object.

Start the project and call this endpoint.

After running the test request, go back to Kibana, open Management --> Index Patterns, and create an index pattern matching the index name from the Logstash configuration, applog here.

 

In step 2, select "@timestamp" as the time filter field.

Go back to Discover.

The logs are now being collected.

 

 


Origin: blog.csdn.net/qq_27828675/article/details/105248305