Spring Boot: Outputting Logs to ELK with Log4j2 and Logback


1. ELK Introduction

ELK is short for Elasticsearch, Logstash, and Kibana. Elasticsearch is an open-source distributed search engine that provides data collection, analysis, and storage. Logstash is mainly a tool for collecting, parsing, and filtering logs. Kibana is a web platform that provides analysis and visualization on top of Elasticsearch: it can search the data indexed in Elasticsearch and generate charts across various dimensions. The bottom line: Logstash collects the logs, Elasticsearch stores the data, and Kibana visualizes the logs; combining the three makes centralized log processing very convenient.

2. Environment and Software Preparation

For this demo I am working locally on macOS; the installed software and versions are as follows:

  • Java: 1.8.0_211
  • Elasticsearch: 7.1.0
  • Logstash: 7.1.0
  • Kibana : 7.1.0
  • Spring Boot: 2.1.4.RELEASE

Note: This article mainly demonstrates how to configure a Spring Boot project so that Log4j2 and Logback output logs to ELK, and so that the logs can be correctly retrieved in Kibana. Elasticsearch and the Spring Boot project both need a Java environment, so Java must be installed locally in advance; the Java installation process is skipped here.

3. Setting Up the ELK Environment

We can download the latest versions of Elasticsearch, Logstash, and Kibana for our operating system from the official website; as of this writing they have been updated to version 7.1.0. Each installation package ships with a default configuration and startup scripts.

Here I start the Elasticsearch service with its default configuration; once startup completes, the service can be accessed locally at http://127.0.0.1:9200. Note: do not start Logstash and Kibana yet, because their configuration needs to be changed first, as described below.

4. Spring Boot Configuration Examples

Create a Spring Boot project with IDEA. First we add Log4j2 support and demonstrate how to output logs directly to the local ELK stack with Log4j2; after that we demonstrate how to output a dynamic index name with Logback, which makes it convenient to search logs by category.

4.1. Log4j2 Configuration

First modify pom.xml to add support for the Log4j2 logging framework. Note that spring-boot-starter uses Logback as the logging framework by default, so the default spring-boot-starter-logging dependency must be excluded.

Add the following to pom.xml:

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
        <exclusions>
            <exclusion>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-logging</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-log4j2</artifactId>
    </dependency>
    <dependency>
        <groupId>com.lmax</groupId>
        <artifactId>disruptor</artifactId>
        <version>3.4.2</version>
    </dependency>

Note: Disruptor is a lightweight, high-performance concurrency framework. Log4j2 includes Asynchronous Loggers built on LMAX Disruptor, a high-performance inter-thread messaging library. In a multithreaded environment, the throughput of Asynchronous Loggers is about 18 times that of Log4j 1.x and Logback, with latency lower by an order of magnitude. If you use asynchronous logging, adding the disruptor dependency greatly improves efficiency; of course, it is also fine not to add it.
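Adding the disruptor dependency alone does not switch Log4j2 to asynchronous logging; a context selector must also be set. A minimal sketch, assuming the Log4j2 2.10+ property names, is to put a log4j2.component.properties file on the classpath:

```properties
# Route all loggers through the Disruptor-backed asynchronous implementation
log4j2.contextSelector=org.apache.logging.log4j.core.async.AsyncLoggerContextSelector
```

Without this property, Log4j2 keeps using synchronous loggers even when disruptor is on the classpath.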

Then add a log4j2-spring.xml configured to output to ELK, roughly as follows:

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="OFF" monitorInterval="60">
    <Appenders>
        <!-- Console log: only outputs messages at the configured level and above, with colored output per level -->
        <Console name="Console" target="SYSTEM_OUT">
            <!-- The console only accepts messages at this level and above (onMatch); everything else is rejected (onMismatch) -->
            <ThresholdFilter level="info" onMatch="ACCEPT" onMismatch="DENY"/>
            <PatternLayout pattern="%highlight{%d{yyyy.MM.dd 'at' HH:mm:ss z} %-5level %class{36} %M() @%L - %msg%n}{FATAL=Bright Red, ERROR=Bright Magenta, WARN=Bright Yellow, INFO=Bright Green, DEBUG=Bright Cyan, TRACE=Bright White}"/>
        </Console>
        <!-- Socket log: sends logs to Logstash for collection (an appender takes a single layout, so JsonLayout is used here) -->
        <Socket name="Socket" host="127.0.0.1" port="4560" protocol="TCP">
            <JsonLayout properties="true" compact="true" eventEol="true"/>
        </Socket>
    </Appenders>
    <Loggers>
        <Root level="INFO">
            <AppenderRef ref="Socket"/>
            <AppenderRef ref="Console"/>
        </Root>
    </Loggers>
</Configuration>

Note: The Socket appender configured here outputs logs to the local Logstash for collection; after formatting, they are automatically forwarded to the local Elasticsearch for storage, and finally Kibana retrieves the index and displays it on a web page. You need to specify host, port, and protocol; configure them to match the address and port of the local Logstash.

At the same time, the log output level can be configured in application.properties. Note that you do not need to specify the log4j2-spring.xml file there; Spring Boot loads this configuration file by default.

logging.level.root=info

Finally, write some specific log and exception output in a Controller, to make verification in Kibana convenient.

@RestController
@RequestMapping("/test")
public class LogController {

    private Logger logger = LogManager.getLogger(LogController.class);

    @RequestMapping(value = "/log4j2", method = RequestMethod.GET)
    public String testLog() {
        try {
            logger.info("Hello 这是 info message. 信息");
            logger.error("Hello 这是 error message. 报警");
            logger.warn("Hello 这是 warn message. 警告");
            logger.debug("Hello 这是 debug message. 调试");
            logger.fatal("Hello 这是 fatal message. 严重");
            List<String> list = new ArrayList<>();
            System.out.println(list.get(2));
        } catch (Exception e) {
            logger.error("testLog", e);
        }
        return "";
    }
}
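The list.get(2) call in the controller is deliberate: getting index 2 from an empty ArrayList throws IndexOutOfBoundsException, which the catch block then logs as an error, so an exception entry shows up in ELK alongside the plain messages. A standalone sketch of that failure:

```java
import java.util.ArrayList;
import java.util.List;

public class BoomDemo {
    public static void main(String[] args) {
        List<String> list = new ArrayList<>();
        try {
            list.get(2);  // empty list: index 2 is out of bounds
        } catch (IndexOutOfBoundsException e) {
            // in the controller this is logger.error("testLog", e)
            System.out.println("caught: " + e.getClass().getSimpleName());
        }
    }
}
```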

OK, the Spring Boot project's Log4j2 support is now configured. Next we need to configure Logstash and Kibana. In the config directory under the Logstash installation directory, create a test-log4j2.conf configuration file as follows:

input {
  tcp {
    host => "127.0.0.1"
    port => "4560"
    mode => "server"
    type => json
  }
  stdin {}
}
filter {
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    action => "index"
    codec => rubydebug
    index => "log4j2-%{+YYYY.MM.dd}"
  }
}

Note: The tcp host and port here must match the log4j2-spring.xml configuration above, otherwise logs may not be collected. The hosts setting of the elasticsearch output must match the local Elasticsearch started earlier. The index is fixed to the log4j2-yyyy.MM.dd format, which makes it convenient to retrieve the index in Kibana. Start Logstash with this configuration file using the following commands:

$ cd <Logstash_path>/bin
$ ./logstash -f ../config/test-log4j2.conf
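As an aside, the %{+YYYY.MM.dd} part of the index setting is expanded by Logstash from each event's @timestamp, which yields one index per calendar day. A rough Java sketch of the resulting index names (my own illustration, not Logstash code):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DailyIndex {
    // Approximates how Logstash's index => "log4j2-%{+YYYY.MM.dd}"
    // names one Elasticsearch index per calendar day.
    static String dailyIndex(String prefix, LocalDate day) {
        return prefix + "-" + day.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) {
        System.out.println(dailyIndex("log4j2", LocalDate.of(2019, 12, 12)));  // log4j2-2019.12.12
    }
}
```

Daily indices keep each index small and make it easy to drop old log data by deleting whole indices.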

Finally, start Kibana. The default configuration can be used, but here I slightly modified some settings as follows:

$ vim <Kibana_path>/config/kibana.yml

server.port: 5601
server.host: "127.0.0.1"
elasticsearch.hosts: ["http://127.0.0.1:9200"]
i18n.locale: "zh-CN"

Likewise, the elasticsearch.hosts setting here must match the configuration above. The Kibana start port is set to 5601, and the default English display is changed to Chinese for easier viewing. Start Kibana and wait until its log output shows status Green, which means startup is complete.

Everything is ready. Finally, start the Spring Boot project and hit the /test/log4j2 endpoint to produce various types of logs, then check on the Kibana web page whether they load correctly!

Open the Kibana page in a browser at http://127.0.0.1:5601. First, look at Elasticsearch index management to see whether an index in the log4j2-yyyy.MM.dd format configured above already exists.

OK, it shows the index already exists. Next we create an index pattern in Kibana; entering log4j2-* correctly matches the specified index in Elasticsearch. Then select @timestamp as the field for time filtering, which helps us filter data by time period later. The creation process is as follows:

Once created, we can filter and display logs in Kibana. For example, after I added the message field as a filter, it shows the various log types and the exception log from the sample code above. Very easy and intuitive!

4.2. Logback Configuration

Above we used the Log4j2 logging framework to correctly output logs to ELK, but there is one thing to note: the Elasticsearch index specified when starting Logstash is a fixed value (log4j2-*). If several projects output logs to ELK at the same time using the same index name, the logs get mixed together, which makes it inconvenient to distinguish and troubleshoot each project's logs. So we would like to output a dynamic index name to Elasticsearch: for example, project A goes to the index projectA-* and project B goes to projectB-*, which helps us match the index corresponding to each project in Kibana. Naturally, this is easy to do with Spring Boot's default logging framework, Logback.

So let's rework the project to use the Logback logging framework. First modify the pom.xml configuration as follows:

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
        <!-- <exclusions> -->
        <!--     <exclusion> -->
        <!--         <groupId>org.springframework.boot</groupId> -->
        <!--         <artifactId>spring-boot-starter-logging</artifactId> -->
        <!--     </exclusion> -->
        <!-- </exclusions> -->
    </dependency>
    <dependency>
        <groupId>net.logstash.logback</groupId>
        <artifactId>logstash-logback-encoder</artifactId>
        <version>5.3</version>
    </dependency>

Note: Here the previously excluded spring-boot-starter-logging is added back, and we add a dependency on the logstash-logback-encoder plugin, which sends logs to Logstash over a TCP socket.

Then add a logback-spring.xml configuration file as follows:

<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="false">
    <property name="LOG_HOME" value="logs/demo.log" />
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50} - %msg%n</pattern>
        </encoder>
    </appender>
    <appender name="logstash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>127.0.0.1:4560</destination>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder">
            <customFields>{"appname": "demo-elk"}</customFields>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="STDOUT" />
        <appender-ref ref="logstash" />
    </root>
</configuration>

Note: The destination here must still match the local Logstash configuration. The <customFields>{"appname": "demo-elk"}</customFields> setting deserves emphasis: with this custom field configured, every log record that Logstash collects carries the field, and the Logstash configuration file can read it as a variable. This is exactly how we achieve dynamically outputting the index name to Elasticsearch. As before, application.properties does not need to specify the logback-spring.xml file; Spring Boot loads it by default. Next, add a test-logback.conf configuration file under the Logstash config directory as follows:

input {
  tcp {
    host => "127.0.0.1"
    port => "4560"
    mode => "server"
    type => json
  }
  stdin {}
}
filter {
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    action => "index"
    codec => rubydebug
    index => "%{[appname]}-%{+YYYY.MM.dd}"
  }
}

Note: The %{[appname]} here reads the key value from the JSON in the <customFields> field above. We only passed one appname value, but of course other values such as the IP or hostname can also be passed, making it easier to distinguish indices when searching in Kibana. Restart Logstash with this configuration file:

$ cd <Logstash_path>/bin
$ ./logstash -f ../config/test-logback.conf
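To illustrate what Logstash does with %{[appname]}, here is a toy sketch of the field-reference substitution (my own illustration, not Logstash's actual implementation): references of the form %{[field]} in the index pattern are replaced with the matching field value from each log event.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FieldRef {
    // Toy version of Logstash's sprintf substitution: replaces %{[field]}
    // in an index pattern with the field's value from the log event.
    static String resolve(String pattern, Map<String, String> event) {
        Matcher m = Pattern.compile("%\\{\\[(\\w+)\\]\\}").matcher(pattern);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String value = event.get(m.group(1));
            // leave the reference untouched if the event lacks the field
            m.appendReplacement(sb, Matcher.quoteReplacement(value != null ? value : m.group()));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> event = new HashMap<>();
        event.put("appname", "demo-elk");  // the customFields value sent by LogstashEncoder
        System.out.println(resolve("%{[appname]}-2019.12.12", event));  // demo-elk-2019.12.12
    }
}
```

The "leave it untouched" branch also hints at the failure mode described next: if the event never gains an appname field, the literal reference ends up in the index name.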

Elasticsearch and Kibana do not need to be restarted. Start the Spring Boot project again and check in Kibana whether an index in the demo-elk-yyyy.MM.dd format configured above now exists in Elasticsearch index management.

What? Why wasn't the appname value picked up? The reference went into the Elasticsearch index name verbatim, yet in the Logstash console log I could clearly see that every printed JSON record contained the field! After much searching, I found everyone else configuring it the same way; even changing type => json to codec => json still didn't work! While puzzling over it, I checked the logstash-logback-encoder documentation, which clearly states that codec => json_lines must be used. Fine! Modify test-logback.conf as follows and restart Logstash:

input {
  tcp {
    host => "127.0.0.1"
    port => "4560"
    mode => "server"
    codec => json_lines
  }
  stdin {}
}
filter {
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    action => "index"
    index => "%{[appname]}-%{+YYYY.MM.dd}"
  }
}

Now it works without a hitch. Check Elasticsearch index management again, and this time the index is there.

Then create an index pattern named demo-elk-* and check the log records to see whether the project's logs load correctly. Again, no problems at all.


Origin www.cnblogs.com/zmsn/p/12030679.html