How to make log4j generate logs in JSON format

When developing projects in Java, logging is usually an indispensable part of the application. In most cases our log files are plain text, with each entry marked by a level.


The main purpose of logging is to trace problems after they occur and make it easier to find the cause. When the data volume is small, you can query or do simple statistics quickly with shell commands on Linux such as awk and grep. But when the volume grows, and the program itself is distributed, this approach becomes laborious: with 10 machines you have to log in to each one to run your query, which is cumbersome, and Linux commands can be very inefficient on large data sets. At that point a dedicated log analysis tool is needed. The ELK stack is recommended: it is very good at log querying, analysis, and statistics, and, most importantly, it is open source.


ElasticSearch accepts standard JSON-structured data and builds indexes on it directly, but most of the time our log files are plain text, which cannot be inserted into ES as-is unless logstash converts them in between; that, in turn, means maintaining multiple sets of logstash rules, which is also cumbersome. Ideally, the generated log would already be in JSON format, so that logstash can insert it into ES directly without caring about specific business fields, which is more flexible.



log4j has no built-in layout for JSON. A quick word on layouts: a layout is the class in the logging component that renders the final log record as a string. If we need a custom format, we inherit from the Layout class and override its format method to produce the final log output.
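The rendering step itself is just string assembly. As an illustrative sketch (the class and method names below are my own, not from any library), the core of a format(LoggingEvent) override could delegate to a small helper like this one; wiring it into a real subclass of org.apache.log4j.Layout is then a matter of calling it with event.getLevel(), event.getLoggerName(), and event.getMessage():

```java
// Illustrative helper for a custom JSON layout: builds one JSON log
// line from the values a Layout.format(LoggingEvent) override would
// pull off the logging event. Names here are hypothetical.
public class JsonLineBuilder {

    public static String toJsonLine(String level, String logger, String message) {
        return "{\"level\":\"" + escape(level) + "\","
             + "\"logger_name\":\"" + escape(logger) + "\","
             + "\"message\":\"" + escape(message) + "\"}\n";
    }

    // Minimal escaping (backslashes and quotes only); a real
    // implementation would also escape JSON control characters.
    private static String escape(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }

    public static void main(String[] args) {
        System.out.print(toJsonLine("INFO", "TestJson", "log message"));
    }
}
```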

log4j does not support the JSON format directly, but the logstash project provides a supporting library, jsonevent-layout. Although it has not been updated for several years, it still works for simple use: it converts log4j's output into JSON, so logstash can insert the records into ES directly. How do we use it?

GitHub: https://github.com/logstash/log4j-jsonevent-layout

First, add the following dependency to your Maven pom.xml:
<!-- https://mvnrepository.com/artifact/net.logstash.log4j/jsonevent-layout -->
<dependency>
    <groupId>net.logstash.log4j</groupId>
    <artifactId>jsonevent-layout</artifactId>
    <version>1.7</version>
</dependency>




Then configure it in log4j.properties:
# log4j.rootLogger = INFO, console, kafka
log4j.rootLogger=INFO,console

# appender console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.out
log4j.appender.console.layout=net.logstash.log4j.JSONEventLayout
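With that configuration in place, application code does not change at all; an ordinary log4j call is enough. A minimal test class might look like the following (the class name TestJson mirrors the sample output below, but the exact code is my assumption, not from the original article):

```java
import org.apache.log4j.Logger;

// Minimal example; assumes the log4j.properties above is on the
// classpath along with the jsonevent-layout dependency.
public class TestJson {

    private static final Logger LOG = Logger.getLogger(TestJson.class);

    public static void main(String[] args) {
        // Rendered by JSONEventLayout as one JSON object per line.
        LOG.info("log message");
    }
}
```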


The final printed log format is as follows:
{"@timestamp":"2017-09-15T09:08:50.805Z","source_host":"USER-20160722CY","file":"TestJson.java","method":"main","level":"INFO","line_number":"39","thread_name":"main","@version":1,"logger_name":"TestJson","message":"log message","class":"net.logstash.log4j.TestJson","mdc":{}}


In addition to being standard JSON, the log above contains the @timestamp field that ELK specifically expects. Note that this field must be present and its value must be in a date format that ES supports; only then can logstash insert the record into ES directly.
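Since each line is already a complete JSON event, the logstash side stays trivial. A minimal pipeline might look like the following sketch (the file path, hosts, and index name are placeholders of mine, not from the original article):

```
input {
  file {
    path  => "/var/log/app/*.log"
    codec => "json"   # each line is already a complete JSON event
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
}
```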



Summary:

The jsonevent-layout library from the logstash project can convert log4j's output into JSON directly, but its drawback is that it cannot add custom fields to the JSON. For example, you might pass a Map into log.info() and want its key-value pairs rendered into the JSON, or pass a JSON object into info() directly. Sometimes an application needs to add specific fields to the JSON for later targeted statistical analysis, such as a field recording how long a method took. jsonevent-layout cannot do this at present, so we need to write a custom layout to achieve it, which will be shared in a later article.


