Data Analysis: Easily Merge Multi-line Logs

Description of Requirement

A common requirement when using Honghu to process log data is handling logs that contain a stacktrace, for example a 19-line log whose first line is a timestamped message and whose remaining lines are stacktrace frames.

When importing such a log into Honghu, we want Honghu to treat all 19 lines as a single event, instead of splitting it line by line into 19 events, which is the default behavior.

Solution

In Honghu, every event carries the attribute _datatype (data source type), which marks the source and structure of the event. The parsing rules applied during both data import and query analysis are also defined per data source type.

Taking file import as an example: in the data-preview step, we can edit the data source type to configure how the data is parsed.

Click Add Attribute

Select firstline_format as the attribute name

Fill in the attribute value with a regular expression that matches the first line of the multi-line log: ([12]\d{3}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])) ([01]\d|2[0-3]):([0-5]\d):([0-5]\d) (ERROR|INFO|DEBUG).*

  • This matches lines beginning with a YYYY-MM-dd HH:mm:ss timestamp followed by a log level as the first line of an event
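To sanity-check a firstline_format regex before applying it, you can test it against sample lines. A minimal sketch in Python, where the log lines are hypothetical examples (not from the original post):

```python
import re

# The firstline_format regex from above: date, time, then a log level.
FIRSTLINE = re.compile(
    r"([12]\d{3}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])) "
    r"([01]\d|2[0-3]):([0-5]\d):([0-5]\d) (ERROR|INFO|DEBUG).*"
)

# Hypothetical sample lines; only the first should match.
lines = [
    "2023-07-04 10:15:30 ERROR Unexpected exception while handling request",
    "java.lang.NullPointerException: null",
    "    at com.example.Service.handle(Service.java:42)",
]

for line in lines:
    # match() anchors at the start of the line, mirroring first-line detection
    print(bool(FIRSTLINE.match(line)), line)
```

Only the timestamped first line matches; the stacktrace frames do not, so they would be folded into the preceding event.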

After editing, you will see the preview data change: the multi-line log containing the stacktrace is now treated as a single event in Honghu.
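Honghu performs this merging internally. As an illustration of the mechanism only (not Honghu's actual implementation), grouping raw lines into events by a first-line pattern can be sketched as:

```python
import re

# Simplified first-line pattern for illustration.
FIRSTLINE = re.compile(r"[12]\d{3}-\d{2}-\d{2} \d{2}:\d{2}:\d{2} (ERROR|INFO|DEBUG)")

def merge_multiline(lines):
    """Group raw lines into events: a new event starts at each line
    matching the first-line pattern; any other line is appended to
    the current event (e.g. stacktrace frames)."""
    events = []
    for line in lines:
        if FIRSTLINE.match(line) or not events:
            events.append(line)           # start a new event
        else:
            events[-1] += "\n" + line     # continuation line
    return events

lines = [
    "2023-07-04 10:15:30 ERROR request failed",
    "java.lang.NullPointerException: null",
    "    at com.example.Service.handle(Service.java:42)",
    "2023-07-04 10:15:31 INFO retrying",
]
print(len(merge_multiline(lines)))  # 2
```

The four raw lines collapse into two events: the ERROR line plus its two stacktrace frames, and the following INFO line.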

Note that what we are editing here is a built-in data source type. You can click Save As to save it as a separate data source type for subsequent use.

Search the imported data with the corresponding data source type (my_datatype), and the logs containing a stacktrace will be merged into one event according to the configured rule.


Origin: blog.csdn.net/Yhpdata888/article/details/131539018