Dubbo traceId transparent transmission for full-link log tracing (implemented with Filter and RpcContext)

First, the problems to solve

While using ELK we ran into the following problems:

1. It was impossible to accurately tell which services a given request passed through.
2. Log lines from multiple concurrent requests are printed by different threads and interleave, which makes it hard to read one request's logs in chronological order.

Second, the desired effect

Be able to view the complete log of a request across the whole call chain, without interference from other requests' logs.

Third, hands-on implementation

What needs to be done on the consumer side:

1. Create a global handler (interceptor) that intercepts every request and, before the target interface is called, generates a trace id (traceId) and puts it into both log4j's MDC and the attachment of Dubbo's RpcContext. The handler here is implemented with JFinal; if you use Spring MVC, replace it with an interceptor or an AOP advice, since any mechanism works as long as it generates the traceId and stores it the same way (a Spring MVC sketch follows the code below).

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.log4j.MDC;

import com.alibaba.dubbo.rpc.RpcContext;
import com.jfinal.aop.Interceptor;
import com.jfinal.aop.Invocation;
import com.jfinal.handler.Handler;
import com.jfinal.kit.StrKit;
import com.ttxn.frameworks.utils.CommonUtil;

public class TraceHandler extends Handler {

    @Override
    public void handle(String target, HttpServletRequest request, HttpServletResponse response, boolean[] isHandled) {
        String token = request.getParameter("token");
        String userId = StrKit.notBlank(token) ? CommonUtil.getUserId(token) : "VISITOR";
        String uuid = CommonUtil.getUuid();
        String traceId = userId + "-" + uuid;
        RpcContext.getContext().setAttachment("traceId", traceId);
        MDC.put("traceId", traceId);
        next.handle(target, request, response, isHandled);
    }
    
}
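If your project uses Spring MVC instead of JFinal, a rough equivalent is an interceptor that does the same work in preHandle. This is only a minimal sketch of the idea, not the original author's code: the original uses the project-specific CommonUtil.getUuid()/getUserId() helpers, while the sketch below simply uses java.util.UUID and a fixed "VISITOR" user id.

import java.util.UUID;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.log4j.MDC;
import org.springframework.web.servlet.handler.HandlerInterceptorAdapter;

import com.alibaba.dubbo.rpc.RpcContext;

public class TraceInterceptor extends HandlerInterceptorAdapter {

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception {
        // Build the traceId the same way the JFinal handler does: userId + "-" + uuid.
        // The real user lookup is omitted here; "VISITOR" stands in for anonymous requests.
        String traceId = "VISITOR-" + UUID.randomUUID().toString().replace("-", "");
        // Pass it to the provider through the RPC attachment and to log4j through the MDC.
        RpcContext.getContext().setAttachment("traceId", traceId);
        MDC.put("traceId", traceId);
        return true;
    }
}

Register it through <mvc:interceptors> or a WebMvcConfigurer; everything else in the setup stays the same.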

2. Modify the log4j configuration file and add the trace id variable (%X{traceId}) at the beginning of the log pattern. log4j automatically prints the value stored in the MDC under that key, so every log4j log line on the consumer side will carry the trace id.

# log4j.rootLogger=WARN, stdout, file
log4j.rootLogger=INFO, stdout , file
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%X{traceId}%n%-d{yyyy-MM-dd HH:mm:ss}%n[%p]-[Thread: %t]-[%C.%M()]: %m%n

# Output to the File
log4j.appender.file=org.apache.log4j.DailyRollingFileAppender
log4j.appender.file.DatePattern='_'yyyy-MM-dd'.log'
log4j.appender.file.File=${catalina.base}/logs/ttxn_log.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%X{traceId}%n%-d{yyyy-MM-dd HH:mm:ss}%n[%p]-[Thread: %t]-[%C.%M()]: %m%n

Log4j output on the consumer side after this configuration:

Para: [APP公告]
VISITOR-1e2f6d11ca594ea7af3118567d00f004
2019-07-31 17:37:23
[INFO]-[Thread: DubboServerHandler-192.168.5.15:20884-thread-17]-[com.ttxn.frameworks.annotation.redis.RedisCacheInterceptor.intercept()]: cache not hit key=DATA_CACHE_common.getCfg ,filed=5bm/5pKt55S15Y+w

3. Define a custom Dubbo filter. Why does the consumer side need this filter? Because one interface call may invoke several services, or the same service several times, which means multiple RPC calls, and an attachment in RpcContext is only valid for a single RPC call. The traceId therefore has to be put back into the attachment before every RPC call so that the provider can read it each time. Calling setAttachment in every piece of business code would be far too much trouble, so I put it in a filter, which keeps it transparent to application developers. Create the file META-INF/dubbo/com.alibaba.dubbo.rpc.Filter under src/main/resources with the following content:

dubboRpcFilter=com.ttxn.api.filter.TraceFilter

Modify consumer.xml to apply the filter:

<dubbo:consumer timeout="30000" filter="dubboRpcFilter"/>

The TraceFilter implementation:

package com.ttxn.api.filter;

import com.alibaba.dubbo.rpc.Filter;
import com.alibaba.dubbo.rpc.Invocation;
import com.alibaba.dubbo.rpc.Invoker;
import com.alibaba.dubbo.rpc.Result;
import com.alibaba.dubbo.rpc.RpcContext;
import com.alibaba.dubbo.rpc.RpcException;
import com.jfinal.kit.StrKit;
import com.ttxn.frameworks.utils.trace.TraceIdUtil;

public class TraceFilter implements Filter {

    @Override
    public Result invoke(Invoker<?> invoker, Invocation invocation) throws RpcException {
        String traceId = RpcContext.getContext().getAttachment("traceId");
        if ( !StrKit.isBlank(traceId) ) {
            // Get the traceId from RpcContext and save it in a thread-local
            TraceIdUtil.setTraceId(traceId);
        } else {
            // Re-set the traceId before the call so it is not lost on subsequent RPC calls
            traceId = TraceIdUtil.getTraceId();
            RpcContext.getContext().setAttachment("traceId", traceId);
        }
        Result result = invoker.invoke(invocation);
        return result;
    }

}
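TraceIdUtil itself is not shown in the original post; it only needs to hold the traceId for the current thread. A minimal sketch, assuming a plain ThreadLocal holder, could look like this:

package com.ttxn.frameworks.utils.trace;

public class TraceIdUtil {

    // One traceId per request-handling thread
    private static final ThreadLocal<String> TRACE_ID = new ThreadLocal<String>();

    public static void setTraceId(String traceId) {
        TRACE_ID.set(traceId);
    }

    public static String getTraceId() {
        return TRACE_ID.get();
    }

    // Call this at the end of a request if threads are pooled, to avoid leaking the id
    public static void clear() {
        TRACE_ID.remove();
    }
}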

That completes the work on the consumer side. If, like me, you are on the JFinal framework, there is one more thing to do: customize how JFinal prints its request log, as follows:

import java.io.IOException;
import java.io.Writer;

// use the same MDC class that TraceHandler writes to
import org.apache.log4j.MDC;
/**
 * JFinal request-log writer: receives the content of JFinal's action report
 * and prints it with the current traceId prepended.
 * @author Administrator
 *
 */
public class JFinalActionReportWriter extends Writer {
    
    private static final String LOG_PREFIX = "[START]";
    
    public void write(String str) throws IOException {
        String traceId = MDC.get("traceId");
        System.out.print(LOG_PREFIX + traceId + str);
    }
    public void write(char[] cbuf, int off, int len) throws IOException {}
    public void flush() throws IOException {}
    public void close() throws IOException {}
}

Then add the following to your GlobalConfig:

    public void configConstant(Constants me)
    {
        ActionReporter.setWriter(new JFinalActionReportWriter());
    }

This customizes the content of JFinal's request log output; that part of the log does not go through log4j, so the log4j configuration above does not affect it, which is why this extra step is needed. The output then looks like this:

[START]VISITOR-dafe84f6ee2f4d2b907a4c7ef8f8d20c
JFinal action report -------- 2019-07-31 17:37:14 ------------------------------
Url         : GET /app/subject/searchSubjectV3_4
Controller  : com.ttxn.api.controller.app.SubjectRest.(SubjectRest.java:1)
Method      : searchSubjectV3_4
Interceptor : com.ttxn.api.interceptor.APIExceptionInterceptor.(APIExceptionInterceptor.java:1)
              com.ttxn.frameworks.plugins.spring.IocInterceptor.(IocInterceptor.java:1)
              com.ttxn.api.decorator.CoverImgInterceptor.(CoverImgInterceptor.java:1)
Parameter   : query=

What needs to be done on the provider side:

1. The provider also needs a custom Dubbo filter. Create the configuration file src/main/resources/META-INF/dubbo/com.alibaba.dubbo.rpc.Filter, with content just like on the consumer side:

dubboContextFilter=com.ttxn.webservice.interceptor.dubbo.ContextFilter

Modify provider.xml so the provider applies this filter:

    <!-- Default filter for provider-side calls; it will intercept all services -->
    <dubbo:provider filter="dubboContextFilter"/>

In the custom filter, read the traceId from the RPC attachment and put it into the MDC:

package com.ttxn.webservice.interceptor.dubbo;

import org.apache.log4j.MDC;

import com.alibaba.dubbo.rpc.Filter;
import com.alibaba.dubbo.rpc.Invocation;
import com.alibaba.dubbo.rpc.Invoker;
import com.alibaba.dubbo.rpc.Result;
import com.alibaba.dubbo.rpc.RpcContext;
import com.alibaba.dubbo.rpc.RpcException;
import com.jfinal.kit.StrKit;

public class ContextFilter implements Filter
{
    @Override
    public Result invoke(Invoker<?> invoker, Invocation invocation)
        throws RpcException
    {
        String traceId = RpcContext.getContext().getAttachment("traceId");
        if (StrKit.notBlank(traceId)) {
            MDC.put("traceId", traceId);
        }
        
        Result result = invoker.invoke(invocation);
        return result;
    }


}
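One caveat worth noting (my own note, not from the original post): Dubbo provider threads are pooled, so a traceId left in the MDC can show up in the logs of the next request served by the same thread. A defensive variant of the invoke method clears it in a finally block:

    @Override
    public Result invoke(Invoker<?> invoker, Invocation invocation) throws RpcException {
        String traceId = RpcContext.getContext().getAttachment("traceId");
        if (StrKit.notBlank(traceId)) {
            MDC.put("traceId", traceId);
        }
        try {
            return invoker.invoke(invocation);
        } finally {
            // remove the id so it does not leak into the next request on this pooled thread
            MDC.remove("traceId");
        }
    }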

Modify the provider's log4j file as well, adding the traceId variable (%X{traceId}) to the output pattern:

# log4j.rootLogger=WARN, stdout, file
log4j.rootLogger=INFO, stdout , file
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%X{traceId}%n%-d{yyyy-MM-dd HH:mm:ss}%n[%p]-[Thread: %t]-[%C.%M()]: %m%n

# Output to the File
log4j.appender.file=org.apache.log4j.DailyRollingFileAppender
log4j.appender.file.DatePattern='_'yyyy-MM-dd'.log'
log4j.appender.file.File=${catalina.base}/logs/ttxn_log.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%X{traceId}%n%-d{yyyy-MM-dd HH:mm:ss}%n[%p]-[Thread: %t]-[%C.%M()]: %m%n

With this, the traceId is transparently passed from the consumer side to the provider side. The provider's log4j output looks like this:

Para: [快报]
VISITOR-1e2f6d11ca594ea7af3118567d00f004
2019-07-31 17:37:23
[INFO]-[Thread: DubboServerHandler-192.168.5.15:20884-thread-15]-[com.ttxn.frameworks.annotation.redis.RedisCacheInterceptor.intercept()]: cache not hit key=DATA_CACHE_base.newsList ,filed=15

In Kibana we can now filter by traceId to see every log line of a single request, which makes it much easier to track down bugs.
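For example, assuming your Logstash pipeline extracts the trace id into a field (the pattern above prints it on its own line, so a multiline/grok setup may be needed; this is an assumption, not part of the original post), a Kibana query such as traceId:"VISITOR-1e2f6d11ca594ea7af3118567d00f004" returns only that request's log lines.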


Origin: www.cnblogs.com/powerjiajun/p/11279042.html