Java: Dynamically Assembling Druid Multiple Data Sources in Spring Boot

1. Introduction

Recently I planned to build an authentication center service using Spring Boot + FastMybatis with Druid, designed with later expansion of the Druid multi-data-source configuration in mind: one master data source, plus multiple dynamic data sources as auxiliaries. Beyond the database layer, the service is built together with the Shiro security framework.

2. Dependencies

Add the Maven dependencies for Spring Boot, FastMybatis, Druid, and Shiro to pom.xml:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>net.oschina.durcframework</groupId>
    <artifactId>fastmybatis-spring-boot-starter</artifactId>
    <version>${fastmybatis.version}</version>
</dependency>
<dependency>
    <groupId>com.alibaba</groupId>
    <artifactId>druid-spring-boot-starter</artifactId>
    <version>${druid.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.shiro</groupId>
    <artifactId>shiro-core</artifactId>
    <version>${shiro.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.shiro</groupId>
    <artifactId>shiro-spring</artifactId>
    <version>${shiro.version}</version>
</dependency>

3. Data source queue

We use one data source as the master and multiple dynamic data sources as auxiliaries. When a new data source is added later, only its configuration needs to be added; the original structure stays unchanged. To support this, we need our own data source queue to track which dynamic data source is active.

/**
 * Multi-datasource queue (a per-thread stack of datasource names).
 *
 * @author lisk
 */
public class DynamicContextUtils {

    private static final ThreadLocal<Deque<String>> CONTEXT = new ThreadLocal<Deque<String>>() {
        @Override
        protected Deque<String> initialValue() {
            return new ArrayDeque<>();
        }
    };

    /**
     * Get the current thread's datasource.
     *
     * @return datasource name
     */
    public static String peek() {
        return CONTEXT.get().peek();
    }

    /**
     * Set the current thread's datasource.
     *
     * @param dataSource datasource name
     */
    public static void push(String dataSource) {
        CONTEXT.get().push(dataSource);
    }

    /**
     * Remove the current thread's datasource.
     */
    public static void poll() {
        Deque<String> deque = CONTEXT.get();
        deque.poll();
        if (deque.isEmpty()) {
            CONTEXT.remove();
        }
    }

}
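Because the ThreadLocal holds a Deque rather than a single value, nested data source scopes behave like a stack: an inner `push` shadows the outer data source, and `poll` restores it. A minimal stand-alone sketch of that behavior (plain Java, no Spring required; the class name and datasource names are illustrative):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class StackSemanticsDemo {
    // Same per-thread stack idea as DynamicContextUtils
    private static final ThreadLocal<Deque<String>> CONTEXT =
            ThreadLocal.withInitial(ArrayDeque::new);

    public static void main(String[] args) {
        CONTEXT.get().push("master");             // outer @DataSource scope
        CONTEXT.get().push("slave1");             // nested scope shadows the outer one
        System.out.println(CONTEXT.get().peek()); // slave1

        CONTEXT.get().poll();                     // leave the nested scope
        System.out.println(CONTEXT.get().peek()); // master is active again
    }
}
```

This is why a method annotated with one data source can safely call another annotated method: after the inner call's `finally` block polls, the outer data source is back on top.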

4. Data source aspect

First, we define our own annotation, which the aspect can then intercept to load the corresponding dynamic data source.

@Target({ElementType.METHOD, ElementType.TYPE})
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Inherited
public @interface DataSource {

    String value() default "";
}

Next, we intercept the annotation in an @Aspect and push the annotated data source name onto the queue we defined.

@Aspect
@Component
@Order(Ordered.HIGHEST_PRECEDENCE)
public class DataSourceAspect {

    protected Logger logger = LoggerFactory.getLogger(DataSourceAspect.class);

    @Pointcut("@annotation(com.xhl.lk.auth2.datasource.annotation.DataSource)" +
            "|| @within(com.xhl.lk.auth2.datasource.annotation.DataSource)")
    public void dataSourcePointCut() {
    }

    @Around("dataSourcePointCut()")
    public Object around(@NotNull ProceedingJoinPoint point) throws Throwable {
        MethodSignature signature = (MethodSignature) point.getSignature();
        Class<?> targetClass = point.getTarget().getClass();
        Method method = signature.getMethod();

        DataSource targetDataSource = targetClass.getAnnotation(DataSource.class);
        DataSource methodDataSource = method.getAnnotation(DataSource.class);
        if (Objects.nonNull(targetDataSource) || Objects.nonNull(methodDataSource)) {
            // The method-level annotation takes precedence over the class-level one
            String value = Objects.nonNull(methodDataSource) ? methodDataSource.value() : targetDataSource.value();
            DynamicContextUtils.push(value);
            logger.debug("set datasource is {}", value);
        }

        try {
            return point.proceed();
        } finally {
            DynamicContextUtils.poll();
            logger.info("clean datasource");
        }
    }
}
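The aspect's precedence rule, where a method-level @DataSource wins over the class-level one, can be checked with plain reflection. The sketch below is self-contained (it redefines a minimal local copy of the annotation, and the service class and method names are made up for illustration):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class PrecedenceDemo {
    @Retention(RetentionPolicy.RUNTIME)
    @Target({ElementType.METHOD, ElementType.TYPE})
    @interface DataSource { String value() default ""; }

    @DataSource("slave1")
    static class UserService {
        @DataSource("slave2")
        void report() { }
        void save() { }
    }

    // Same lookup order as DataSourceAspect: method annotation first, then class
    static String resolve(Class<?> cls, String methodName) {
        try {
            Method m = cls.getDeclaredMethod(methodName);
            DataSource md = m.getAnnotation(DataSource.class);
            if (md != null) {
                return md.value();
            }
            DataSource cd = cls.getAnnotation(DataSource.class);
            return cd != null ? cd.value() : null;
        } catch (NoSuchMethodException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(resolve(UserService.class, "report")); // slave2: method wins
        System.out.println(resolve(UserService.class, "save"));   // slave1: falls back to class
    }
}
```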

5. Data source properties

Add parameter-mapping classes for the Druid master data source and the dynamic data sources, so each database can be configured and connected through these mappings.

/**
 * Multi-datasource properties.
 *
 * @author lisk
 */
@Data
public class DataSourceProperty {

    private String driverClassName;
    private String url;
    private String username;
    private String password;

    /**
     * Druid default parameters
     */
    private int initialSize = 2;
    private int maxActive = 10;
    private int minIdle = -1;
    private long maxWait = 60 * 1000L;
    private long timeBetweenEvictionRunsMillis = 60 * 1000L;
    private long minEvictableIdleTimeMillis = 1000L * 60L * 30L;
    private long maxEvictableIdleTimeMillis = 1000L * 60L * 60L * 7;
    private String validationQuery = "select 1";
    private int validationQueryTimeout = -1;
    private boolean testOnBorrow = false;
    private boolean testOnReturn = false;
    private boolean testWhileIdle = true;
    private boolean poolPreparedStatements = false;
    private int maxOpenPreparedStatements = -1;
    private boolean sharePreparedStatements = false;
    private String filters = "stat,wall";
}

The dynamic data source properties follow the same shape as the master data source and are read into a map keyed by data source name. The `dynamic` prefix is bound via @ConfigurationProperties.

@Data
@ConfigurationProperties(prefix = "dynamic")
public class DynamicDataSourceProperty {

    private Map<String, DataSourceProperty> datasource = new LinkedHashMap<>();
}

We define multiple data source configurations in the configuration file application.yml:

spring:
    datasource:
        type: com.alibaba.druid.pool.DruidDataSource
        druid:
            driver-class-name: com.mysql.cj.jdbc.Driver
            url: jdbc:mysql://192.168.254.128:3306/sys_xhl?useUnicode=true&characterEncoding=UTF-8&useSSL=false&serverTimezone=Asia/Shanghai
            username: shdxhl
            password: shdxhl
            initial-size: 10
            max-active: 100
            min-idle: 10
            max-wait: 60000
            pool-prepared-statements: true
            max-pool-prepared-statement-per-connection-size: 20
            time-between-eviction-runs-millis: 60000
            min-evictable-idle-time-millis: 300000
            # For Oracle, uncomment the following:
            #validation-query: SELECT 1 FROM DUAL
            #test-on-borrow: true
            #test-while-idle: true
            test-while-idle: true
            test-on-borrow: true
            test-on-return: false
            stat-view-servlet:
                enabled: true
                url-pattern: /druid/*
                #login-username: admin
                #login-password: admin
            filter:
                stat:
                    log-slow-sql: true
                    slow-sql-millis: 1000
                    merge-sql: false
                wall:
                    config:
                        multi-statement-allow: true
## Multi-datasource configuration
dynamic:
  datasource:
    slave1:
        driver-class-name: com.mysql.cj.jdbc.Driver
        url: jdbc:mysql://192.168.254.128:3306/blog_weike?useUnicode=true&characterEncoding=UTF-8&useSSL=false&serverTimezone=Asia/Shanghai
        username: blog
        password: wiloveyou
#    slave2:
#      driver-class-name: org.postgresql.Driver
#      url: jdbc:postgresql://localhost:5432/renren_security
#      username: renren
#      password: 123456

6. Config initialization

In a @Configuration class we initialize the connections for the master data source and the dynamic data sources, and implement dynamic data source switching by extending AbstractRoutingDataSource.

// Override determineCurrentLookupKey to return the key of the datasource to switch to.
public class DynamicDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        return DynamicContextUtils.peek();
    }
}

Create a factory for dynamic data sources that maps the configured properties onto a Druid data source and initializes it:

public class DynamicDataSourceFactory {

    protected static Logger logger = LoggerFactory.getLogger(DynamicDataSourceFactory.class);

    // Build and initialize a Druid datasource from the mapped properties.
    public static DruidDataSource buildDruidDataSource(DataSourceProperty properties) {
        DruidDataSource druidDataSource = new DruidDataSource();
        druidDataSource.setDriverClassName(properties.getDriverClassName());
        druidDataSource.setUrl(properties.getUrl());
        druidDataSource.setUsername(properties.getUsername());
        druidDataSource.setPassword(properties.getPassword());

        druidDataSource.setInitialSize(properties.getInitialSize());
        druidDataSource.setMaxActive(properties.getMaxActive());
        druidDataSource.setMinIdle(properties.getMinIdle());
        druidDataSource.setMaxWait(properties.getMaxWait());
        druidDataSource.setTimeBetweenEvictionRunsMillis(properties.getTimeBetweenEvictionRunsMillis());
        druidDataSource.setMinEvictableIdleTimeMillis(properties.getMinEvictableIdleTimeMillis());
        druidDataSource.setMaxEvictableIdleTimeMillis(properties.getMaxEvictableIdleTimeMillis());
        druidDataSource.setValidationQuery(properties.getValidationQuery());
        druidDataSource.setValidationQueryTimeout(properties.getValidationQueryTimeout());
        druidDataSource.setTestOnBorrow(properties.isTestOnBorrow());
        druidDataSource.setTestOnReturn(properties.isTestOnReturn());
        druidDataSource.setPoolPreparedStatements(properties.isPoolPreparedStatements());
        druidDataSource.setMaxOpenPreparedStatements(properties.getMaxOpenPreparedStatements());
        druidDataSource.setSharePreparedStatements(properties.isSharePreparedStatements());

        try {
            druidDataSource.setFilters(properties.getFilters());
            druidDataSource.init();
        } catch (SQLException e) {
            logger.error("DynamicDataSourceFactory init error", e);
        }
        return druidDataSource;
    }
}

Finally, we register the data source beans in a @Configuration class:

@Configuration
@EnableConfigurationProperties(DynamicDataSourceProperty.class)
public class DynamicDataSourceConfig {

    @Autowired
    private DynamicDataSourceProperty properties;

    @Bean
    @ConfigurationProperties(prefix = "spring.datasource.druid")
    public DataSourceProperty dataSourceProperty() {
        return new DataSourceProperty();
    }

    @Bean
    public DynamicDataSource dynamicDataSource(DataSourceProperty dataSourceProperty) {
        DynamicDataSource dynamicDataSource = new DynamicDataSource();
        dynamicDataSource.setTargetDataSources(getDynamicDataSource());

        // Default (master) datasource
        DruidDataSource defaultDataSource = DynamicDataSourceFactory.buildDruidDataSource(dataSourceProperty);
        dynamicDataSource.setDefaultTargetDataSource(defaultDataSource);

        return dynamicDataSource;
    }

    private Map<Object, Object> getDynamicDataSource() {
        Map<String, DataSourceProperty> dataSourcePropertyMap = properties.getDatasource();
        Map<Object, Object> targetDataSources = new ConcurrentHashMap<>(dataSourcePropertyMap.size());
        dataSourcePropertyMap.forEach((k, v) -> {
            DruidDataSource druidDataSource = DynamicDataSourceFactory.buildDruidDataSource(v);
            targetDataSources.put(k, druidDataSource);
        });
        return targetDataSources;
    }
}
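Under the hood, AbstractRoutingDataSource keeps the map of target data sources and consults determineCurrentLookupKey() on each connection request, falling back to the default target when the key is null. That routing logic reduces to the following plain-Java sketch (strings stand in for real DataSource objects; all names here are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class RoutingSketch {
    private final Map<Object, String> targets = new HashMap<>();
    private String defaultTarget;

    void setTargetDataSources(Map<Object, String> m) { targets.putAll(m); }
    void setDefaultTargetDataSource(String d) { defaultTarget = d; }

    // Mirrors the lookup done by AbstractRoutingDataSource: resolve the
    // per-thread key; fall back to the default when the key is null or unknown.
    String determineTarget(String lookupKey) {
        String ds = lookupKey == null ? null : targets.get(lookupKey);
        return ds != null ? ds : defaultTarget;
    }

    public static void main(String[] args) {
        RoutingSketch router = new RoutingSketch();
        Map<Object, String> m = new HashMap<>();
        m.put("slave1", "slave1-pool");
        router.setTargetDataSources(m);
        router.setDefaultTargetDataSource("master-pool");

        System.out.println(router.determineTarget("slave1")); // slave1-pool
        System.out.println(router.determineTarget(null));     // master-pool (no key pushed)
    }
}
```

This is why code without any @DataSource annotation transparently hits the master pool: DynamicContextUtils.peek() returns null, and the router serves the default target.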

7. Verification

Finally, we can easily verify that the Druid multi-data-source configuration works. By visiting http://localhost:8080/lk-auth/druid/, we can see the executed SQL statements and the statistics for each data source. Code: https://gitee.com/lhdxhl/lk-auth.git



Origin blog.csdn.net/lishangke/article/details/131298271