Our scenario is a live project in which customers need to export Excel documents with high concurrency. Because of the volume of data involved, the SQL queries take a relatively long time, and while they run the system is still serving real-time traffic. To avoid affecting normal operations, and to avoid table locks hurting concurrent throughput, we decided to introduce read/write splitting at the database level.
This example is built on a Spring Boot + MyBatis stack.

The configuration below uses application.properties; if you use the yml format, convert it accordingly.
```properties
# Connection pool
database.type=com.alibaba.druid.pool.DruidDataSource
# Writable database
database.d0.url=jdbc:mysql://10.10.59.10/test
database.d0.username=root
database.d0.password=root
database.d0.driver-class-name=com.mysql.jdbc.Driver
# Read-only database
database.d1.url=jdbc:mysql://10.10.59.11/test
database.d1.username=root
database.d1.password=root
database.d1.driver-class-name=com.mysql.jdbc.Driver
# Mapper file locations
mybatis.mapper-locations=classpath*:/mapper/*Mapper.xml
```
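For reference, here is a sketch of the same settings in application.yml form (the keys simply mirror the properties above; Spring Boot's relaxed binding treats the two formats equivalently):

```yaml
# Same configuration as application.properties, expressed as yml
database:
  type: com.alibaba.druid.pool.DruidDataSource
  d0:                                    # writable database
    url: jdbc:mysql://10.10.59.10/test
    username: root
    password: root
    driver-class-name: com.mysql.jdbc.Driver
  d1:                                    # read-only database
    url: jdbc:mysql://10.10.59.11/test
    username: root
    password: root
    driver-class-name: com.mysql.jdbc.Driver
mybatis:
  mapper-locations: classpath*:/mapper/*Mapper.xml
```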
First, we prepare a few basic classes.
ReadWriteSplitRoutingDataSource extends AbstractRoutingDataSource; this class manages the routing between our multiple data sources.
```java
/**
 * Data source router
 */
public class ReadWriteSplitRoutingDataSource extends AbstractRoutingDataSource {
    @Override
    protected Object determineCurrentLookupKey() {
        return DataBaseContextHolder.getDataBaseType();
    }
}
```
DataBaseContextHolder is a custom class that uses a ThreadLocal to track which data source the current thread should use, so that concurrent threads cannot interfere with each other's data source choice.
```java
/**
 * Holds the data source type for the current thread
 */
public class DataBaseContextHolder {

    // Distinguish the master data source (D0) from the read-only one (D1)
    public enum DataBaseType {
        D0, D1
    }

    // Thread-local variable: each thread keeps its own value
    private static final ThreadLocal<DataBaseType> contextHolder = new ThreadLocal<>();

    // Bind a data source type to the current thread
    public static void setDataBaseType(DataBaseType dataBaseType) {
        contextHolder.set(dataBaseType);
    }

    // Read the current thread's data source type, defaulting to the writable D0
    public static DataBaseType getDataBaseType() {
        return contextHolder.get() == null ? DataBaseType.D0 : contextHolder.get();
    }

    // Clear the current thread's data source type
    public static void clearDataBaseType() {
        contextHolder.remove();
    }
}
```
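The ThreadLocal is what keeps routing correct under concurrency. The following standalone sketch (the class and method names are illustrative, not part of the project) shows that a value set in one thread is invisible to another, and that a thread which never set anything falls back to D0:

```java
// Standalone sketch of the ThreadLocal pattern DataBaseContextHolder relies on.
public class ThreadLocalIsolationDemo {
    public enum DataBaseType { D0, D1 }

    private static final ThreadLocal<DataBaseType> contextHolder = new ThreadLocal<>();

    public static void set(DataBaseType type) {
        contextHolder.set(type);
    }

    // Fall back to the writable D0 when nothing was set for this thread
    public static DataBaseType get() {
        return contextHolder.get() == null ? DataBaseType.D0 : contextHolder.get();
    }

    public static void clear() {
        contextHolder.remove();
    }

    public static void main(String[] args) throws InterruptedException {
        Thread reader = new Thread(() -> {
            set(DataBaseType.D1);                    // this thread routes to the read-only DB
            System.out.println("reader: " + get()); // prints "reader: D1"
            clear();                                // always clear: pooled threads get reused
        });
        reader.start();
        reader.join();
        // The main thread never called set(), so it still resolves to D0
        System.out.println("main: " + get());       // prints "main: D0"
    }
}
```

Clearing the value in a finally block (as the aspect below does) matters because application servers reuse threads from a pool; a stale D1 value left behind would silently route a later write request to the read-only database.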
With those in place, we can configure the data sources.
```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import javax.sql.DataSource;
import java.util.HashMap;
import java.util.Map;

/**
 * Data source configuration
 */
@Configuration
public class DataSourceConfiguration {

    private static final Logger LOGGER = LoggerFactory.getLogger(DataSourceConfiguration.class);

    // Connection pool type
    @Value("${database.type}")
    private Class<? extends DataSource> dataSourceType;

    /**
     * Writable data source
     *
     * @return d0DataSource
     */
    @Bean(name = "d0DataSource")
    @ConfigurationProperties(prefix = "database.d0")
    public DataSource d0DataSource() {
        DataSource d0DataSource = DataSourceBuilder.create().type(dataSourceType).build();
        LOGGER.info("======== D0: {} ========", d0DataSource);
        return d0DataSource;
    }

    /**
     * Read-only data source
     *
     * @return d1DataSource
     */
    @Bean(name = "d1DataSource")
    @ConfigurationProperties(prefix = "database.d1")
    public DataSource d1DataSource() {
        DataSource d1DataSource = DataSourceBuilder.create().type(dataSourceType).build();
        LOGGER.info("======== D1: {} ========", d1DataSource);
        return d1DataSource;
    }

    /**
     * Combine the two data sources. @Autowired is not used here, because it
     * would be ambiguous; @Qualifier selects each bean by name instead.
     *
     * @param d0DataSource
     * @param d1DataSource
     * @return myRoutingDataSource
     */
    @Bean
    public DataSource myRoutingDataSource(@Qualifier("d0DataSource") DataSource d0DataSource,
                                          @Qualifier("d1DataSource") DataSource d1DataSource) {
        Map<Object, Object> targetDataSources = new HashMap<>();
        targetDataSources.put(DataBaseContextHolder.DataBaseType.D0, d0DataSource);
        targetDataSources.put(DataBaseContextHolder.DataBaseType.D1, d1DataSource);
        ReadWriteSplitRoutingDataSource myRoutingDataSource = new ReadWriteSplitRoutingDataSource();
        // Set the default data source
        myRoutingDataSource.setDefaultTargetDataSource(d0DataSource);
        // Register both data sources
        myRoutingDataSource.setTargetDataSources(targetDataSources);
        return myRoutingDataSource;
    }
}
```
With the data sources configured, the next step is to configure MyBatis.
```java
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.SqlSessionFactoryBean;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

import javax.annotation.Resource;
import javax.sql.DataSource;

/**
 * MyBatis configuration: bind the routing data source to the SqlSessionFactory
 */
// Enable transaction management
@EnableTransactionManagement
@Configuration
public class MybatisConfiguration {

    // The routing data source wrapping both databases
    @Resource(name = "myRoutingDataSource")
    private DataSource myRoutingDataSource;

    // Mapper file locations
    @Value("${mybatis.mapper-locations}")
    private String mybatisLocations;

    @Bean
    public SqlSessionFactory sqlSessionFactory() throws Exception {
        SqlSessionFactoryBean sqlSessionFactoryBean = new SqlSessionFactoryBean();
        // Use the routing data source
        sqlSessionFactoryBean.setDataSource(myRoutingDataSource);
        // Set the mapper file path
        sqlSessionFactoryBean.setMapperLocations(
                new PathMatchingResourcePatternResolver().getResources(mybatisLocations));
        return sqlSessionFactoryBean.getObject();
    }

    @Bean
    public PlatformTransactionManager platformTransactionManager() {
        // Transaction manager backed by the routing data source
        return new DataSourceTransactionManager(myRoutingDataSource);
    }
}
```
Next, configure the annotation and the aspect. First, define a custom annotation:
```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// The annotation can be applied to methods and types
@Target({ElementType.METHOD, ElementType.TYPE})
// Retained at runtime so the aspect can detect it
@Retention(RetentionPolicy.RUNTIME)
public @interface ReadOnlyDB {
}
```
Then write an aspect for this annotation. This aspect is special in that it needs high priority: the class implements Ordered and returns the highest precedence, so the data source is switched before any other advice (such as transaction management) acquires a connection.
```java
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Pointcut;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.core.Ordered;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class ReadOnlyConnectionInterceptor implements Ordered {

    public static final Logger LOGGER = LoggerFactory.getLogger(ReadOnlyConnectionInterceptor.class);

    /**
     * Pointcut: any method annotated with @ReadOnlyDB
     */
    @Pointcut("@annotation(com.beyondsoft.common.datasource.ReadOnlyDB)")
    public void msPointCut() {
    }

    // @Around advice wraps the whole method execution, from entry to exit
    @Around("msPointCut()")
    public Object proceed(ProceedingJoinPoint proceedingJoinPoint) throws Throwable {
        try {
            LOGGER.info("---------------switch to database D1---------------");
            DataBaseContextHolder.setDataBaseType(DataBaseContextHolder.DataBaseType.D1);
            // Execute the target method
            return proceedingJoinPoint.proceed();
        } finally {
            DataBaseContextHolder.clearDataBaseType();
            LOGGER.info("---------------end database D1---------------");
        }
    }

    @Override
    public int getOrder() {
        // Highest priority
        return Ordered.HIGHEST_PRECEDENCE;
    }
}
```
To use it, simply put the @ReadOnlyDB annotation we just defined on any read-only service method.
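A sketch of what that looks like in practice (UserService, UserMapper, User and their methods are hypothetical names for illustration, not from the original project):

```java
import org.springframework.stereotype.Service;

import javax.annotation.Resource;
import java.util.List;

// Hypothetical service: UserMapper, User and their methods are placeholder names.
@Service
public class UserService {

    @Resource
    private UserMapper userMapper;

    // The aspect sees @ReadOnlyDB and routes this call to the read-only D1 source
    @ReadOnlyDB
    public List<User> listUsers() {
        return userMapper.selectAll();
    }

    // No annotation: runs against the default writable D0 source
    public void addUser(User user) {
        userMapper.insert(user);
    }
}
```

Because routing is per method, write operations on the same service stay on D0 by default; only methods explicitly marked @ReadOnlyDB go to the replica.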
During startup, the logs show both data sources being loaded successfully. When an @ReadOnlyDB method executes, the aspect's log lines confirm that the request was routed to the read-only D1 data source.
If you run into any difficulty, or have comments, please leave a comment below; I will see it and reply. Thank you.