Spring Boot multi-data-source configuration: integrating MySQL and Hive dual data sources with MyBatis and the Druid connection pool

  • Introduction: During project development we rarely use only a single data source; most projects end up with several. I recently worked on a project that needed to pull part of its data from a big-data platform, while the rest of the business used MySQL, so two data sources were involved.
  • This example explains how to configure Hive, MySQL, and MyBatis with Spring Boot so that you do not need to write raw JDBC connection code, which is far more convenient and flexible.

Step 1: Add the required dependencies.

Maven dependencies, including MyBatis, Spring Boot, the big-data (Hadoop/Hive) connectors, the MySQL driver, Druid, and so on.

<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
<spring.version>4.3.9.RELEASE</spring.version>
</properties>
 
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.5.9.RELEASE</version>
</parent>
 
<dependencies>
 
<dependency>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-starter-web</artifactId>
</dependency>
 
<dependency>
	<groupId>org.springframework</groupId>
	<artifactId>spring-core</artifactId>
</dependency>
<dependency>
	<groupId>org.springframework</groupId>
	<artifactId>spring-context</artifactId>
</dependency>
 
 
<!-- MyBatis support -->
<dependency>
	<groupId>org.mybatis.spring.boot</groupId>
	<artifactId>mybatis-spring-boot-starter</artifactId>
	<version>1.3.2</version>
</dependency>
 
<!-- MySQL driver -->
<dependency>
	<groupId>mysql</groupId>
	<artifactId>mysql-connector-java</artifactId>
</dependency>
 
<!-- Druid database connection pool -->
<dependency>
	<groupId>com.alibaba</groupId>
	<artifactId>druid</artifactId>
	<version>1.0.29</version>
</dependency>
 
<!-- Spring bean management -->
<dependency>
	<groupId>org.springframework</groupId>
	<artifactId>spring-beans</artifactId>
</dependency>
 
<!-- Hadoop dependencies -->
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-common</artifactId>
	<version>2.6.0</version>
</dependency>
 
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-mapreduce-client-core</artifactId>
	<version>2.6.0</version>
</dependency>
 
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-mapreduce-client-common</artifactId>
	<version>2.6.0</version>
</dependency>
 
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-hdfs</artifactId>
	<version>2.6.0</version>
</dependency>
 
<dependency>
	<groupId>jdk.tools</groupId>
	<artifactId>jdk.tools</artifactId>
	<version>1.8</version>
	<scope>system</scope>
	<systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
</dependency>
 
<dependency>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-configuration-processor</artifactId>
	<optional>true</optional>
</dependency>
 
<!-- Hive dependencies -->
<dependency>
	<groupId>org.apache.hive</groupId>
	<artifactId>hive-jdbc</artifactId>
	<version>1.1.0</version>
	<exclusions>
		<exclusion>
			<groupId>org.eclipse.jetty.aggregate</groupId>
			<artifactId>*</artifactId>
		</exclusion>
	</exclusions>
</dependency>
<dependency>
	<groupId>org.apache.hive</groupId>
	<artifactId>hive-service</artifactId>
	<version>1.1.0</version>
</dependency>
 
</dependencies>	

Step 2: Configure the properties for both data sources. This can go in either a yml or a properties file; I use yml here:

spring:
  datasource:
    mysqlMain: # data source 1: MySQL configuration
      type: com.alibaba.druid.pool.DruidDataSource
      jdbc-url: jdbc:mysql://0.0.0.0:3306/heyufu?characterEncoding=UTF-8&useUnicode=true&serverTimezone=GMT%2B8
      username: root
      password: root
      driver-class-name: com.mysql.cj.jdbc.Driver
    hive: # data source 2: Hive configuration
      jdbc-url: jdbc:hive2://0.0.0.0:10000/iot
      username: hive
      password: hive
      driver-class-name: org.apache.hive.jdbc.HiveDriver
      type: com.alibaba.druid.pool.DruidDataSource
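For reference, the same settings can equally be expressed in application.properties form. A rough equivalent (the yml keys flattened into dotted properties; adjust hosts and credentials to your own environment) would be:

```properties
spring.datasource.mysqlMain.type=com.alibaba.druid.pool.DruidDataSource
spring.datasource.mysqlMain.jdbc-url=jdbc:mysql://0.0.0.0:3306/heyufu?characterEncoding=UTF-8&useUnicode=true&serverTimezone=GMT%2B8
spring.datasource.mysqlMain.username=root
spring.datasource.mysqlMain.password=root
spring.datasource.mysqlMain.driver-class-name=com.mysql.cj.jdbc.Driver

spring.datasource.hive.jdbc-url=jdbc:hive2://0.0.0.0:10000/iot
spring.datasource.hive.username=hive
spring.datasource.hive.password=hive
spring.datasource.hive.driver-class-name=org.apache.hive.jdbc.HiveDriver
spring.datasource.hive.type=com.alibaba.druid.pool.DruidDataSource
```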
    

Step 3: Create a public DataSourceProperties class

This class reads the data source configuration for both Hive and MySQL:

import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;

import java.util.Map;


@Data
@ConfigurationProperties(prefix = DataSourceProperties.DS, ignoreUnknownFields = false)
public class DataSourceProperties {

    final static String DS = "spring.datasource";

    private Map<String,String> mysqlMain;

    private Map<String,String> hive;

    

}

Step 4: Create a configuration class for each data source

MySQL configuration class:

import com.alibaba.druid.pool.DruidDataSource;
import com.xxxx.xxxx.Config.DataSourceCommonProperties;
import com.xxxx.xxxx.Config.DataSourceProperties;
import lombok.extern.log4j.Log4j2;
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.SqlSessionFactoryBean;
import org.mybatis.spring.SqlSessionTemplate;
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

import javax.sql.DataSource;
import java.sql.SQLException;


@Configuration
// scans the DAO package bound to this data source
@MapperScan(basePackages = "com.xxxx.xxxx.Dao.db1", sqlSessionFactoryRef = "db1SqlSessionFactory")
@Log4j2
@EnableConfigurationProperties({DataSourceProperties.class})
public class MysqlConfig {

    @Autowired
    private DataSourceProperties dataSourceProperties;

  
	// mark this as the primary data source
    @Primary
    @Bean("db1DataSource")
    public DataSource getDb1DataSource() throws Exception {
        DruidDataSource datasource = new DruidDataSource();
        // configure the data source; the map keys must match the yml entries exactly
        datasource.setUrl(dataSourceProperties.getMysqlMain().get("jdbc-url"));
        datasource.setUsername(dataSourceProperties.getMysqlMain().get("username"));
        datasource.setPassword(dataSourceProperties.getMysqlMain().get("password"));
        datasource.setDriverClassName(dataSourceProperties.getMysqlMain().get("driver-class-name"));

        return datasource;
    }
    }
	
	// create the SqlSessionFactory bean
    @Primary
    // the bean name must be unique; no two beans in a project may share a name
    @Bean("db1SqlSessionFactory")
    public SqlSessionFactory db1SqlSessionFactory(@Qualifier("db1DataSource") DataSource dataSource) throws Exception {
        SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
        bean.setDataSource(dataSource);
        // The mapper XML locations must be configured, otherwise MyBatis fails with a "no statement" error (that error can also mean the namespace in a mapper XML does not match the interface's package path)
        bean.setMapperLocations(new PathMatchingResourcePatternResolver().getResources("classpath*:Mapper/db1/*.xml"));
        return bean.getObject();
    }

	// create the SqlSessionTemplate bean
    @Primary
    @Bean("db1SqlSessionTemplate")
    public SqlSessionTemplate db1SqlSessionTemplate(@Qualifier("db1SqlSessionFactory") SqlSessionFactory sqlSessionFactory){
        return new SqlSessionTemplate(sqlSessionFactory);
    }

}

Hive configuration class:

import com.alibaba.druid.pool.DruidDataSource;
import com.xxxx.xxxx.Config.DataSourceCommonProperties;
import com.xxxx.xxxx.Config.DataSourceProperties;
import lombok.extern.log4j.Log4j2;
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.SqlSessionFactoryBean;
import org.mybatis.spring.SqlSessionTemplate;
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

import javax.sql.DataSource;
import java.sql.SQLException;


@Configuration
@MapperScan(basePackages = "com.xxxx.xxxx.Dao.db2", sqlSessionFactoryRef = "db2SqlSessionFactory")
@Log4j2
@EnableConfigurationProperties({DataSourceProperties.class})
public class HiveConfig {

    @Autowired
    private DataSourceProperties dataSourceProperties;

    @Bean("db2DataSource")
    public DataSource getDb2DataSource(){
        DruidDataSource datasource = new DruidDataSource();

        // configure the data source; the map keys must match the yml entries exactly
        datasource.setUrl(dataSourceProperties.getHive().get("jdbc-url"));
        datasource.setUsername(dataSourceProperties.getHive().get("username"));
        datasource.setPassword(dataSourceProperties.getHive().get("password"));
        datasource.setDriverClassName(dataSourceProperties.getHive().get("driver-class-name"));

        return datasource;
    }

    @Bean("db2SqlSessionFactory")
    public SqlSessionFactory db2SqlSessionFactory(@Qualifier("db2DataSource") DataSource dataSource) throws Exception {
        SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
        bean.setDataSource(dataSource);
        // The mapper XML locations must be configured, otherwise MyBatis fails with a "no statement" error (that error can also mean the namespace in a mapper XML does not match the interface's package path)
        // set the mapper.xml path; the classpath pattern must not contain spaces
        bean.setMapperLocations(new PathMatchingResourcePatternResolver().getResources("classpath*:Mapper/db2/*.xml"));
        return bean.getObject();
    }

    @Bean("db2SqlSessionTemplate")
    public SqlSessionTemplate db2SqlSessionTemplate(@Qualifier("db2SqlSessionFactory") SqlSessionFactory sqlSessionFactory){
        return new SqlSessionTemplate(sqlSessionFactory);
    }

}

Note: @Primary marks the primary data source. There can be only one primary data source in the entire project, otherwise Spring will report an error at startup.

Create the Mapper interfaces. These two interfaces live in the packages scanned by @MapperScan:

 

@Mapper
public interface HiveMapper{

        String selectList();
}
@Mapper
public interface MysqlMapper{

        String selectList();
}

The mapper.xml files are as follows:

HiveMapper.xml (placed under Mapper/db2, matching the Hive configuration):

<mapper namespace="com.xxxx.xxxx.Dao.db2.HiveMapper">
    <select id="selectList" resultType="string">
        select count(*) from table_name
    </select>
</mapper>

MysqlMapper.xml (placed under Mapper/db1):

<mapper namespace="com.xxxx.xxxx.Dao.db1.MysqlMapper">
    <select id="selectList" resultType="string">
        select count(*) from table_name
    </select>
</mapper>

 

Create the Service layer:

public interface HiveService{
    String selectList();
}
public interface MysqlService{
    String selectList();
}

ServiceImpl implementation class:

@Service
public class HiveServiceImpl implements HiveService{
    @Resource
    HiveMapper hiveMapper;

    @Override
    public String selectList(){
        return hiveMapper.selectList();
    }
}

@Service
public class MysqlServiceImpl implements MysqlService{
    @Resource
    MysqlMapper mysqlMapper;

    @Override
    public String selectList(){
        return mysqlMapper.selectList();
    }
}

Test class:

@Resource
HiveService hiveService;
@Resource
MysqlService mysqlService;

@Test
public void test1(){
    String a = hiveService.selectList();
    System.out.println(a);
}

@Test
public void test2(){
    String b = mysqlService.selectList();
    System.out.println(b);
}

And you're done!


Origin blog.csdn.net/qq_30631063/article/details/108796883