Spring Boot Framework - MyBatis topic

One, History of MyBatis

  MyBatis was originally an Apache open source project named iBatis. In June 2010 the project was migrated from the Apache Software Foundation to Google Code, and along with the team's move to Google Code, iBatis 3.x was officially renamed MyBatis. In November 2013 the code was migrated to GitHub.
  The name iBATIS comes from combining the words "internet" and "abatis"; it is a Java-based persistence framework. The iBATIS persistence framework includes SQL Maps and Data Access Objects (DAO).

Two, Integrating MyBatis with Spring Boot

1, Add the dependencies

<dependency>
    <groupId>org.mybatis.spring.boot</groupId>
    <artifactId>mybatis-spring-boot-starter</artifactId>
    <version>2.0.0</version>
</dependency>

<dependency>
    <groupId>com.github.pagehelper</groupId>
    <artifactId>pagehelper</artifactId>
    <version>5.1.0</version>
</dependency>

<dependency>
    <groupId>com.github.pagehelper</groupId>
    <artifactId>pagehelper-spring-boot-starter</artifactId>
    <version>1.2.3</version>
</dependency>

2, Configuration file

spring:
  datasource:
    type: com.alibaba.druid.pool.DruidDataSource        # data source type used by the application
    driver-class-name: oracle.jdbc.driver.OracleDriver  # Oracle JDBC driver class
    url: jdbc:oracle:thin:@localhost:1521:ORCL           # database connection URL
    username: "plat_user"                                # database username
    password: 123
    druid:                                               # connection pool configuration
      min-idle: 5           # minimum number of idle connections kept in the pool
      initial-size: 5       # number of connections created at initialization
      max-total: 5          # maximum number of connections
      max-wait-millis: 200  # maximum time (ms) to wait when acquiring a connection

mybatis:
  type-aliases-package: com.example.demo.dao.mapper                  # package whose classes get type aliases
  mapper-locations: classpath:com/example/demo/dao/config/**/*.xml   # location of all mapper XML files
  configuration:
    log-impl: org.apache.ibatis.logging.log4j2.Log4j2Impl            # MyBatis logging implementation

pagehelper:
  reasonable: false
  supportMethodsArguments: true
  params: pageNum=start;pageSize=limit;

3, Annotations

@SpringBootApplication
@EnableDiscoveryClient
@MapperScan("com.example.demo.dao.mapper")   // scan this package for MyBatis mapper interfaces
public class Application
{
    public static void main(String[] args)
    {
        SpringApplication.run(Application.class, args);
    }
}
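
  For reference, a minimal mapper interface that the @MapperScan configuration above would pick up could look like the sketch below; UserMapper, User, and the statement names are illustrative and not part of the original project.

package com.example.demo.dao.mapper;

import java.util.List;
import org.apache.ibatis.annotations.Param;

// Minimal illustrative entity; a real project would map it to a database table.
class User {
    private Long id;
    private String name;
    // getters and setters omitted for brevity
}

public interface UserMapper {

    // The statement ids in the mapper XML files under
    // classpath:com/example/demo/dao/config/** must match these method names.
    List<User> selectAll();

    User selectById(@Param("id") Long id);
}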

Three, MyBatis in depth

1, Pagination plugin: PageHelper

  Pay attention to the version numbers when using PageHelper. This is important enough to repeat three times: if the versions of pagehelper and pagehelper-spring-boot-starter do not match, the pagination plugin will simply not take effect.

a) Add the dependencies

<dependency>
    <groupId>com.github.pagehelper</groupId>
    <artifactId>pagehelper</artifactId>
    <version>5.1.0</version>
</dependency>

<dependency>
    <groupId>com.github.pagehelper</groupId>
    <artifactId>pagehelper-spring-boot-starter</artifactId>
    <version>1.2.3</version>
</dependency>

b) Implementation

  PageHelper can be configured in two ways: through the configuration file or declaratively in code.

  Configuration file approach (YAML):

pagehelper:
    reasonable: false
    supportMethodsArguments: true
    params: pageNum=start;pageSize=limit;
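
  When supportMethodsArguments is true, PageHelper can also read the paging values directly from a mapper method's arguments according to the params mapping above (pageNum from a parameter named start, pageSize from one named limit). A minimal sketch, reusing the illustrative User entity from the earlier mapper sketch:

import java.util.List;
import org.apache.ibatis.annotations.Param;

public interface UserMapper {
    // With supportMethodsArguments=true and params=pageNum=start;pageSize=limit,
    // PageHelper reads the page number from "start" and the page size from "limit"
    // and paginates this query automatically.
    List<User> selectByPage(@Param("start") int start, @Param("limit") int limit);
}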

  Declarative approach in code:

@Bean
public PageHelper pageHelper() {
    // register PageHelper and set its paging properties
    PageHelper pageHelper = new PageHelper();
    Properties p = new Properties();
    p.setProperty("offsetAsPageNum", "true");    // treat the RowBounds offset as a page number
    p.setProperty("rowBoundsWithCount", "true"); // run a count query when paging with RowBounds
    p.setProperty("reasonable", "true");         // sanitize out-of-range page numbers
    pageHelper.setProperties(p);
    return pageHelper;
}
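
  For completeness, a typical way to trigger pagination is to call PageHelper.startPage immediately before the query to be paginated and wrap the result in a PageInfo; the service, mapper, and entity names below are illustrative.

import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import com.github.pagehelper.PageHelper;
import com.github.pagehelper.PageInfo;

@Service
public class UserService {

    @Autowired
    private UserMapper userMapper;   // illustrative mapper from the sketch above

    public PageInfo<User> listUsers(int pageNum, int pageSize) {
        // startPage must be called right before the query it is meant to paginate
        PageHelper.startPage(pageNum, pageSize);
        List<User> users = userMapper.selectAll();
        // PageInfo wraps the page data together with the total count, page count, etc.
        return new PageInfo<>(users);
    }
}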

2, Multiple data source support

a) Add the dependencies

  The dependencies are the same as for a single data source; see "Two, Integrating MyBatis with Spring Boot -> 1, Add the dependencies".

b) Configuration file

spring:
  datasource:
    example1:
      password: root
      url: jdbc:mysql://127.0.0.1:3306/master?useUnicode=true&characterEncoding=UTF-8
      driver-class-name: com.mysql.jdbc.Driver
      username: root
      type: com.zaxxer.hikari.HikariDataSource
    example2:
      password: root
      url: jdbc:mysql://127.0.0.1:3306/slave1?useUnicode=true&characterEncoding=UTF-8
      idle-timeout: 20000
      driver-class-name: com.mysql.jdbc.Driver
      username: root
      type: com.zaxxer.hikari.HikariDataSource

c) Code implementation

@Bean ( "the dataSource") // this object into Spring container 
@Primary            // indicating that the data source is a default data source 
@ConfigurationProperties (prefix = "spring.datasource.example1") // read in application.properties configuration parameters mapped to a parameter indicative of prefix objects prefix 
public the dataSource getDataSource () {
     return DataSourceBuilder.create () Build ();. 
} 

@Bean (name = "SqlSessionFactory") // indicating that the data source is a default data source 
! Primary                          // @ Qualifier represent Find Spring container object named test1DataSource 
public SqlSessionFactory SqlSessionFactory (@ Qualifier ( "dataSource" ) the dataSource the dataSource)
         throws{Exception 
    the SqlSessionFactoryBean the bean = new new the SqlSessionFactoryBean (); 
    bean.setDataSource (DataSource); 
    bean.setMapperLocations ( new new PathMatchingResourcePatternResolver () getResources (. "CLASSPATH *:. Mapping / example1 / xml *")); // set of xml where mybatis position 
    return bean.getObject (); 
} 

@Bean ( "dataSourceExample2") // this object into the container Spring 
@ConfigurationProperties (prefix = "spring.datasource.example2") // read the configuration parameters of application.properties prefix object is mapped to a parameter indicative of a prefix 
public the DataSource getDataSource () {
     return  DataSourceBuilder.create () Build ().;
} 

@Bean (name= "SqlSessionFactoryExample2") // indicating that the data source is the default data source
 // @Qualifier represents Spring container name lookup object test1DataSource the 
public a SqlSessionFactory SqlSessionFactory (@Qualifier ( "dataSourceExample2" ) the DataSource DataSource)
         throws Exception { 
    the SqlSessionFactoryBean the bean = new new the SqlSessionFactoryBean (); 
    bean.setDataSource (the DataSource); 
    bean.setMapperLocations ( new new PathMatchingResourcePatternResolver () getResources (. "the CLASSPATH *:. Mapping / examplew / * xml")); // set mybatis location of xml 
    return bean.getObject ( ); 
}
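
  To bind each group of mapper interfaces to the right SqlSessionFactory, a common complement to the beans above is one @MapperScan per data source with sqlSessionFactoryRef; a minimal sketch, with illustrative package names.

import org.mybatis.spring.annotation.MapperScan;
import org.springframework.context.annotation.Configuration;

// Illustrative configuration classes; in a real project each would live in its own file.
@Configuration
@MapperScan(basePackages = "com.example.demo.dao.example1",
            sqlSessionFactoryRef = "sqlSessionFactory")
class Example1MapperConfig {
}

@Configuration
@MapperScan(basePackages = "com.example.demo.dao.example2",
            sqlSessionFactoryRef = "sqlSessionFactoryExample2")
class Example2MapperConfig {
}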

3, Centralized log configuration (Spring Cloud Config)

a) Project configuration file

# Log file name
logging:
  file: ${spring.application.name}
  # Config center address for the log4j2 configuration file
  config: http://172.17.30.111:9001/plat-config/test/develop/log4j2.yml?resolvePlaceholders=false&type=.yml

b) Log configuration file

Configuration:
  status: error
  name: YAMLConfig
  properties:
    property:
     - name: log_file
       value: "${sys:LOG_FILE}"
     - name: log_pattern
       value: "%d{yyyy-MM-ddHH:mm:ss,SSS z} %t [${log_file},%X{X-B3-TraceId},%X{X-B3-SpanId},%X{X-B3-ParentSpanId}] %-5level %class{36} %L %M - %msg%xEx%n"
     - name: base_path
       value: "/java/apache-tomcat-8.0.36/webapps"
     - name: file_name
       value: "${base_path}/log/${log_file}.log"
     - name: rolling_file_name
       value: "${base_path}/backup/${log_file}-%d{yyyy-MM-dd}-%i.log.gz"
     - name: every_file_size
       value: 10M
  Appenders:
    Console:
      name: Console
      target: SYSTEM_OUT
      PatternLayout:
        Pattern: ${log_pattern}
      ThresholdFilter:
        level: trace
        onMatch: ACCEPT
        onMismatch: DENY
    RollingFile:
      name: RollingFile
      fileName: ${file_name}
      filePattern: ${rolling_file_name}
      PatternLayout:
        Pattern: ${log_pattern}
      Policies:
        SizeBasedTriggeringPolicy:
          size: ${every_file_size}
    Kafka: # output to Kafka
      name: Kafka
      topic:  app_log
      # For the Kafka appender, ignoreExceptions must be set to false, otherwise Failover cannot be triggered
      ignoreExceptions: false
      retryIntervalSeconds: 600
      PatternLayout:
        Pattern: ${log_pattern}
      Property:
        - name: bootstrap.servers
          value: 172.17.30.143:9092
        # The Kafka client default for max.block.ms is 60000 ms: when Kafka is down, a write attempt
        # takes a full minute to throw the exception that triggers Failover. Under heavy load the
        # log4j2 queue then fills up quickly and logging starts blocking, which seriously hurts the
        # main service's response time. So set this short enough and make the queue long enough.
        - name: max.block.ms
          value: 20000
    Failover: # the Failover appender decouples logging from Kafka: when Kafka crashes, Failover is triggered and logs can still be written locally
      name: Failover
      Primary: Kafka
      # retryIntervalSeconds controls how long to wait after a failure before retrying the primary appender, e.g. the 10 minutes (600 s) configured above
      Failovers:
        AppenderRef:
          - ref: Console
  Loggers:
    Logger:
     - name: org.apache.http
       level: INFO
       additivity: false
       AppenderRef:
        - ref: Console
        - ref: RollingFile
        - ref: Failover
     - name: com.netflix.discovery
       level: ERROR
       additivity: false
       AppenderRef:
        - ref: Console
        - ref: RollingFile
        - ref: Failover
    Root:
      level: DEBUG
      AppenderRef:
       - ref: Console
       - ref: RollingFile
       - ref: Failover

c) Add the dependencies

<dependency>  <!-- add the log4j2 dependency -->
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-log4j2</artifactId>
</dependency>

<dependency>  <!-- add the ability to parse the log4j2.yml file -->
    <groupId>com.fasterxml.jackson.dataformat</groupId>
    <artifactId>jackson-dataformat-yaml</artifactId>
</dependency>

Note: the pom file must include the YAML support dependency so that the log4j2.yml file can be parsed.

 

Appendix: "MyBatis Chinese official website address"

Origin www.cnblogs.com/pinenut/p/11884419.html