Using macros to add an ORM-style insert mode to scala-sql

Similar to Hibernate or JPA: you define a case class, for example Person, instantiate it, and then call dataSource.save(obj) or dataSource.saveWithSchema("person1")(obj) directly to insert the data.

Usage example

Define a data source, then create a person table; the table structure is as follows:


[Image: person table schema (person.schema)]
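The schema screenshot is not available in this copy. As a stand-in, here is a guessed DDL for the person table, held in a Scala string and reconstructed from the column names that appear in the INSERT log below; the column types and lengths are assumptions:

```scala
// Hypothetical DDL for the person table, inferred from the columns
// (name, age, first_hobby, type_name) in the INSERT log; the real
// types and constraints of the original schema are unknown.
val personDdl: String =
  """CREATE TABLE person (
    |  name        VARCHAR(64),
    |  age         INT,
    |  first_hobby VARCHAR(64),
    |  type_name   VARCHAR(64)
    |)""".stripMargin
```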

Insert rows from the program:

import com.xiaomi.ad.common.db.DruidDataSourceInitializer._
import wangzx.scala_commons.sql._

object ScalaSqlSpec {
    case class Person(name: String, age: Int, firstHobby: String, typeName: String)

    def main(args: Array[String]): Unit = {
        val p1 = Person("Walter White", 50, "cook", "White")
        val p2 = Person("Jesse Pinkman", 26, "party", "Pinkman")

        dataSource.save(p1)
        dataSource.saveWithSchema("person")(p2)
    }
}

Run it; the inserts succeed, and the log is as follows:

05-31 15:09:53 627 main INFO - {dataSource-1} inited
05-31 15:09:53 675 main DEBUG - SQL Preparing: INSERT INTO person (name,age,first_hobby,type_name) VALUES ( ?,?,?,? ) args: List(JdbcValue(Walter White), JdbcValue(50), JdbcValue(cook), JdbcValue(White))
05-31 15:09:53 696 main DEBUG - SQL result: 1
05-31 15:09:53 697 main DEBUG - SQL Preparing: INSERT INTO person (name,age,first_hobby,type_name) VALUES ( ?,?,?,? ) args: List(JdbcValue(Jesse Pinkman), JdbcValue(26), JdbcValue(party), JdbcValue(Pinkman))
05-31 15:09:53 717 main DEBUG - SQL result: 1

Implementation

1. save and saveWithSchema in RichDataSource

RichDataSource is effectively an enhanced DataSource: it adds extra functionality and is brought in via an implicit conversion, so the methods defined on RichDataSource can be called on an ordinary DataSource.

def save[T: OrmInsert](dto: T): Int = withConnection(_.save(dto))

def saveWithSchema[T: OrmInsert](schema: String)(dto: T): Int = withConnection(_.saveWithSchema(schema, dto))
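The enrichment pattern above can be sketched in miniature; PlainSource and describe below are hypothetical stand-ins, not the library's API:

```scala
// "Enrich my library" sketch: an implicit class wraps a plain type,
// so its methods become callable directly on values of that type,
// the same mechanism that adds save/saveWithSchema to a DataSource.
class PlainSource(val url: String)

object Enrichment {
  implicit class RichSource(ds: PlainSource) {
    def describe: String = s"source at ${ds.url}"
  }
}

import Enrichment._
// The implicit conversion kicks in here: PlainSource has no `describe`.
val desc = new PlainSource("jdbc:h2:mem:demo").describe
// desc == "source at jdbc:h2:mem:demo"
```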

2. save and saveWithSchema in RichConnection

Similarly, withConnection obtains a Connection, which we enhance via RichConnection so that RichConnection's methods can be called on it:

def save[T: OrmInsert](dto: T): Int = {
        val (sql, sqlWithArgs) = implicitly[OrmInsert[T]].from(dto, None)
        val prepared = conn.prepareStatement(sql, Statement.NO_GENERATED_KEYS)

        try {
            if (sqlWithArgs != null) setStatementArgs(prepared, sqlWithArgs)

            LOG.debug("SQL Preparing: {} args: {}", Seq(sql, sqlWithArgs): _*)

            val result = prepared.executeUpdate()

            LOG.debug("SQL result: {}", result)

            result
        }
        finally {
            prepared.close()
        }
    }

    def saveWithSchema[T: OrmInsert](schemaName: String, dto: T): Int = {
        val (sql, sqlWithArgs) = implicitly[OrmInsert[T]].from(dto, Some(schemaName))
        val prepared = conn.prepareStatement(sql, Statement.NO_GENERATED_KEYS)

        try {
            if (sqlWithArgs != null) setStatementArgs(prepared, sqlWithArgs)

            LOG.debug("SQL Preparing: {} args: {}", Seq(sql, sqlWithArgs): _*)

            val result = prepared.executeUpdate()

            LOG.debug("SQL result: {}", result)

            result
        }
        finally {
            prepared.close()
        }
    }
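The withConnection helper used above follows the loan pattern: acquire a resource, lend it to the caller's function, and always release it afterwards. A generic sketch (hypothetical, not the library's actual implementation):

```scala
// Loan-pattern sketch, analogous to withConnection: the resource is
// closed whether or not the borrowed function throws.
def withResource[R <: AutoCloseable, T](acquire: => R)(f: R => T): T = {
  val res = acquire
  try f(res)
  finally res.close()
}

// Dummy resource standing in for a JDBC Connection.
class FakeConn extends AutoCloseable {
  var closed = false
  def save(dto: String): Int = 1 // pretend one row was inserted
  def close(): Unit = closed = true
}

val conn = new FakeConn
val rows = withResource(conn)(_.save("p1"))
// rows == 1, and conn.closed == true afterwards
```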

3. A closer look at the save method in RichConnection

save and saveWithSchema are much the same, so we focus our analysis on save.

The key code is the following line; the rest is similar to code we have already seen, with no major changes:

val (sql, sqlWithArgs) = implicitly[OrmInsert[T]].from(dto, None)

The object we pass in has type T. Here Scala's context-bound feature is used to bind OrmInsert[T] to T; context bounds were introduced in a previous article.

trait OrmInsert[C] {
        def from(c: C, schemaName: Option[String]): (String, Seq[JdbcValue[_]])

        def build(c: C): (String, List[Token], Seq[JdbcValue[_]])
}
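As a reminder of how context bounds work, here is a minimal sketch with a hypothetical Show type class; [T: Show] desugars to an extra implicit parameter that implicitly can retrieve:

```scala
// Context-bound sketch: `def describe[T: Show](t: T)` is sugar for
// `def describe[T](t: T)(implicit ev: Show[T])`.
trait Show[T] { def show(t: T): String }

implicit val intShow: Show[Int] = new Show[Int] {
  def show(t: Int): String = s"Int($t)"
}

// implicitly[Show[T]] fetches the implicit instance bound by T: Show,
// just as save does with implicitly[OrmInsert[T]].
def describe[T: Show](t: T): String = implicitly[Show[T]].show(t)

val s = describe(42)
// s == "Int(42)"
```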

With OrmInsert[T] bound to T, its from method parses the incoming object c and produces the INSERT SQL to execute together with its parameters as a Seq[JdbcValue[_]]. For example:

For the object p1 in the program above, the generated SQL and parameter list are:

INSERT INTO person (name,age,first_hobby,type_name) VALUES ( ?,?,?,? ) 

args: List(JdbcValue(Walter White), JdbcValue(50), JdbcValue(cook), JdbcValue(White))

implicitly[OrmInsert[T]] fetches the implicit OrmInsert[T] value in scope. Since OrmInsert is a trait, what we actually obtain is an implementation class of it; how is that class produced?

4. Generating the OrmInsert[T] implementation class at compile time with a macro

First, an implicit OrmInsert[T] value is exposed:

implicit def materialize[C]: OrmInsert[C] = macro converterToMapMacro[C]

At compile time this invokes the converterToMapMacro macro to generate the code; the macro is implemented as follows:

def converterToMapMacro[C: c.WeakTypeTag](c: whitebox.Context): c.Tree = {
            import c.universe._
            val tpe = weakTypeOf[C]

            val fields = tpe.decls.collectFirst {
                case m: MethodSymbol if m.isPrimaryConstructor => m
            }.get.paramLists.head

            val (names, jdbcValues) = fields.map { field =>
                val name = field.name.toTermName
                val decoded = name.decodedName.toString

                val value = q"JdbcValue.wrap(t.$name)"
                (q"Token($decoded)", value)
            }.unzip

            val schemaName = TermName(tpe.typeSymbol.name.toString).toString.toLowerCase()

            val tree =
                q"""
                     new AbstractInsert[$tpe] {
                        def build(t: $tpe) = ($schemaName, List(..$names), List(..$jdbcValues))
                     }
            """
            tree
}

The code above ultimately generates a tree, i.e. an AST. As long as we understand that it dynamically generates code, we can see that the returned code is an implementation class of AbstractInsert, where AbstractInsert is an abstract class implementing OrmInsert.

val tpe = weakTypeOf[C] obtains the Type of C; this tpe carries all of C's type information.

With tpe in hand, the operations above extract all of the case class's field names and types from its primary constructor. An instance of type C is later passed to OrmInsert's build method, i.e. def build(t: T); in the macro code, type T is written as $tpe. We can then read each field's value on the current instance t via t.fieldName. Each value needs to be converted to a JdbcValue, which is why the code above uses q"JdbcValue.wrap(t.$name)". The field name is wrapped in a Token, mainly so that camelCase names can later be converted to underscore form.
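The camelCase-to-underscore conversion that Token performs can be sketched like this (a hypothetical implementation; the library's Token may differ):

```scala
// Hypothetical Token sketch: keep the original field name and derive
// the snake_case column name used when building the INSERT statement.
case class Token(name: String) {
  def underscoreName: String =
    name.replaceAll("([a-z0-9])([A-Z])", "$1_$2").toLowerCase
}

val col = Token("firstHobby").underscoreName
// col == "first_hobby"
```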

The generated implementation class looks like this:

new AbstractInsert[$tpe] {
       def build(t: $tpe) = ($schemaName, List(..$names), List(..$jdbcValues))
}
  • $schemaName: the lowercase case-class name, used as the database table name when none is specified explicitly
  • $names: the field names wrapped as Token(fieldName), used later to build the column list of the INSERT statement
  • $jdbcValues: since the instance t is passed into build, each field value is read as t.fieldName; these are the concrete values later bound when inserting into the table

5. Splicing the INSERT SQL in the abstract class AbstractInsert

abstract class AbstractInsert[C] extends OrmInsert[C] {

    def from(c: C, schemaName: Option[String]): (String, Seq[JdbcValue[_]]) = {
        val (schema, tokenNames, args) = build(c)
        val useSchema = schemaName match {
            case Some(value) if value != null ⇒ value
            case _ ⇒ schema // also covers Some(null), avoiding a MatchError
        }
        val sqlFields = tokenNames.map(_.underscoreName).mkString(",")
        val interrogation = tokenNames.indices.map(_ ⇒ "?").mkString(",")
        val sql = s"INSERT INTO $useSchema ($sqlFields) VALUES ( $interrogation )"

        (sql, args)
    }
}

The from() method takes an instance c and the database table name schemaName; if no name is specified, the lowercase case-class name is used. It calls the build() method generated by the macro at compile time to obtain the table schema, the field list tokenNames, and the concrete data to insert, Seq[JdbcValue[_]]. It then splices the SQL together; the splicing code is fairly simple and is not described in detail here.

At this point the SQL has been assembled, with each value to insert represented by a ? placeholder; the method returns the spliced SQL string together with the values to bind.
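To make the whole flow concrete, here is a hand-written (non-macro) sketch of what the generated build plus from would produce for Person; JdbcValue and Token here are simplified stand-ins for the library's types:

```scala
// Simplified stand-ins for the library's JdbcValue and Token.
case class JdbcValue[T](value: T)
case class Token(name: String) {
  def underscoreName: String =
    name.replaceAll("([a-z0-9])([A-Z])", "$1_$2").toLowerCase
}
case class Person(name: String, age: Int, firstHobby: String, typeName: String)

// What the macro-generated build(t) returns: (schema, tokens, values).
def build(t: Person): (String, List[Token], Seq[JdbcValue[_]]) =
  ("person",
   List(Token("name"), Token("age"), Token("firstHobby"), Token("typeName")),
   Seq(JdbcValue(t.name), JdbcValue(t.age), JdbcValue(t.firstHobby), JdbcValue(t.typeName)))

// The splicing performed by AbstractInsert.from.
def from(t: Person, schemaName: Option[String]): (String, Seq[JdbcValue[_]]) = {
  val (schema, tokens, args) = build(t)
  val useSchema = schemaName.getOrElse(schema)
  val sqlFields = tokens.map(_.underscoreName).mkString(",")
  val interrogation = tokens.indices.map(_ => "?").mkString(",")
  (s"INSERT INTO $useSchema ($sqlFields) VALUES ( $interrogation )", args)
}

val (sql, args) = from(Person("Walter White", 50, "cook", "White"), None)
// sql == "INSERT INTO person (name,age,first_hobby,type_name) VALUES ( ?,?,?,? )"
```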

6. Finally, executing the SQL

def save[T: OrmInsert](dto: T): Int = {
        val (sql, sqlWithArgs) = implicitly[OrmInsert[T]].from(dto, None)
        val prepared = conn.prepareStatement(sql, Statement.NO_GENERATED_KEYS)

        try {
            if (sqlWithArgs != null) setStatementArgs(prepared, sqlWithArgs)

            LOG.debug("SQL Preparing: {} args: {}", Seq(sql, sqlWithArgs): _*)

            val result = prepared.executeUpdate()

            LOG.debug("SQL result: {}", result)

            result
        }
        finally {
            prepared.close()
        }
    }

Finally, the PreparedStatement writes the data into the database, and the whole process is complete.

Summary

The most important part of implementing this object-oriented insert mode in scala-sql is extracting each field and its value from the object, when we do not know in advance how many fields the object will have. Following the usual approach, this could also be done at run time via reflection, but that is not elegant.
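For contrast, the runtime route mentioned above can be sketched with Scala's Product interface (productElementNames requires Scala 2.13+); it works, but field access is untyped and happens at run time rather than compile time:

```scala
// Runtime alternative: every case class is a Product, so field names
// and values are available without macros, but only as untyped Any
// values discovered at run time.
case class Person(name: String, age: Int)

def fieldsOf(p: Product): List[(String, Any)] =
  p.productElementNames.zip(p.productIterator).toList

val fs = fieldsOf(Person("Walter White", 50))
// fs == List(("name", "Walter White"), ("age", 50))
```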

Using a macro makes this very elegant: at compile time we directly obtain all the fields of the object to be inserted, together with their corresponding values; with that information in hand, splicing the INSERT SQL becomes straightforward.

To learn more, you can view the source code; this project is a fork (the original package is wangzx.scala_commons.sql, as seen in the import above).

Reproduced from: https://www.jianshu.com/p/86f741f51e71


Origin: blog.csdn.net/weixin_34101229/article/details/91095271