Understanding and Applying Spark SQL User-Defined Functions (UDF)

Spark SQL is not a panacea: some functionality cannot be achieved with its built-in functions, so we need to define custom functions.
For example, Spark SQL's built-in concat_ws(",", "A", "B") merges two fields into one, with the given separator in between.
A function that takes one row of input and returns one row is called a UDF.
A function that takes one row of input and returns multiple rows is called a UDTF.
A function that takes multiple rows of input and returns one row is called a UDAF.
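To make the three categories concrete, here is a minimal sketch using only built-in functions, assuming a SparkSession named spark with spark.implicits._ imported (as in the walkthrough below): upper behaves like a UDF, explode like a UDTF, and count like a UDAF.

import org.apache.spark.sql.functions.{upper, explode, count}

val demo = List(("a", Seq(1, 2)), ("b", Seq(3))).toDF("k", "vs")

demo.select(upper($"k")).show()    // UDF-shaped: one row in, one row out
demo.select(explode($"vs")).show() // UDTF-shaped: one row in, multiple rows out
demo.agg(count($"k")).show()       // UDAF-shaped: multiple rows in, one row out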

Example requirement:
given an input id, return the corresponding province.
The built-in SQL functions cannot do this, so we wrap the logic in a function of our own; once it is packaged, we can simply call it whenever we need it.
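A minimal sketch of that idea, assuming a SparkSession named spark (one is created in the walkthrough below); the id-to-province mapping here is a made-up lookup table, purely for illustration:

// Hypothetical lookup table; in practice this might come from a dimension table.
val provinceById = Map("110000" -> "Beijing", "310000" -> "Shanghai")

// Register the mapping as a UDF so it can be called from SQL.
spark.udf.register("idToProvince", (id: String) => provinceById.getOrElse(id, "unknown"))

spark.sql("select idToProvince('110000') as province").show()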
Code walkthrough:
First, create a SparkSession:

import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder().master("local[*]")
  .appName(this.getClass.getSimpleName)
  .getOrCreate()
import spark.implicits._

Create a Dataset from a list collection:

import org.apache.spark.sql.{DataFrame, Dataset}

val tp: Dataset[(String, String)] = spark.createDataset(List(("aaa","bbb"),("aaa","ccc"),("aaa","ddd")))
val df: DataFrame = tp.toDF("f1","f2")
df.show()
Resulting table:
+---+---+
| f1| f2|
+---+---+
|aaa|bbb|
|aaa|ccc|
|aaa|ddd|
+---+---+

Method 1: DSL style, using df.selectExpr("expression")

df.selectExpr("concat_ws('---',f1,f2) as f3").show()
Output:
+---------+
|       f3|
+---------+
|aaa---bbb|
|aaa---ccc|
|aaa---ddd|
+---------+

Method 2: to use df.select(concat_ws(...)), you must first import Spark's SQL functions

import org.apache.spark.sql.functions._
df.select(concat_ws("|||", $"f1", 'f2) as "f3").show()
Output:
+---------+
|       f3|
+---------+
|aaa|||bbb|
|aaa|||ccc|
|aaa|||ddd|
+---------+
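A side note on column syntax: $"f1" and 'f2 are two equivalent ways of referring to a column, both enabled by import spark.implicits._; col("f1") from org.apache.spark.sql.functions is a third equivalent spelling. A small sketch of the same query written with col:

import org.apache.spark.sql.functions.{col, concat_ws}

// Equivalent to the $"f1" / 'f2 version above.
df.select(concat_ws("|||", col("f1"), col("f2")) as "f3").show()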

Method 3: register the DataFrame as a temporary view and write a SQL statement

 
df.createTempView("data")
spark.sql(
  """
    |select
    |concat_ws("_", f1, f2) as da
    |from
    |data
  """.stripMargin).show()
Output:
+-------+
|     da|
+-------+
|aaa_bbb|
|aaa_ccc|
|aaa_ddd|
+-------+

Method 4: use a custom UDF

First, register a UDF with spark.udf.register, which takes two arguments: 1. the name of the function, 2. the function itself (here a function of three String parameters):

spark.udf.register("Myconcat_ws", (s: String, a: String, b: String) => {
  a + s + b + a + b + s
})
df.selectExpr("Myconcat_ws('/-/-/-/',f1,f2) as f3").show()
Output:
+--------------------------+
|f3                        |
+--------------------------+
|aaa/-/-/-/bbbaaabbb/-/-/-/|
|aaa/-/-/-/cccaaaccc/-/-/-/|
|aaa/-/-/-/dddaaaddd/-/-/-/|
+--------------------------+
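Note that a function registered with spark.udf.register is also visible to plain SQL, so it can be combined with the temporary view from Method 3. A small sketch reusing the "data" view registered above:

spark.sql(
  """
    |select Myconcat_ws('/-/-/-/', f1, f2) as f3
    |from data
  """.stripMargin).show()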
    

Finally, stop the SparkSession:

spark.stop()

Origin blog.csdn.net/weixin_45896475/article/details/104428804