Detailed explanation of special symbols in Scala

1. => (anonymous function)

=> defines an anonymous function. In Scala, a function is also an object that can be assigned to a variable.

Scala's anonymous function definition format:
(parameter list) => {function body}

Therefore, the role of => is to create an anonymous function instance.

For example: (x: Int) => x + 1, which is equivalent to the following Java method:

public int function(int x) {
    return x + 1;
}

example:

class Symbol {
  // an anonymous function assigned to a member variable
  var add = (x: Int) => x + 1
}

object test2 {
  def main(args: Array[String]): Unit = {
    var symbol = new Symbol
    println(symbol.add(1)) // invoke the function stored in add; prints 2
  }
}
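Besides being stored in a variable, an anonymous function is most often passed directly to a higher-order function such as map or filter. A small sketch (the value names are illustrative, not from the original):

```scala
val nums = List(1, 2, 3, 4)

// (x: Int) => x + 1 is applied to every element
val incremented = nums.map((x: Int) => x + 1)
println(incremented) // List(2, 3, 4, 5)

// the parameter type can usually be inferred, so this is equivalent:
val doubled = nums.map(x => x * 2)
println(doubled) // List(2, 4, 6, 8)
```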

2. <- (collection traversal)

Loop traversal; an example follows:

var list = Array(1, 2, 3, 4)
for (aa <- list) {
  print(aa + " ")
}

The above code is similar to the Java code:

int[] list = {1, 2, 3, 4};
for (int aa : list) {
    System.out.print(aa + " ");
}
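The generator arrow <- is not limited to arrays: it traverses any iterable, such as a Range, and a for comprehension can add a guard to filter elements. A small sketch (not from the original):

```scala
// <- also traverses Ranges; the guard (if ...) keeps only odd values
val odds = for (i <- 1 to 10 if i % 2 == 1) yield i
println(odds) // Vector(1, 3, 5, 7, 9)

// nested generators visit every pair (i, j)
for (i <- 1 to 2; j <- 1 to 2) print(s"($i,$j) ")
println()
```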

3. ++= (string concatenation)

var s:String = "a"
s+="b"
println(s)
s++="c"
println(s)
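Note that ++= on a String var is not a dedicated method of String: since s is a var, s ++= "c" desugars to the reassignment s = s ++ "c", where ++ comes from StringOps. A mutable StringBuilder, by contrast, defines ++= directly and appends in place. A sketch of both (my reading of the standard library, not from the original):

```scala
var s: String = "a"
s += "b"   // desugars to s = s + "b"
s ++= "c"  // desugars to s = s ++ "c" (++ is provided by StringOps)
println(s) // abc

// StringBuilder has its own ++= that mutates in place, so a val works
val sb = new StringBuilder("a")
sb ++= "bc"
println(sb) // abc
```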

4. ::: (three colons) and :: (two colons)

  • ::: (three colons) concatenates two Lists. (Similar to list1.addAll(list2) in Java.)
  • :: (two colons) prepends a single element to a List. (Similar to list1.add(A) in Java, except that :: adds to the front of the list.)
Example:
val one = List(1,2,3)
val two = List(4,5,6)
val three = one ::: two
println(three.toString())
 
val four = 7 :: three
 
println(four.toString())
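Because operators whose name ends in a colon are right-associative in Scala, :: can build an entire list starting from Nil. A small sketch (not from the original):

```scala
// parses as 1 :: (2 :: (3 :: Nil)), since ':' operators bind to the right
val built = 1 :: 2 :: 3 :: Nil
println(built) // List(1, 2, 3)

// the same expression written as explicit method calls on the right operand
val same = Nil.::(3).::(2).::(1)
println(same)
```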

5. -> (construct a tuple) and _N (access the Nth element of a tuple)

1. The meaning of tuples in Scala:

  • A tuple aggregates values that may have different types.
  • Enclosing multiple values in parentheses creates a tuple.

2. The difference between a tuple and an array in Scala: the elements of an array must all have the same type, while the elements of a tuple may have different types.

Example:
val first = (1, 2, 3) // define a three-element tuple

val one = 1
val two = 2
val three = one -> two // construct a two-element tuple: (1, 2)

println(three)

println(three._2) // access the second value of the tuple
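The most common use of -> is building key-value pairs when constructing a Map: each k -> v is simply the tuple (k, v). A sketch (the keys and values are illustrative):

```scala
// k -> v is sugar for the tuple (k, v)
val ages = Map("ann" -> 30, "bob" -> 25)
println(ages("ann")) // 30

// the same Map written with explicit tuples
val sameMap = Map(("ann", 30), ("bob", 25))
println(sameMap == ages) // true
```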


6. Usage of _ (underscore)

  • Wildcard
    _ can act as a wildcard similar to * in a Java import:
import org.apache.spark.SparkContext._
  • Refers to each element in the collection
For example, traverse a collection and keep the elements greater than a given value:

val lst = List(1, 2, 3, 4, 5)
val lstFilter = lst.filter(_ > 3) // List(4, 5)
  • Get the element at a given position in a tuple
val ss = (1, "22", "333")
println(ss._1)
  • Use pattern matching to extract the members of a tuple
val m = Map(1 -> 2, 2 -> 4)
for ((k, _) <- m) println(k) // use _ for the parts you do not need; here only the key is used, so _ stands in for the value


  • Give a member variable (not a local variable) a default value
var s: Int = _ // only member variables may be initialized with _; local variables cannot
def main(args: Array[String]): Unit = {
  println(s) // prints 0, the default value for Int
}

7. +=

Append an element to a mutable collection such as ArrayBuffer

import scala.collection.mutable.ArrayBuffer

val arrBuf1 = new ArrayBuffer[Int]()
arrBuf1 += 11 // append an element

println(arrBuf1)

8. -=

Remove an element from a mutable collection such as ArrayBuffer, or remove a key from a Map

import scala.collection.mutable.ArrayBuffer

val arrBuf1 = new ArrayBuffer[Int]()
arrBuf1 += 11 // append an element
arrBuf1 += 12 // append an element
arrBuf1 -= 12 // remove an element

println(arrBuf1)

var map = Map(1 -> 1, 2 -> 2, 3 -> 3) // key -> value
map -= 1 // removes the entry whose key is 1 (reassigns the var to a new immutable Map)
println(map)
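With an immutable Map in a var, as above, -= works by reassignment; on scala.collection.mutable.Map, += and -= mutate the map in place, so it can be held in a val. A sketch (not from the original):

```scala
import scala.collection.mutable

// a mutable Map is updated in place, so a val suffices
val m = mutable.Map(1 -> "a", 2 -> "b")
m += (3 -> "c") // add an entry
m -= 1          // remove the entry with key 1
println(m)
```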


Origin blog.csdn.net/m0_49834705/article/details/112717099