The most commonly used data structures in Scala: Map and Tuple

package com.supeng.spark.scala

/**
 * 1. By default, Map constructs an immutable collection: its contents cannot be modified. Any "modification" yields a new Map, and the original Map stays unchanged.
 * 2. A Map instance is built through the factory method pattern: Map is the trait (interface), and apply chooses a concrete implementation behind the scenes.
 * 3. If you want to instantiate a concrete Map type directly, use a specific subclass such as HashMap.
 * 4. Querying a Map should use getOrElse: no exception is thrown when the key does not exist, and a default value is supplied.
 *    Default values matter a great deal in real development; in Spark, many defaults are provided through getOrElse.
 * 5. Use SortedMap to get a Map whose entries are sorted by key.
 * 6. LinkedHashMap remembers the order in which entries were inserted, which is very useful in practice.
 * 7. A Tuple can hold values of different types, e.g. ("tom", "lilei", 30, "I am into Spark so much!!!").
 * 8. In enterprise-level big data development, Tuple is used over and over to express data structures and to carry business logic.
 * 9. Another very important use of Tuple is returning several values from a function, packed into a Tuple. The SparkContext source code illustrates this:
 * val (sched, ts) = SparkContext.createTaskScheduler(this, master)
 * _schedulerBackend = sched
 * _taskScheduler = ts
 */
object HelloMapTuple {
  def main(args: Array[String]): Unit = {
    val bigDatas = Map("Spark" -> 6, "Hadoop" -> 7) // Map.apply follows the factory method pattern: Map is the trait, and apply picks a concrete implementation
   
    val persons = Map(("jialin",30),("spark",1))
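    // Illustrative sketch for point 1 in the header comment (not in the original listing):
    // "adding" an entry to an immutable Map produces a new Map; bigDatas itself is unchanged.
    val moreBigDatas = bigDatas + ("Flink" -> 8) // "Flink" -> 8 is a made-up extra entry
    println(bigDatas)     // still Map(Spark -> 6, Hadoop -> 7)
    println(moreBigDatas) // Map(Spark -> 6, Hadoop -> 7, Flink -> 8)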
   
    val programingLanguage = scala.collection.mutable.Map("scala" -> 13, "Java" -> 10)
    programingLanguage("scala")=15
    for((name,age) <-programingLanguage) println(name + ": "+ age)
   
    println(programingLanguage.getOrElse("default", 111)) // "default" is not a key, so the default value 111 is printed
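    // Sketch for point 4 in the header comment (not in the original listing): applying the
    // Map directly to a missing key would throw NoSuchElementException, while getOrElse
    // simply falls back to the supplied default.
    // println(programingLanguage("not-a-key")) // would throw if uncommented
    println(programingLanguage.getOrElse("scala", 0)) // existing key, so its value 15 is printed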
    println("--------------------------" )
       
    val personsInformation = new scala.collection.mutable.HashMap[String,Int]
    personsInformation += ("Scala"-> 13,"Java"-> 23)
    for((name,age)<-personsInformation) println(name +" : " + age)
    println("--------------------------")
   
    personsInformation -= ("Java")
    for((name,age)<-personsInformation) println(name +" : " + age)
    println("--------------------------")
   
    for (key <- personsInformation.keySet) println(key)

    println("--------------------------")

    for (value <- personsInformation.values) println(value)

    val result = for ((name, age) <- personsInformation) yield (age, name) // swap each (name, age) pair into (age, name)

    println("--------------------------")
    for ((age, name) <- result) println(age + " : " + name)
    
     
    val persons2 = scala.collection.immutable.SortedMap(("jialin", 30), ("dtspark", 1), ("scala", 12))

    for ((name, age) <- persons2) println(name + " : " + age)
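
    // Sketch for point 6 in the header comment (not in the original listing):
    // mutable.LinkedHashMap remembers the order in which entries were inserted.
    val ordered = scala.collection.mutable.LinkedHashMap("first" -> 1, "second" -> 2, "third" -> 3)
    for ((name, index) <- ordered) println(name + " : " + index) // always prints in insertion order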
        
    val information = ("tom", "lilei", 30, "I am into Spark so much!!!")
    println(information._4) // Tuple elements are accessed by 1-based position: _1, _2, _3, _4
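
    // Sketch for point 9 in the header comment (not in the original listing): a function can
    // return several values packed into a Tuple, which the caller destructures in one step,
    // just like the SparkContext.createTaskScheduler example above. minMax is a made-up helper.
    def minMax(values: Array[Int]): (Int, Int) = (values.min, values.max)
    val (low, high) = minMax(Array(3, 1, 4, 1, 5))
    println(low + " ~ " + high) // prints "1 ~ 5"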
  }
}

Origin blog.csdn.net/superiorpengFight/article/details/54315078