Good Programmer big data learning notes: Scala series — the Map

First, let's look at what a map (Map) is.

In Scala, the hash-table data structure is called a map (Map): a collection of key-value pairs.

1. Constructing a map

In Scala there are two kinds of Map: one lives in the immutable package and its contents cannot be changed once created (this is the default Map); the other lives in the mutable package and its contents can be changed.

Constructing an immutable map
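For example (a minimal sketch; the names and values are illustrative):

// the default Map is scala.collection.immutable.Map
val ages = Map("zhangsan" -> 20, "lisi" -> 25)
// ages("zhangsan") = 30   // does not compile: an immutable Map cannot be modified
println(ages.getOrElse("lisi", 0))   // 25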


Constructing a map from tuples
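The -> arrow is just syntactic sugar for a two-element tuple, so a map can equally be built from tuples (illustrative sketch):

// ("zhangsan", 20) is the same pair as "zhangsan" -> 20
val ages2 = Map(("zhangsan", 20), ("lisi", 25))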


Constructing a mutable map
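A minimal sketch of a mutable map (the names and values are illustrative):

import scala.collection.mutable

// the mutable Map lives in scala.collection.mutable
val scores = mutable.Map("zhangsan" -> 90, "lisi" -> 80)
scores("zhangsan") = 95   // allowed: the contents can be changed in place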


2. Getting and modifying the values of a map

There are three common ways to obtain the value for a given key; the getOrElse method is especially recommended.
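A sketch of the three ways (keys and values here are illustrative):

val scores = Map("zhangsan" -> 90, "lisi" -> 80)

// 1) apply: scores(key) throws NoSuchElementException if the key is missing
val a = scores("zhangsan")

// 2) get: returns an Option (Some(value) if the key exists, None otherwise)
val b = scores.get("lisi")

// 3) getOrElse: returns the given default when the key is missing (recommended)
val c = scores.getOrElse("wangwu", 0)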


Modifying a mutable map and traversing it:

object MappingDemo {
  def main(args: Array[String]): Unit = {
    // define a mutable Map
    val scores = scala.collection.mutable.Map("zhangsan" -> 90, "lisi" -> 80, "wangwu" -> 0)
    // val scores2 = scala.collection.mutable.Map("moumou" -> 50)

    // modify the value for an existing key
    scores("wangwu") = 100

    // add new key-value pairs
    scores("zhaoliu") = 50                    // equivalent to scores.update("zhaoliu", 50)
    scores += ("sunqi" -> 60, "qianba" -> 99)
    // scores ++= scores2

    // remove a key-value pair
    scores -= "zhangsan"                      // equivalent to scores.remove("zhangsan")
    // scores --= scores2.keys                // -= takes a key, not a Map, so remove by keys

    // get the set of keys and traverse it (values can still be reached with scores(key))
    val res = scores.keySet
    for (elem <- res)
      print(elem + " ")
    println()

    // traverse the Map as (key, value) pairs
    for ((k, v) <- scores)
      print(k + ":" + v + " ")
  }
}

Running the program prints the key set on one line and then each key:value pair on the next; the exact order may differ because a mutable Map does not guarantee iteration order.


3. HashMap

Mutable map

import scala.collection.mutable

object MutMapDemo extends App{
  val map1 = new mutable.HashMap[String, Int]()
  // add entries to the map
  map1("spark") = 1
  map1 += (("hadoop", 2))
  map1.put("storm", 3)
  println(map1)

  // remove entries from the map
  map1 -= "spark"
  map1.remove("hadoop")
  println(map1)
}
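Running the sketch above, the first println shows all three entries (spark, hadoop, storm) and the second shows only storm -> 3 after the removals; note that the printed order of a mutable HashMap's entries is not guaranteed.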



Origin blog.51cto.com/14479068/2435449