Spark Operators: map and flatMap

map(func):
Source code

  /**
   * Return a new RDD by applying a function to all elements of this RDD.
   */
  def map[U: ClassTag](f: T => U): RDD[U] = withScope {
    val cleanF = sc.clean(f)
    new MapPartitionsRDD[U, T](this, (context, pid, iter) => iter.map(cleanF))
  }

Purpose: returns a new RDD formed by passing each element of the input RDD through the function func.

val arr: RDD[Int] = sc.makeRDD(Array(1, 2, 3, 4, 5))
val arrMap: RDD[Int] = arr.map(_ * 2)
val arrColl: Array[Int] = arrMap.collect()
for (aa <- arrColl) {
  print(aa + " ")
}

Output: 2 4 6 8 10

flatMap:
Source code

  /**
   *  Return a new RDD by first applying a function to all elements of this
   *  RDD, and then flattening the results.
   */
  def flatMap[U: ClassTag](f: T => TraversableOnce[U]): RDD[U] = withScope {
    val cleanF = sc.clean(f)
    new MapPartitionsRDD[U, T](this, (context, pid, iter) => iter.flatMap(cleanF))
  }

Purpose: returns a new RDD by first applying a function to all elements of this RDD, and then flattening the results.
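A minimal sketch in the same style as the map example above (assuming the same SparkContext `sc`), splitting each line into words. Because the function returns a collection per element, flatMap flattens the results into a single RDD[String] instead of an RDD[Array[String]]:

```scala
val lines: RDD[String] = sc.makeRDD(Array("hello spark", "hello flatMap"))
// Each line is split into words; flatMap flattens the per-element
// arrays into one RDD[String]
val words: RDD[String] = lines.flatMap(_.split(" "))
val wordColl: Array[String] = words.collect()
for (w <- wordColl) {
  print(w + " ")
}
```

Output: hello spark hello flatMap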

Reposted from blog.csdn.net/wz272343078/article/details/94571628