Spark API Explained in Plain Language: groupBy and groupByKey

groupBy(function)
The function computes a key for each element; the elements of the input RDD are grouped by that key.

val a = sc.parallelize(1 to 9, 3)
a.groupBy(x => { if (x % 2 == 0) "even" else "odd" }).collect // split into two groups
/* Result:
Array(
(even,ArrayBuffer(2, 4, 6, 8)),
(odd,ArrayBuffer(1, 3, 5, 7, 9))
)
*/
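
The grouped output is itself a pair RDD, so each group can be processed further. A minimal sketch (reusing `sc` and `a` from above; the variable name `sums` is only illustrative) that sums the numbers in each group with mapValues:

val sums = a.groupBy(x => if (x % 2 == 0) "even" else "odd")
  .mapValues(_.sum) // reduce each group's Iterable[Int] to a single sum
  .collect()
// Expected (order of groups may vary): Array((even,20), (odd,25))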

 

val a = sc.parallelize(1 to 9, 3)
def myfunc(a: Int) : Int =
{
  a % 2 // split into two groups
}
a.groupBy(myfunc).collect

 

/*
Result:
Array(
(0,ArrayBuffer(2, 4, 6, 8)),
(1,ArrayBuffer(1, 3, 5, 7, 9))
)
*/
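
groupBy(f) can be seen as a shorthand for pairing each element with its key f(x) and then grouping by that key, which is exactly what groupByKey (covered next) does. A short sketch (reusing `a` and `myfunc` from above; `viaPairs` is an illustrative name):

val viaPairs = a.map(x => (myfunc(x), x)).groupByKey().collect()
// Expected to match the groupBy(myfunc) result above:
// Array((0,ArrayBuffer(2, 4, 6, 8)), (1,ArrayBuffer(1, 3, 5, 7, 9)))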

groupByKey()
Called on a pair RDD of (key, value) elements; values that share the same key are collected into one group.

val a = sc.parallelize(List("dog", "tiger", "lion", "cat", "spider", "eagle"), 2)
val b = a.keyBy(_.length) // attach a key to each value; the key is the length of the corresponding string
b.groupByKey.collect
// Result: Array((4,ArrayBuffer(lion)), (6,ArrayBuffer(spider)), (3,ArrayBuffer(dog, cat)), (5,ArrayBuffer(tiger, eagle)))
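
The grouped values are usually consumed by a follow-up transformation. A minimal sketch (reusing `b` from above; `counts` is an illustrative name) that counts how many words there are of each length:

val counts = b.groupByKey().mapValues(_.size).collect()
// Expected (order of keys may vary): Array((4,1), (6,1), (3,2), (5,2))

For a plain count or sum like this, reduceByKey is usually preferred over groupByKey, since it combines values on each partition before shuffling them across the network.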

 


Reposted from longzhun.iteye.com/blog/2283162