Several ways to create an RDD in Spark

Create an RDD from a collection

import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

val conf: SparkConf = new SparkConf().setAppName(this.getClass.getName).setMaster("local[*]")
val sc = new SparkContext(conf)

// Method 1: create an RDD with the parallelize method
//val rdd: RDD[Int] = sc.parallelize(List(1, 2, 3, 4))

// Method 2: create an RDD with the makeRDD method
val rdd: RDD[Int] = sc.makeRDD(List(1, 2, 3, 4))
rdd.collect().foreach(println)

sc.stop()
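makeRDD is a thin wrapper that simply delegates to parallelize, so the two are interchangeable. Both accept an optional second argument, numSlices, that sets the number of partitions. A minimal sketch, with the partition count 2 chosen arbitrarily for illustration:

// Request 2 partitions explicitly (numSlices defaults to spark.default.parallelism)
val rdd2: RDD[Int] = sc.makeRDD(List(1, 2, 3, 4), 2)
println(rdd2.getNumPartitions) // prints 2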

Create an RDD by reading a file

import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

val conf: SparkConf = new SparkConf().setAppName(this.getClass.getName).setMaster("local[*]")
val sc = new SparkContext(conf)
// textFile reads every file under the given path, producing one String per line
val rdd: RDD[String] = sc.textFile("D:\\develop\\workspace\\bigdata2021\\spark2021\\input")

sc.stop()
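textFile also takes an optional minPartitions argument, and the path may point to a single file, a directory, or a glob pattern on a local filesystem or HDFS. A minimal sketch, assuming the same input directory as above exists (the partition count 4 is illustrative):

// Ask for at least 4 partitions when reading the input directory
val rdd4: RDD[String] = sc.textFile("D:\\develop\\workspace\\bigdata2021\\spark2021\\input", 4)
rdd4.collect().foreach(println)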

Create an RDD from another RDD

import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

val conf: SparkConf = new SparkConf().setAppName(this.getClass.getName).setMaster("local[*]")
val sc = new SparkContext(conf)
val rdd: RDD[String] = sc.textFile("D:\\develop\\workspace\\bigdata2021\\spark2021\\input")
// Applying a transformation to an existing RDD yields a new RDD
val flatRDD: RDD[String] = rdd.flatMap(_.split(" "))
sc.stop()
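The chain can keep going, since every transformation returns a new RDD. A minimal word-count sketch built on the flatRDD above; it would run before the sc.stop() call, and the pairs and counts names are illustrative:

// Pair every word with 1, then sum the counts per word
val pairs: RDD[(String, Int)] = flatRDD.map(word => (word, 1))
val counts: RDD[(String, Int)] = pairs.reduceByKey(_ + _)
counts.collect().foreach(println)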

Origin: blog.csdn.net/FlatTiger/article/details/114917679