Spark: Writing a Spark WordCount Program in Scala in IDEA and Submitting It to Run

Create a new Maven project in IDEA and add the relevant dependencies:

<properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <scala.version>2.11.11</scala.version>
    <spark.version>2.3.0</spark.version>
    <hadoop.version>2.7.7</hadoop.version>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>

<dependencies>
    <!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
    <!-- Scala standard library dependency -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!-- Spark core dependency; the _2.11 suffix must match the Scala binary version above -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- Pin the Hadoop client API version -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>
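
One thing the fragment above does not show: a stock Maven project only compiles Java, so a Scala compiler plugin is needed in the <build> section for the packaging step later to work. A minimal sketch, assuming the default src/main/scala layout (the plugin version here is illustrative, not taken from the original post):

<build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <plugins>
        <!-- Compiles Scala sources during the Maven build -->
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.4.6</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>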

Add Scala framework support to the project, then create a new Scala Object file.

Write the program:

package com.wby.demo

import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

object WordCountDemo {
  def main(args: Array[String]): Unit = {
    // Create the SparkConf configuration object and set the application name
    val conf = new SparkConf().setAppName("WordCountDemo")
    // Create the Spark execution entry point
    val sc = new SparkContext(conf)
    // Specify where to read the data from, creating an RDD (Resilient Distributed Dataset)
    val lines: RDD[String] = sc.textFile(args(0))
    // Split each line on spaces and flatten into individual words
    val words: RDD[String] = lines.flatMap(_.split(" "))
    // Pair each word with a count of 1
    val wordAndOne: RDD[(String, Int)] = words.map((_, 1))
    // Aggregate the counts by key
    val reduced: RDD[(String, Int)] = wordAndOne.reduceByKey(_ + _)
    // Sort by value in descending order
    val sorted: RDD[(String, Int)] = reduced.sortBy(_._2, false)
    // Save the result to HDFS
    sorted.saveAsTextFile(args(1))
    // Release resources
    sc.stop()
  }
}
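
Before packaging, it can be convenient to debug the logic inside IDEA without a cluster. A hedged sketch of a local-mode variant: setting the master to local[*] runs Spark in-process, and take(10) brings a small sample back to the driver instead of writing to HDFS (the input path and object name here are illustrative placeholders):

import org.apache.spark.{SparkConf, SparkContext}

object WordCountLocal {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark in-process using all available cores -- no cluster needed
    val conf = new SparkConf().setAppName("WordCountLocal").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // "data/words.txt" is an illustrative local path, not from the original post
    sc.textFile("data/words.txt")
      .flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _)
      .sortBy(_._2, false)
      .take(10)                 // ten most frequent words, collected to the driver
      .foreach(println)
    sc.stop()
  }
}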

Run Maven's package command.
Upload the resulting jar to the server.
Start HDFS and Spark, then upload the input text file to HDFS.
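
Roughly, those steps look like the following on the command line. The jar name, /root/jars/ directory, and /test/w input path are taken from the spark-submit command below; the server address, the /soft/hadoop location, and the words.txt file name are assumptions for illustration:

# Package the project (run from the project root)
mvn clean package

# Copy the jar to the server
scp target/spark-1.0-SNAPSHOT.jar root@192.168.124.132:/root/jars/

# On the server: start HDFS and the Spark standalone cluster
/soft/hadoop/sbin/start-dfs.sh
/soft/spark/sbin/start-all.sh

# Upload the input text to the HDFS path the job will read
hdfs dfs -mkdir -p /test
hdfs dfs -put words.txt /test/w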

Run the job:

/soft/spark/bin/spark-submit --class com.wby.demo.WordCountDemo /root/jars/spark-1.0-SNAPSHOT.jar hdfs://192.168.124.132:9000/test/w hdfs://192.168.124.132:9000/test/res
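
Here args(0) is the HDFS input path and args(1) is the output path. Note that the output directory must not already exist, or saveAsTextFile will fail with a FileAlreadyExistsException. Once the job finishes, the result can be inspected with something like the following (the part-file name is the standard Hadoop output convention, shown for illustration):

hdfs dfs -cat /test/res/part-00000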


Reposted from www.cnblogs.com/wbyixx/p/11111926.html