Writing a WordCount Program in Spark

Copyright notice: this is an original post by the author; do not repost without permission. https://blog.csdn.net/m0_37294838/article/details/89975361

Note: this example runs in YARN mode, so you need to start the HDFS and YARN clusters first.

1) Create a Maven project named WordCount and add the dependencies

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.1</version>
    </dependency>
</dependencies>
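Because the artifact is `spark-core_2.11`, the project must compile against Scala 2.11.x. A minimal sketch of the matching Scala dependency (the exact patch version 2.11.8 is an assumption; any 2.11.x release should work):

```xml
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <!-- must be a 2.11.x release to match spark-core_2.11 -->
    <version>2.11.8</version>
</dependency>
```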

<build>
    <finalName>WordCount</finalName>
    <plugins>
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

 

2) Write the code

args(0): path of the input file

args(1): path for the output

package com.test

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {

  def main(args: Array[String]): Unit = {

    // 1. Create a SparkConf and set the application name
    val conf = new SparkConf().setAppName("WC")

    // 2. Create a SparkContext, the entry point for submitting a Spark app
    val sc = new SparkContext(conf)

    // 3. Use sc to build the RDD and run the transformations and the action:
    //    split each line into words, count every word, sort by count
    //    (descending), and write the result to the output path
    sc.textFile(args(0))
      .flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _, 1)
      .sortBy(_._2, false)
      .saveAsTextFile(args(1))

    // 4. Shut down the context
    sc.stop()
  }
}
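Before packaging, the transformation chain can be sanity-checked on a plain Scala collection, since `flatMap`/`map`/reduce-by-key have direct standard-library analogues (here `groupBy` stands in for `reduceByKey`). This is a local sketch only; object and variable names are illustrative, and no Spark cluster is needed:

```scala
object WordCountLocalCheck {
  def main(args: Array[String]): Unit = {
    // Sample input standing in for the lines of args(0)
    val lines = List("hello spark", "hello world hello")

    // Same logic as the RDD chain: split into words, count, sort descending
    val counts = lines
      .flatMap(_.split(" "))
      .groupBy(identity)                        // analogue of grouping keys in reduceByKey
      .map { case (word, occs) => (word, occs.size) }
      .toList
      .sortBy(-_._2)

    println(counts.head)                        // (hello,3) -- the most frequent word
  }
}
```

If the counts look right here, the cluster run should only differ in input scale and output format (Spark writes `part-*` files instead of printing).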

3) Add the packaging plugin

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>3.0.0</version>
    <configuration>
        <archive>
            <manifest>
                <!-- use the fully qualified class name, including the package -->
                <mainClass>com.test.WordCount</mainClass>
            </manifest>
        </archive>
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
    </configuration>
    <executions>
        <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>

4) Package and test on the cluster

Run this command from the Spark installation directory:

bin/spark-submit \
--class com.test.WordCount \
--master yarn \
WordCount.jar \
/word.txt \
/out
