Running a Scala mvn project (unfinished!!)

package spark.core

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local")
      .setAppName("WordCount")
      .set("spark.testing.memory", "2147480000")
    val sc = new SparkContext(conf)
    // val lines = sc.textFile("hdfs://spark01:9000/user/spark/spark.txt", minPartitions = 1)
    // val lines = sc.textFile("/home/hadoop/spark.txt", minPartitions = 1)
    // val lines = sc.textFile("C:/Users/Administrator/Desktop/spark.txt", minPartitions = 1)
    val lines = sc.textFile("file:///D:/sparktest.txt", minPartitions = 1)
    val words = lines.flatMap(_.split(" "))
    val pairs = words.map((_, 1))
    val wordCounts = pairs.reduceByKey(_ + _)
    // lambda parameter renamed so it no longer shadows the RDD
    wordCounts.foreach(wc => println(wc._1 + " appeared " + wc._2 + " times."))
    println(lines.count())
    println(lines.first())
  }
}
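Every missing class chased below normally arrives as a transitive dependency of spark-core, so in principle the pom should only need something like the following sketch (versions inferred from the 2.3.0 / Scala 2.11 jar paths that appear later in this post):

```xml
<dependencies>
  <!-- Pulls in spark-network-common, spark-kvstore, spark-launcher, netty,
       kryo, chill, json4s, jersey, jackson, metrics, etc. transitively. -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.0</version>
  </dependency>
</dependencies>
```

If these transitive jars are not landing on the Eclipse build path automatically, the Maven integration (m2e / the Scala IDE Maven support) is probably misconfigured; adding jars by hand one at a time, as below, is the painful alternative.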

Running it fails with:

java.lang.ClassNotFoundException: org.spark_project.guava.cache.CacheLoader

A search turned the class up in the local repo:

C:\Users\zhangming\.m2\repository\org\apache\spark\spark-network-common_2.11\2.3.0\spark-network-common_2.11-2.3.0.jar

Right-click the project --> Build Path --> Configure Build Path --> Java Build Path --> Libraries --> Add External Jars

Run again; error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/lang3/SystemUtils

The local repo has: C:\Users\zhangming\.m2\repository\org\apache\commons\commons-lang3\3.5\commons-lang3-3.5.jar

but the Maven dependencies only resolved C:\Users\zhangming\.m2\repository\commons-lang\commons-lang\2.6\commons-lang-2.6.jar (note that commons-lang 2.6 is a different artifact from commons-lang3, whose classes live under org.apache.commons.lang3).

Right-click the project --> Build Path --> Configure Build Path --> Java Build Path --> Libraries --> Add External Jars


Run again; error:

18/06/27 10:23:48 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path

java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

java.lang.NoClassDefFoundError: com/codahale/metrics/MetricRegistry

This error occurs because the program locates winutils via HADOOP_HOME.

Since this Windows machine has no such environment variable, the path comes out as null\bin\winutils.exe.

Download a Windows build of winutils from https://github.com/srccodes/hadoop-common-2.2.0-bin and extract it to D:\Program Files\hadoop-common-2.2.0-bin-master.

Add a user environment variable HADOOP_HOME with the value D:\Program Files\hadoop-common-2.2.0-bin-master.

Append D:\Program Files\hadoop-common-2.2.0-bin-master\bin to the system Path variable.
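An alternative that sidesteps the environment variable entirely (a sketch, not from the original post): Hadoop's Shell utility checks the JVM system property hadoop.home.dir before falling back to HADOOP_HOME, so the path can be set in code before the SparkContext is created:

```java
public class HadoopHomeWorkaround {
    public static void main(String[] args) {
        // Hadoop's Shell class reads the "hadoop.home.dir" system property
        // first, then the HADOOP_HOME environment variable.
        System.setProperty("hadoop.home.dir",
                "D:\\Program Files\\hadoop-common-2.2.0-bin-master");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Unlike an environment variable, a system property takes effect immediately in the current JVM, so no restart is needed.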

Run again; still the same error.

A quick Java check:

String home = System.getenv("HADOOP_HOME");

System.out.println(home);

prints null.

Rebooted the machine and ran again (a new environment variable is only visible to processes started after it is set). That problem is solved; only java.lang.NoClassDefFoundError: com/codahale/metrics/MetricRegistry remains.

A search of the local repo finds C:\Users\zhangming\.m2\repository\com\yammer\metrics\metrics-core\2.2.0\metrics-core-2.2.0.jar

Added it via Add External Jars; still the same error.

Note that the error's package path is com/codahale/metrics/MetricRegistry,

while the local jar is the old Yammer artifact, com\yammer\metrics\metrics-core\2.2.0\metrics-core-2.2.0.jar (same library, but the packages were renamed to com.codahale.metrics in later versions).

So add the dependency:

<!-- https://mvnrepository.com/artifact/com.codahale.metrics/metrics-core -->
<dependency>
    <groupId>com.codahale.metrics</groupId>
    <artifactId>metrics-core</artifactId>
    <version>3.0.2</version>
</dependency>

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: org/apache/spark/util/kvstore/KVStore

Added C:\Users\zhangming\.m2\repository\org\apache\spark\spark-kvstore_2.11\2.3.0\spark-kvstore_2.11-2.3.0.jar from the repo via Add External Jars.

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: org/apache/spark/network/shuffle/ShuffleClient

Added C:\Users\zhangming\.m2\repository\org\apache\spark\spark-network-shuffle_2.11\2.3.0\spark-network-shuffle_2.11-2.3.0.jar from the local repo via Add External Jars.

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: io/netty/channel/Channel

Nothing found in the local repo, so add the dependency:

<!-- https://mvnrepository.com/artifact/io.netty/netty-all -->
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.25.Final</version>
</dependency>

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: com/esotericsoftware/kryo/io/Output

Added C:\Users\zhangming\.m2\repository\com\esotericsoftware\kryo-shaded\3.0.3 from the local repo via Add External Jars.

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: com/twitter/chill/KryoBase

Added C:\Users\zhangming\.m2\repository\com\twitter\chill_2.11\0.8.4\chill_2.11-0.8.4.jar from the local repo via Add External Jars.

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: org/apache/spark/unsafe/array/ByteArrayMethods

Added C:\Users\zhangming\.m2\repository\org\apache\spark\spark-unsafe_2.11\2.3.0\spark-unsafe_2.11-2.3.0.jar from the local repo via Add External Jars.

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: org/json4s/JsonAST$JValue

Added C:\Users\zhangming\.m2\repository\org\json4s\json4s-ast_2.11\3.2.11\json4s-ast_2.11-3.2.11.jar from the local repo via Add External Jars.

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: javax/servlet/FilterRegistration

Added C:\Users\zhangming\.m2\repository\javax\servlet\javax.servlet-api\3.1.0\javax.servlet-api-3.1.0.jar via Add External Jars.

Problem solved. Run again; another error. Fixed by adding:

C:\Users\zhangming\.m2\repository\org\glassfish\jersey\containers\jersey-container-servlet-core\2.22.2\jersey-container-servlet-core-2.22.2.jar

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: org/glassfish/jersey/server/spi/Container

Added C:\Users\zhangming\.m2\repository\org\glassfish\jersey\core\jersey-server\2.22.2\jersey-server-2.22.2.jar via Add External Jars.

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: org/apache/spark/launcher/LauncherProtocol$Message

Added C:\Users\zhangming\.m2\repository\org\apache\spark\spark-launcher_2.11\2.3.0\spark-launcher_2.11-2.3.0.jar via Add External Jars.

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: com/fasterxml/jackson/databind/Module

Added C:\Users\zhangming\.m2\repository\com\fasterxml\jackson\core\jackson-databind\2.9.1\jackson-databind-2.9.1.jar via Add External Jars.

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: com/fasterxml/jackson/core/Versioned

Added C:\Users\zhangming\.m2\repository\com\fasterxml\jackson\core\jackson-core\2.9.1\jackson-core-2.9.1.jar via Add External Jars.

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: com/codahale/metrics/json/MetricsModule

Added C:\Users\zhangming\.m2\repository\com\codahale\metrics\metrics-core\3.0.2\metrics-core-3.0.2.jar via Add External Jars.

Still the same error:

java.lang.NoClassDefFoundError: com/codahale/metrics/json/MetricsModule

Add the dependency:

<!-- https://mvnrepository.com/artifact/com.codahale.metrics/metrics-json -->
<dependency>
    <groupId>com.codahale.metrics</groupId>
    <artifactId>metrics-json</artifactId>
    <version>3.0.1</version>
</dependency>

Problem solved. Run again; error:

java.lang.NoClassDefFoundError: com/fasterxml/jackson/module/scala/DefaultScalaModule$

Added C:\Users\zhangming\.m2\repository\com\fasterxml\jackson\module\jackson-module-scala_2.11\2.6.7.1\jackson-module-scala_2.11-2.6.7.1.jar via Add External Jars.

Problem solved. Run again; error:

java.lang.NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$.handledType()Ljava/lang/Class;

Add the dependency:

<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.module/jackson-module-scala -->
<dependency>
    <groupId>com.fasterxml.jackson.module</groupId>
    <artifactId>jackson-module-scala_2.12</artifactId>
    <version>2.9.6</version>
</dependency>

Problem solved. Run again; error:

java.lang.VerifyError: Cannot inherit from final class

This error usually means multiple versions of the same jar are on the classpath (one suspect: the jackson-module-scala_2.12 artifact just added is built for Scala 2.12, while every Spark jar here is a _2.11 artifact). Removed the dependency added above. Still failing:

java.lang.NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$.handledType()Ljava/lang/Class;
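Rather than deleting jars from the local repository by hand (the next step below), Maven's dependencyManagement section can pin one Jackson version for every dependency, direct or transitive. A sketch; 2.6.7/2.6.7.1 are chosen here only because those are the versions Spark 2.3.0 itself appears to ship with (the jackson-module-scala_2.11-2.6.7.1.jar seen earlier):

```xml
<dependencyManagement>
  <dependencies>
    <!-- Force all modules to agree on one Jackson version instead of
         mixing 2.2.2, 2.9.1, 2.9.6, ... on the classpath. -->
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>2.6.7</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.6.7.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Entries in dependencyManagement override the versions requested by transitive dependencies without adding new jars themselves.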


So I deleted C:\Users\zhangming\.m2\repository\com\fasterxml\jackson\core\jackson-databind\2.2.2.

Windows refused at first: Eclipse was using the jar, which neatly supports the theory that jackson-databind-2.2.2.jar is the culprit. Quit Eclipse, deleted it successfully, restarted Eclipse... and now Eclipse won't open at all.

Deleted D:\zhangmingworkspace\.metadata\.log and the related folders under D:\zhangmingworkspace\.metadata\.plugins. Restarted Eclipse, used Open Projects from File System, and imported the original project back in. Errors:

1. Error in Scala compiler: object java.lang.Object in compiler mirror not found.

2. Archive for required library: 'C:/Users/zhangming/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.2.2/jackson-databind-2.2.2.jar' in project '0625mvn' cannot be read or is not a valid ZIP file

=================================== Divider ===================================

Opening Scala IDE again today, error 2 from above reappeared:

Archive for required library: 'C:/Users/zhangming/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.2.2/jackson-databind-2.2.2.jar' in project '0625mvn' cannot be read or is not a valid ZIP file

Deleted everything inside C:/Users/zhangming/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.2.2/.

Reopened Scala IDE; new error: The container 'Maven Dependencies' references non existing library 'C:\Users\zhangming\.m2\repository\com\fasterxml\jackson\core\jackson-databind\2.2.2\jackson-databind-2.2.2.jar'

Strange: the jar I added to Referenced Libraries is jackson-databind-2.9.1.jar, so why does it keep complaining about 2.2.2? Checked Maven Dependencies, and sure enough, jackson-databind-2.2.2.jar is in there.

Which dependency is pulling that jar in?

Commented out the pom dependencies one at a time, saving after each change, then restoring them.
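Commenting dependencies in and out works, but Maven can answer this question directly: the standard dependency:tree goal reports which declared dependency drags in a given transitive artifact (run from the project directory):

```shell
# Show the resolved dependency tree, filtered to the suspect artifact:
mvn dependency:tree -Dincludes=com.fasterxml.jackson.core:jackson-databind
```

Each match is printed with the chain of parents that introduced it, so the offending top-level dependency is visible at a glance.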

Found that with

<!-- https://mvnrepository.com/artifact/com.codahale.metrics/metrics-json -->
<dependency>
    <groupId>com.codahale.metrics</groupId>
    <artifactId>metrics-json</artifactId>
    <version>3.0.1</version>
</dependency>

commented out, jackson-databind-2.2.2.jar was still there; with the comment removed, jackson-databind-2.2.2.jar was gone!

And with that, the error above disappeared as well.

With the workspace errors gone, ran the program; error:

NoSuchMethodError:com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$.handledType()Ljava/lang/Class;

This error is caused by inconsistent jar versions.

Add the dependency:


<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.module/jackson-module-scala -->
<dependency>
    <groupId>com.fasterxml.jackson.module</groupId>
    <artifactId>jackson-module-scala_2.12</artifactId>
    <version>2.9.1</version>
</dependency>

(Chose 2.9.1 because the Jackson jars above are all 2.9.1.)

That problem is solved; next error:

java.lang.VerifyError: Cannot inherit from final class

at org.apache.spark.SparkContext.textFile(SparkContext.scala:821)

at spark.core.WordCount$.main(WordCount.scala:16)

Solutions found online say this error is, once again, caused by conflicting jar versions.

The problems just never end. I give up for now, damn it.
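A parting guess at the root cause, not verified: every Spark 2.3.0 jar above is a _2.11 (Scala 2.11) artifact, while both jackson-module-scala dependencies added were _2.12. Mixing Scala binary versions is a classic source of exactly these NoSuchMethodError / VerifyError failures, so the matching artifact would presumably be the one already sitting in the local repo:

```xml
<dependency>
  <groupId>com.fasterxml.jackson.module</groupId>
  <!-- _2.11 to match the Scala version of the spark-*_2.11 jars -->
  <artifactId>jackson-module-scala_2.11</artifactId>
  <version>2.6.7.1</version>
</dependency>
```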


Reposted from blog.csdn.net/yblbbblwsle/article/details/80824969