Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/04/21 19:05:26 INFO SparkContext: Running Spark version 1.6.2
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at org.apache.spark.util.TimeStampedWeakValueHashMap.<init>(TimeStampedWeakValueHashMap.scala:42)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:298)
at scala.demo2.JdbcRDD$.main(JdbcRDD.scala:16)
at scala.demo2.JdbcRDD.main(JdbcRDD.scala)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 4 more

You need to make sure that the Scala version Spark was built against is the same as the Scala version used by your project.
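One quick way to confirm which Scala library is actually on the classpath is to print it from the application itself. This is a minimal sketch (the object name ScalaVersionCheck is just an illustration); the Spark side of the comparison is the _2.xx suffix of the spark-core artifact you depend on:

// Prints the Scala library version the program is running against,
// e.g. "version 2.11.8". This must match the _2.xx suffix of the Spark jars.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    println(scala.util.Properties.versionString)
  }
}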

Look at build.sbt:

  

name := "ScalaSBT"

version := "1.0"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.6.1"

 

Here the spark-core_2.11 artifact matches scalaVersion := "2.11.8", so the Scala versions are consistent.

You can also do this:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
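The %% operator tells sbt to append the project's Scala binary version to the artifact name, so the dependency cannot drift out of sync with your scalaVersion. As a sketch, assuming scalaVersion := "2.11.8" as in the build.sbt above, the two forms below resolve to the same artifact:

scalaVersion := "2.11.8"

// %% appends the Scala binary version (2.11 here), so this line...
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"

// ...pulls the same artifact as the explicit form:
// libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.6.1"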




