An error occurred at Spark runtime: Caused by: java.lang.ClassNotFoundException: org.apache.spark.rdd.RDD

1 Error message

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SparkSession$
	at Demo1_QuickStart$.main(Demo1_QuickStart.scala:7)
	at Demo1_QuickStart.main(Demo1_QuickStart.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession$
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 2 more

Process finished with exit code 1

2 Reason analysis

      <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>${spark.version}</version>
            <!--  <scope>provided</scope> -->
      </dependency>

The pom file originally declared this spark-sql dependency with

<scope>provided</scope>

(shown commented out in the snippet above). This scope means the dependency is available at compile time but not at runtime: Maven leaves provided dependencies out of the packaged artifact, because the server already supplies them and including them would make the package too large.

Solution: comment out <scope>provided</scope> when running locally, as shown in the snippet above.
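Instead of commenting the tag out by hand before every local run, the scope can also be driven by a Maven property and switched with a build profile. A sketch under the assumption that the project uses standard Maven 3 profiles; the `local` profile id and the `spark.scope` property are made-up names, not part of the original pom:

```xml
<!-- Default: scope is "provided", so cluster packages stay small. -->
<properties>
    <spark.scope>provided</spark.scope>
</properties>

<profiles>
    <!-- Activate with `mvn -Plocal ...` to pull Spark onto the runtime classpath. -->
    <profile>
        <id>local</id>
        <properties>
            <spark.scope>compile</spark.scope>
        </properties>
    </profile>
</profiles>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>${spark.scope}</scope>
    </dependency>
</dependencies>
```

With this layout, `mvn package` still excludes Spark for cluster deployment, while `mvn -Plocal exec:java` (or a `-Plocal` Maven run in the IDE) keeps SparkSession on the classpath. Recent IntelliJ IDEA versions also offer an "Include dependencies with 'Provided' scope" checkbox in the Application run configuration, which avoids touching the pom at all.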

Origin blog.csdn.net/godlovedaniel/article/details/114190802