Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/08/13 17:07:34 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.0.1; using 219.223.197.110 instead (on interface eth0)
18/08/13 17:07:34 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/StreamingContext
at com.scalalearn.scala.main.LogAnalyzerAppMain$.main(LogAnalyzerAppMain.scala:65)
at com.scalalearn.scala.main.LogAnalyzerAppMain.main(LogAnalyzerAppMain.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.StreamingContext
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 2 more
Note that IntelliJ runs the program as a plain Java application; it is not submitted through spark-submit.
In the pom, the provided scope means the dependency is needed at compile time but is not packaged for deployment. When the job is submitted via spark-submit, Spark supplies the required streaming jar itself,
but when IntelliJ launches the program as a Java process, the streaming jar is still needed at runtime, so the scope must be removed.
The final fix is therefore to remove the following line from pom.xml:
<scope>provided</scope>
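As a sketch, the spark-streaming dependency in pom.xml would end up looking like this after the fix (the version and Scala suffix below are illustrative assumptions, not taken from the original project):

```xml
<!-- spark-streaming dependency with the provided scope removed,
     so the jar is on the classpath when IntelliJ runs the app directly.
     Version and artifactId suffix are assumptions for illustration. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.3.1</version>
    <!-- <scope>provided</scope>  removed: needed at runtime in IntelliJ -->
</dependency>
```

If the project is also deployed with spark-submit, keep in mind that shipping the streaming jar in the assembly is then redundant, since the cluster provides it.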