IntelliJ error: A master URL must be set in your configuration

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/08/13 17:33:33 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.0.1; using 219.223.197.110 instead (on interface eth0)
18/08/13 17:33:33 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
18/08/13 17:33:35 INFO SparkContext: Running Spark version 2.3.1
18/08/13 17:33:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/08/13 17:33:36 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:367)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:838)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:85)
    at com.scalalearn.scala.main.LogAnalyzerAppMain$.main(LogAnalyzerAppMain.scala:65)
    at com.scalalearn.scala.main.LogAnalyzerAppMain.main(LogAnalyzerAppMain.scala)
18/08/13 17:33:36 ERROR Utils: Uncaught exception in thread main
java.lang.NullPointerException
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$postApplicationEnd(SparkContext.scala:2389)
    at org.apache.spark.SparkContext$$anonfun$stop$1.apply$mcV$sp(SparkContext.scala:1904)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1360)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1903)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:579)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:838)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:85)
    at com.scalalearn.scala.main.LogAnalyzerAppMain$.main(LogAnalyzerAppMain.scala:65)
    at com.scalalearn.scala.main.LogAnalyzerAppMain.main(LogAnalyzerAppMain.scala)
18/08/13 17:33:36 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:367)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:838)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:85)
    at com.scalalearn.scala.main.LogAnalyzerAppMain$.main(LogAnalyzerAppMain.scala:65)
    at com.scalalearn.scala.main.LogAnalyzerAppMain.main(LogAnalyzerAppMain.scala)
18/08/13 17:33:36 INFO ShutdownHookManager: Shutdown hook called

Process finished with exit code 1
Solution:

Go to Run > Edit Configurations... > Application > "My project name" > Configuration,

and set the VM options field to -Dspark.master=local.

-Dspark.master=local tells the Spark program to run in local mode, so the SparkContext no longer fails for lack of a master URL.
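Alternatively, the master can be set in code on the SparkConf instead of through a VM option. Below is a minimal sketch of that approach for a Spark Streaming app like the one in the stack trace; the object name, app name, and batch interval are illustrative placeholders, and hard-coding "local[*]" is usually only appropriate for local development, since a packaged job normally receives its master from spark-submit.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object LogAnalyzerLocalExample {
  def main(args: Array[String]): Unit = {
    // Set the master programmatically so no -Dspark.master VM option is needed.
    val conf = new SparkConf()
      .setAppName("LogAnalyzerLocalExample")
      .setMaster("local[*]") // run locally, using all available cores

    // Illustrative 5-second batch interval.
    val ssc = new StreamingContext(conf, Seconds(5))

    // ... define the streaming job here ...

    ssc.start()
    ssc.awaitTermination()
  }
}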

Reposted from blog.csdn.net/appleyuchi/article/details/81633941