Error running a Spark Streaming job: Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM

19/12/18 11:21:38 INFO SharedState: Warehouse path is 'file:/C:/Users/baron/Desktop/sparktest/spark-warehouse/'.
Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
day01.SparkSql_01$.main(SparkSql_01.scala:16)
day01.SparkSql_01.main(SparkSql_01.scala)
	at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2278)
	at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2274)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2274)
	at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2353)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:85)
	at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:837)
	at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
	at day01.SparkSql_01$.main(SparkSql_01.scala:19)
	at day01.SparkSql_01.main(SparkSql_01.scala)
19/12/18 11:21:38 INFO SparkContext: Invoking stop() from shutdown hook
19/12/18 11:21:38 INFO SparkUI: Stopped Spark web UI at http://192.168.23.1:4040
19/12/18 11:21:38 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/12/18 11:21:38 INFO MemoryStore: MemoryStore cleared
19/12/18 11:21:38 INFO BlockManager: BlockManager stopped
19/12/18 11:21:38 INFO BlockManagerMaster: BlockManagerMaster stopped
19/12/18 11:21:38 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/12/18 11:21:38 INFO SparkContext: Successfully stopped SparkContext
19/12/18 11:21:38 INFO ShutdownHookManager: Shutdown hook called
19/12/18 11:21:38 INFO ShutdownHookManager: Deleting directory C:\Users\baron\AppData\Local\Temp\spark-2bb06698-69c9-4c14-b55f-462782dcaa5b

The cause: my code had one extra line. Building a SparkSession already creates a SparkContext (SparkSql_01.scala:16 in the trace above), so when the StreamingContext constructor then tried to create its own SparkContext (SparkSql_01.scala:19), the JVM refused a second one. On top of that, a plain SparkSession batch job ends as soon as the main program returns, while my Spark Streaming job has to run 24/7. The extra line was:

val spark: SparkSession = SparkSession.builder().config(conf).getOrCreate()
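
A minimal sketch of one way to fix it, reusing the object name day01.SparkSql_01 from the stack trace; the master URL, batch interval, and socket source are placeholder assumptions. The idea is to hand the SparkSession's existing SparkContext to the StreamingContext so only one context ever exists in the JVM:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SparkSql_01 {
  def main(args: Array[String]): Unit = {
    // App name and master are illustrative; adjust for your cluster
    val conf = new SparkConf().setAppName("SparkSql_01").setMaster("local[2]")

    // This creates the one and only SparkContext in the JVM
    val spark: SparkSession = SparkSession.builder().config(conf).getOrCreate()

    // Reuse that SparkContext instead of letting StreamingContext spawn a second one
    val ssc = new StreamingContext(spark.sparkContext, Seconds(5))

    // Placeholder source: feed it with e.g. `nc -lk 9999`
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.print()

    ssc.start()
    ssc.awaitTermination() // block here so the streaming job keeps running 24/7
  }
}

Alternatively, if the SparkSession is not actually needed, deleting that extra line and building the StreamingContext straight from the SparkConf also leaves just one SparkContext. The spark.driver.allowMultipleContexts = true setting mentioned in the error message merely silences the check and is best avoided.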

Reposted from blog.csdn.net/weixin_43548518/article/details/103594565