Spark Worker error on executor launch: No subfolder can be created in

Copyright notice: this is the author's original post; do not repost without permission. https://blog.csdn.net/Koprvhdix/article/details/79739356

The fix, up front: spark-env.sh has a parameter, SPARK_LOCAL_DIRS, which points at the directories where shuffle data is spilled to disk. This error means one of those directories does not exist on the worker. Create the missing directory and restart the worker, then rebalance cores and memory across the workers.
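As a minimal sketch of the fix (the paths, master URL, and script names below are assumptions, not from the original post; adjust them to your deployment), create every directory listed in SPARK_LOCAL_DIRS before restarting the worker:

```shell
# Hypothetical local-dir list; substitute your actual disk mounts.
LOCAL_DIRS=/tmp/spark/local1,/tmp/spark/local2

# SPARK_LOCAL_DIRS is comma-separated; create each entry if missing.
for d in $(echo "$LOCAL_DIRS" | tr ',' ' '); do
  mkdir -p "$d"
done

# Then make the setting permanent in $SPARK_HOME/conf/spark-env.sh:
#   export SPARK_LOCAL_DIRS=/tmp/spark/local1,/tmp/spark/local2
# and restart the worker (script names vary by Spark version):
#   $SPARK_HOME/sbin/stop-slave.sh
#   $SPARK_HOME/sbin/start-slave.sh spark://<master-host>:7077
```

Make sure the directories are owned by (or writable by) the user the worker process runs as, or the same error will recur.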

The full error, for reference:

18/03/29 09:59:01 ERROR Worker: Failed to launch executor app-20180329063203-1549/1642 for Mobius.di.2::bdp-141. [dispatcher-event-loop-21]
java.io.IOException: No subfolder can be created in .
    at org.apache.spark.deploy.worker.Worker$$anonfun$receive$1$$anonfun$9.apply(Worker.scala:499)
    at org.apache.spark.deploy.worker.Worker$$anonfun$receive$1$$anonfun$9.apply(Worker.scala:484)
    at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
    at scala.collection.AbstractMap.getOrElse(Map.scala:59)
    at org.apache.spark.deploy.worker.Worker$$anonfun$receive$1.applyOrElse(Worker.scala:484)
    at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:117)
    at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
    at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
    at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
