Configuring spark.yarn.jars

1. Why:

From the Spark documentation: "To make Spark runtime jars accessible from YARN side, you can specify spark.yarn.archive or spark.yarn.jars. For details please refer to Spark Properties. If neither spark.yarn.archive nor spark.yarn.jars is specified, Spark will create a zip file with all jars under $SPARK_HOME/jars and upload it to the distributed cache." Uploading the jars once avoids this per-job packaging and upload.
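As the quoted documentation notes, spark.yarn.archive is the alternative: package all the jars into one archive (the jars must sit at the archive's root), upload it once, and point the property at it. A minimal sketch; the archive name and HDFS path here are illustrative, not required:

```shell
# package everything under $SPARK_HOME/jars into a single zip,
# with the jar files at the root of the archive
cd $SPARK_HOME/jars && zip -q ../spark-libs.zip *.jar

# upload the archive to HDFS
hadoop fs -mkdir -p /spark-yarn/archive
hadoop fs -put $SPARK_HOME/spark-libs.zip /spark-yarn/archive/
```

Then in spark-defaults.conf, set spark.yarn.archive instead of spark.yarn.jars:

```
spark.yarn.archive    hdfs://hadoop001:9000/spark-yarn/archive/spark-libs.zip
```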

2. Setup:

  • Upload the jars under spark/jars to HDFS
# create the target directory
[hadoop@hadoop001 jars]$ hadoop fs -mkdir -p /spark-yarn/jars/
# upload the jars
[hadoop@hadoop001 jars]$ hadoop fs -put *.jar /spark-yarn/jars/
# then add the property to spark-defaults.conf
[hadoop@hadoop001 conf]$ vi spark-defaults.conf

spark.yarn.jars                    hdfs://hadoop001:9000/spark-yarn/jars/*.jar
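To confirm the setting takes effect, submit a job in yarn mode and check the driver output: the warning "Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME" should no longer appear. A sketch using the SparkPi example bundled with a stock Spark distribution (the examples jar path may differ in your install):

```shell
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_*.jar 100
```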



Reposted from blog.csdn.net/weixin_44131414/article/details/102554648