Spark configuration options

./bin/spark-shell --master yarn --deploy-mode client
./bin/spark-submit --master yarn --deploy-mode cluster

(Note: interactive shells such as spark-shell only support client deploy mode; cluster mode requires submitting an application with spark-submit.)

There are two deploy modes that can be used to launch Spark applications on YARN.
In cluster mode, the Spark driver runs inside an application master process which is managed by YARN on the cluster, and the client can go away after initiating the application.
In client mode, the driver runs in the client process, and the application master is only used for requesting resources from YARN.
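The two modes can be sketched with spark-submit as follows; the main class and jar name are hypothetical placeholders, not from the original post:

```
# Client mode: the driver runs in this local client process, and the YARN
# Application Master is only used to request resources.
./bin/spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.MyApp \
  my-app.jar

# Cluster mode: the driver runs inside the YARN Application Master on the
# cluster, so this client process may exit after submitting the job.
./bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  my-app.jar
```

These commands assume a working YARN cluster and a HADOOP_CONF_DIR (or YARN_CONF_DIR) pointing at its configuration.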

Property                  Default  Meaning
spark.yarn.am.cores       1        Number of cores to use for the YARN Application Master in client mode. In cluster mode, use spark.driver.cores instead.
spark.executor.instances  2        The number of executors for static allocation. With spark.dynamicAllocation.enabled, the initial set of executors will be at least this large.
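These properties can also be set in conf/spark-defaults.conf rather than per submission; a minimal sketch with illustrative values:

```
# conf/spark-defaults.conf (values are illustrative, not recommendations)
spark.yarn.am.cores        2
spark.executor.instances   4
# With spark.dynamicAllocation.enabled=true, spark.executor.instances
# acts as the size of the initial set of executors.
```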

YARN: The --num-executors option to the Spark YARN client controls how many executors it will allocate on the cluster (the spark.executor.instances configuration property), while --executor-memory (spark.executor.memory) and --executor-cores (spark.executor.cores) control the resources per executor.
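The command-line flags above map one-to-one onto configuration properties, so the following two invocations are equivalent (the application jar name is a hypothetical placeholder):

```
# Flag form:
./bin/spark-submit --master yarn \
  --num-executors 4 \
  --executor-memory 4g \
  --executor-cores 2 \
  my-app.jar

# Equivalent --conf form:
./bin/spark-submit --master yarn \
  --conf spark.executor.instances=4 \
  --conf spark.executor.memory=4g \
  --conf spark.executor.cores=2 \
  my-app.jar
```

Explicit --conf settings and command-line flags both override values from spark-defaults.conf.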


Reprinted from blog.csdn.net/zhixingheyi_tian/article/details/83827258