Spark dynamic resource allocation

'spark.shuffle.service.enabled': 'true',
'spark.dynamicAllocation.enabled': 'true',
'spark.dynamicAllocation.initialExecutors': 50,
'spark.dynamicAllocation.minExecutors': 1,
'spark.dynamicAllocation.maxExecutors': 125,
'spark.sql.parquet.compression.codec': 'snappy',
'spark.yarn.executor.memoryOverhead': 4096,
'spark.speculation': 'true',
'spark.kryoserializer.buffer.max': '512m',
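As a sketch, the settings above can be collected into a plain Python dict and sanity-checked before being handed to Spark (the `check_dynamic_allocation` helper below is illustrative, not part of Spark; the key fact it encodes is that dynamic allocation requires the external shuffle service and sensible executor bounds):

```python
# The settings above as a dict, string-valued as Spark expects them.
spark_conf = {
    'spark.shuffle.service.enabled': 'true',
    'spark.dynamicAllocation.enabled': 'true',
    'spark.dynamicAllocation.initialExecutors': '50',
    'spark.dynamicAllocation.minExecutors': '1',
    'spark.dynamicAllocation.maxExecutors': '125',
    'spark.sql.parquet.compression.codec': 'snappy',
    'spark.yarn.executor.memoryOverhead': '4096',
    'spark.speculation': 'true',
    'spark.kryoserializer.buffer.max': '512m',
}

def check_dynamic_allocation(conf):
    """Return True if the dynamic-allocation settings are self-consistent."""
    if conf.get('spark.dynamicAllocation.enabled') != 'true':
        return True  # dynamic allocation off: nothing to validate
    # Dynamic allocation needs the external shuffle service so that
    # shuffle files survive executor removal.
    if conf.get('spark.shuffle.service.enabled') != 'true':
        return False
    # Executor bounds must satisfy min <= initial <= max.
    lo = int(conf['spark.dynamicAllocation.minExecutors'])
    init = int(conf['spark.dynamicAllocation.initialExecutors'])
    hi = int(conf['spark.dynamicAllocation.maxExecutors'])
    return lo <= init <= hi

# Each key/value pair would then be passed to
# SparkSession.builder.config(k, v) or as a spark-submit --conf flag.
```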

Reposted from www.cnblogs.com/jason-dong/p/10244156.html