Tuning the Spark JVM

1. Practical application

val sc = new SparkContext(new SparkConf().
  setAppName("product3_source").
  // Use Kryo instead of the default Java serialization for smaller, faster serialization.
  set("spark.serializer", "org.apache.spark.serializer.KryoSerializer").
  // Use the pre-1.6 (legacy) memory manager so the two fraction settings below take effect.
  set("spark.memory.useLegacyMode", "true").
  // Shrink the storage (cache) pool and enlarge the shuffle pool for a shuffle-heavy job.
  set("spark.storage.memoryFraction", "0.2").
  set("spark.shuffle.memoryFraction", "0.7")
)
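Note that since Spark 1.6 the unified memory manager has replaced these two separate pools; spark.storage.memoryFraction and spark.shuffle.memoryFraction only take effect when spark.memory.useLegacyMode is set to "true", which is why the example above enables legacy mode.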

2. Calculating the actual memory

spark.storage.safetyFraction specifies the proportion of the storage pool kept as a safety zone; the default value is 0.9.
spark.storage.memoryFraction specifies the fraction of the executor heap allotted to storage; the default value is 0.6.

So by default, storage can actually use 0.9 × 0.6 = 0.54 of the executor heap.

spark.shuffle.safetyFraction specifies the safety-zone proportion for shuffle; the default value is 0.8.
spark.shuffle.memoryFraction specifies the fraction of the executor heap allotted to shuffle; the default value is 0.2.

So by default, shuffle can actually use 0.8 × 0.2 = 0.16 of the executor heap.
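To make the arithmetic concrete, here is a minimal Scala sketch (plain Scala, not a Spark API; the object name and the 10 GB heap are hypothetical) that computes the effective pool sizes for the default fractions and for the tuned values from section 1:

object LegacyMemoryPools {
  def main(args: Array[String]): Unit = {
    val executorHeapMB = 10 * 1024.0  // hypothetical --executor-memory 10g

    // Defaults: storage = 0.9 * 0.6 = 0.54, shuffle = 0.8 * 0.2 = 0.16 of the heap.
    val defaultStorage = executorHeapMB * 0.9 * 0.6  // ~5530 MB
    val defaultShuffle = executorHeapMB * 0.8 * 0.2  // ~1638 MB

    // Tuned fractions from section 1: storage 0.2, shuffle 0.7.
    val tunedStorage = executorHeapMB * 0.9 * 0.2    // ~1843 MB
    val tunedShuffle = executorHeapMB * 0.8 * 0.7    // ~5734 MB

    println(f"default: storage=$defaultStorage%.0f MB, shuffle=$defaultShuffle%.0f MB")
    println(f"tuned:   storage=$tunedStorage%.0f MB, shuffle=$tunedShuffle%.0f MB")
  }
}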

3. spark.yarn.executor.memoryOverhead

While an executor is running, its actual memory use may exceed executor-memory, so a portion of extra memory is reserved for each executor. spark.yarn.executor.memoryOverhead specifies the size of this extra memory.

Therefore:

executorMem = executorMemory + spark.yarn.executor.memoryOverhead
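For illustration, the sketch below estimates the total memory an executor requests from YARN; the 10% factor and 384 MB floor match the usual default of spark.yarn.executor.memoryOverhead (max(384 MB, 10% of executor memory)), and the object and method names are hypothetical:

object ExecutorContainerSize {
  val MinOverheadMB = 384    // minimum overhead Spark reserves on YARN
  val OverheadFactor = 0.10  // default overhead fraction of executor memory

  def containerMemoryMB(executorMemoryMB: Int): Int =
    executorMemoryMB + math.max(MinOverheadMB, (executorMemoryMB * OverheadFactor).toInt)

  def main(args: Array[String]): Unit = {
    // e.g. --executor-memory 4g => 4096 + max(384, 409) = 4505 MB requested from YARN
    println(containerMemoryMB(4096))
  }
}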
