Insufficient JVM memory prevents SparkContext from starting

When trying to run the program directly in Spark (for example, from the IDE), we hit the following error:

java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.

Obviously, the JVM memory allocated to the application is not enough for the SparkContext to start. But how do we set it?

 

Looking at the Spark source code, we find the relevant method:

/**
 * Return the total amount of memory shared between execution and storage, in bytes.
 */
private def getMaxMemory(conf: SparkConf): Long = {
  val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)
  val reservedMemory = conf.getLong("spark.testing.reservedMemory",
    if (conf.contains("spark.testing")) 0 else RESERVED_SYSTEM_MEMORY_BYTES)
  val minSystemMemory = reservedMemory * 1.5
  if (systemMemory < minSystemMemory) {
    throw new IllegalArgumentException(s"System memory $systemMemory must " +
      s"be at least $minSystemMemory. Please use a larger heap size.")
  }
  val usableMemory = systemMemory - reservedMemory
  val memoryFraction = conf.getDouble("spark.memory.fraction", 0.75)
  (usableMemory * memoryFraction).toLong
}
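
In this version of Spark, RESERVED_SYSTEM_MEMORY_BYTES is defined as 300 MB, which is exactly where the threshold in the error message comes from. A quick check of the arithmetic:

// RESERVED_SYSTEM_MEMORY_BYTES = 300 MB in UnifiedMemoryManager
val reservedMemory = 300L * 1024 * 1024             // 314572800 bytes
val minSystemMemory = (reservedMemory * 1.5).toLong // 471859200, the number in the error message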


So the key line is val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory): unless spark.testing.memory is set, the system memory defaults to the JVM's maximum heap size.

The definition and documentation of conf.getLong() are:

getLong(key: String, defaultValue: Long): Long
Get a parameter as a Long, falling back to a default if not set.

So we should set spark.testing.memory in the conf.
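
A small sketch of getLong's fallback behavior (run with Spark on the classpath; the 1 GB value is just an illustration):

import org.apache.spark.SparkConf

val conf = new SparkConf()
// Key absent: getLong falls back to the default, i.e. the JVM max heap.
println(conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory))
// Key present: the configured value is returned instead.
conf.set("spark.testing.memory", "1073741824")
println(conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)) // 1073741824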

A bit of experimentation shows there are two places to set it:

1. In the source code, after creating the conf (a complete runnable sketch follows the list):

    val conf = new SparkConf().setAppName("word count")
    conf.set("spark.testing.memory", "2147480000") // any value larger than 512m works

2. In Eclipse's Run Configuration, on the Arguments tab, under VM arguments, add the following line (again, any value larger than 512m works):

-Dspark.testing.memory=1073741824

Other parameters can also be set dynamically here, for example -Dspark.master=spark://hostname:7077.
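
For reference, here is a minimal self-contained sketch combining the above (the app name, master URL, and input path are placeholders, not from the original program):

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("word count")
      .setMaster("local[*]") // placeholder: run locally from the IDE
      // Option 1: set spark.testing.memory in code (any value larger than 512m).
      .set("spark.testing.memory", "2147480000")
    // Option 2 (alternative): omit the set(...) above and instead pass
    //   -Dspark.testing.memory=1073741824
    // as a VM argument in the run configuration.

    val sc = new SparkContext(conf)
    val counts = sc.textFile("input.txt") // placeholder path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.take(10).foreach(println)
    sc.stop()
  }
}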

After either change, running the program again no longer reports this error.

 
