Apache Spark driver memory

Drakker :

I've been trying to install and run a simple Java Apache Spark application in IntelliJ on Windows, but I hit an error I can't solve. I installed Spark via Maven. This is the error I get:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/03/20 23:53:23 INFO SparkContext: Running Spark version 2.0.0-cloudera1-SNAPSHOT
19/03/20 23:53:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/03/20 23:53:24 INFO SecurityManager: Changing view acls to: Drakker
19/03/20 23:53:24 INFO SecurityManager: Changing modify acls to: Drakker
19/03/20 23:53:24 INFO SecurityManager: Changing view acls groups to: 
19/03/20 23:53:24 INFO SecurityManager: Changing modify acls groups to: 
19/03/20 23:53:24 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(Drakker); groups with view permissions: Set(); users  with modify permissions: Set(Drakker); groups with modify permissions: Set()
19/03/20 23:53:25 INFO Utils: Successfully started service 'sparkDriver' on port 50007.
19/03/20 23:53:25 INFO SparkEnv: Registering MapOutputTracker
19/03/20 23:53:25 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:212)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:194)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:308)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:165)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:260)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:429)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at Spark.App.main(App.java:16)
19/03/20 23:53:25 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:212)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:194)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:308)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:165)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:260)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:429)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at Spark.App.main(App.java:16)

I tried setting the driver memory manually, but it didn't work. I also tried installing Spark locally, but changing the driver memory from the command prompt didn't help either.

This is the code:

package Spark;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import java.util.Arrays;
import java.util.List;

public class App
{
    public static void main(String[] args)
    {
        SparkConf conf = new SparkConf().setAppName("Spark").setMaster("local");
//        conf.set("spark.driver.memory", "471859200"); // tried this; it didn't help
        JavaSparkContext sc = new JavaSparkContext(conf); // <-- throws here (App.java:16)

        List<Integer> data = Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9);
        JavaRDD<Integer> rdd = sc.parallelize(data);
        JavaRDD<Integer> list = rdd.map(s -> s);       // identity map
        int totalLines = list.reduce((a, b) -> a + b); // sum of all elements
        System.out.println(totalLines);
    }
}

I get the error when instantiating JavaSparkContext. Does anyone have any idea how to solve this?

Thanks!

sev7e0 :

If you use Eclipse, you can go to Run > Run Configurations... > Arguments > VM arguments and set a max heap size like -Xmx512m.

In IntelliJ IDEA, you can set Run/Debug Configurations > VM options: -Xmx512m.
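
The "System memory 259522560" figure in the error is simply the JVM's max heap as Spark sees it (about 247 MB, the IDE's default), so a quick way to confirm the VM option actually took effect is to print it before creating the context. A minimal check; HeapCheck is just an illustrative name:

public class HeapCheck
{
    public static void main(String[] args)
    {
        // Spark derives its "system memory" from the JVM's max heap,
        // so this is the number the 471859200-byte check runs against.
        long maxHeap = Runtime.getRuntime().maxMemory();
        System.out.println(maxHeap + " bytes (~" + maxHeap / (1024 * 1024) + " MB)");
        // With -Xmx512m this should print roughly 480-512 MB,
        // comfortably above Spark's 450 MB minimum.
    }
}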

In your code, you can try conf.set("spark.testing.memory", "2147480000") (the value is in bytes, roughly 2 GB).
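
For reference, a minimal sketch of where that setting goes in the code above. The key must be set on the SparkConf before the JavaSparkContext is constructed, since the check fires during context creation:

package Spark;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class App
{
    public static void main(String[] args)
    {
        SparkConf conf = new SparkConf()
                .setAppName("Spark")
                .setMaster("local")
                // Overrides the memory figure Spark's UnifiedMemoryManager
                // validates, so the 471859200-byte minimum check passes.
                .set("spark.testing.memory", "2147480000");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // ... rest of the job as before ...

        sc.close();
    }
}

Note that spark.testing.memory is meant for Spark's own test suite; for anything beyond local experiments, raising the driver heap (-Xmx512m, or --driver-memory when launching via spark-submit) is the cleaner fix.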
