Get JavaSparkContext from a SparkSession

moe :

I am using a SparkSession to run my Spark application because I use a lot of spark-sql features. I would like to use a JavaSparkContext to create an RDD from a list. But via the session I can only get an ordinary SparkContext. Is there a way to convert the context in this direction?

Serhiy :

Once you have obtained the SparkContext (for example from your SparkSession), you can use:

SparkContext sc = sparkSession.sparkContext(); // the plain SparkContext exposed by your SparkSession
JavaSparkContext jsc = JavaSparkContext.fromSparkContext(sc);

This returns a new JavaSparkContext instance that wraps the same underlying SparkContext, which is not a problem as long as you maintain just one active SparkContext.
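For completeness, here is a minimal sketch of the whole round trip, from SparkSession to an RDD built from a Java list. The class name, app name, and local[*] master are illustrative assumptions, not part of the original answer:

import java.util.Arrays;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class SessionToJavaContext {
    public static void main(String[] args) {
        // Build (or reuse) the single SparkSession for the application.
        SparkSession spark = SparkSession.builder()
                .appName("session-to-javasparkcontext") // assumed name for this sketch
                .master("local[*]")                     // assumed: local run for demonstration
                .getOrCreate();

        // Wrap the session's existing SparkContext; no second context is created.
        JavaSparkContext jsc = JavaSparkContext.fromSparkContext(spark.sparkContext());

        // Create an RDD from a plain Java list, as asked in the question.
        JavaRDD<Integer> rdd = jsc.parallelize(Arrays.asList(1, 2, 3, 4, 5));
        System.out.println("count = " + rdd.count());

        spark.stop();
    }
}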
