Initializing Spark Objects


Spark 1.x and Spark 2.x differ in how the SparkContext is initialized. Below is how sc is set up in Spark 1.5.x and in Spark 2+:

1. Spark 2+

First, create a SparkSession object:

Custom settings are passed in through the config method; multiple config calls can be chained.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()                    // builder for constructing a SparkSession
  .appName("Word Count")                              // application name shown in the Spark web UI; a random name is used if unset
  .config("spark.some.config.option", "some-value")   // config options set here are propagated to both SparkConf and the SparkSession's own configuration
  .enableHiveSupport()                                // enable Hive support (metastore connectivity, Hive SerDes, Hive UDFs)
  .getOrCreate()                                      // return the existing SparkSession, or create one from the options set on this builder
  

val sc = spark.sparkContext   // obtain the underlying SparkContext from the SparkSession

spark.sql("select 123")       // run SQL directly through the SparkSession
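
For a slightly fuller illustration, SQL can also be run against a DataFrame registered as a temporary view. The sample data and the "people" view name below are made up for this sketch and are not from the original post:

import spark.implicits._

// Hypothetical sample data, used only for illustration.
val people = Seq(("Alice", 30), ("Bob", 25)).toDF("name", "age")
people.createOrReplaceTempView("people")

// Query the temporary view through the same SparkSession.
spark.sql("SELECT name FROM people WHERE age > 26").show()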

2. Spark 1.5.0

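In Spark 1.5.x there is no SparkSession; a sketch of the usual pattern is to build sc from a SparkConf and go through a SQLContext or HiveContext for SQL. The configuration values below mirror the Spark 2+ example above and are only illustrative:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf()
  .setAppName("Word Count")                        // application name shown in the Spark web UI
  .set("spark.some.config.option", "some-value")   // custom configuration, same idea as .config() in Spark 2+

val sc = new SparkContext(conf)                    // sc is created directly from the SparkConf

// SQL goes through a separate context rather than a SparkSession:
val sqlContext = new SQLContext(sc)
// or, for Hive support (requires Hive dependencies on the classpath):
val hiveContext = new HiveContext(sc)

sqlContext.sql("select 123")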
