Spark SQL getting started: after creating a SparkSession, import spark.implicits._ fails with: error: value implicits is not a member of...
The failing session looks like this:
scala> import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.SparkSession
scala>
scala> val spark = SparkSession
spark: org.apache.spark.sql.SparkSession.type = org.apache.spark.sql.SparkSession$@614da024
scala> .builder()
res12: spark.Builder = org.apache.spark.sql.SparkSession$Builder@11722350
scala> .appName("Spark SQL basic example")
res13: spark.Builder = org.apache.spark.sql.SparkSession$Builder@11722350
scala> .config("spark.some.config.option", "some-value")
res14: spark.Builder = org.apache.spark.sql.SparkSession$Builder@11722350
scala> .getOrCreate()
20/02/02 18:10:32 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
res15: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@1062f767
scala>
scala> // For implicit conversions like converting RDDs to DataFrames
scala> import spark.implicits._
<console>:40: error: value implicits is not a member of object org.apache.spark.sql.SparkSession
import spark.implicits._
Cause of the error: the spark-shell REPL compiles and evaluates each line on its own. So `val spark = SparkSession` binds `spark` to the SparkSession *companion object* (note the type `SparkSession.type` in the output above), and the subsequent `.builder()...getOrCreate()` lines only operate on the intermediate results `res12`..`res15`; the built session is never assigned to `spark`. Since the companion object has no `implicits` member, the import fails.
Solution:
Enter the whole chained call as a single expression, as shown below:
scala> import org.apache.spark.sql.SparkSession
scala> val spark = SparkSession.builder().appName("Spark SQL basic example").config("spark.some.config.option", "some-value").getOrCreate()
scala> import spark.implicits._
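
Alternatively, if you prefer to keep the builder chain on multiple lines, the REPL's `:paste` mode compiles everything you enter as a single block, so the whole chain is assigned to `spark` as intended (a sketch of the same session using `:paste`; the prompts and messages are what the standard Scala REPL prints):

scala> :paste
// Entering paste mode (ctrl-D to finish)

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("Spark SQL basic example")
  .config("spark.some.config.option", "some-value")
  .getOrCreate()

// press Ctrl-D here
// Exiting paste mode, now interpreting.

scala> import spark.implicits._

Because the block is compiled as one unit, `spark` is now a SparkSession instance (not the companion object), and `spark.implicits._` imports cleanly.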