Common Spark Exceptions

Error 1: Error:(31, 126) Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
    val kafkaDatasString: Dataset[(String, String)] = kafkaDatas.selectExpr("CAST(key AS STRING)","CAST(value AS STRING)").as[(String,String)]

Cause

The error message itself points at the fix: "Product types (case classes) are supported by importing spark.implicits._"

The implicit encoders provided by spark.implicits._ are not in scope, so Spark cannot build an Encoder for the (String, String) tuple.

Solution

Add:

import spark.implicits._

Still getting the error after adding spark.implicits._?

Watch the placement of import spark.implicits._: it must come before the code that needs the encoders, so put it as early as possible, right after the SparkSession is created. Note that it is imported from the SparkSession instance (here named spark), not from a package.
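A minimal sketch of the correct placement (the Kafka bootstrap server and topic name here are assumptions for illustration; only kafkaDatas and the selectExpr line come from the snippet above):

```scala
import org.apache.spark.sql.{Dataset, SparkSession}

object KafkaToDataset {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("KafkaToDataset")
      .master("local[*]")
      .getOrCreate()

    // Import from the SparkSession *instance*, and do it before
    // any .as[...] conversion that needs an Encoder.
    import spark.implicits._

    val kafkaDatas = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // assumed address
      .option("subscribe", "test-topic")                   // assumed topic
      .load()

    // Compiles now, because the tuple Encoder from spark.implicits._ is in scope
    val kafkaDatasString: Dataset[(String, String)] = kafkaDatas
      .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
      .as[(String, String)]
  }
}
```

If the import is placed below the .as[(String, String)] call, or imported in a different scope, the compiler reports the same "Unable to find encoder" error.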

Error 2: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 9.0 failed 1 times, most recent failure: Lost task 0.0 in stage 9.0 (TID 808, localhost, executor driver): java.lang.NullPointerException

WARN TaskSetManager: Lost task 0.0 in stage 9.0 (TID 808, localhost, executor driver): java.lang.NullPointerException

Possible cause:

A local variable declared inside a method shadowed a variable declared outside the method, so the outer variable was never assigned and was still null when the task used it.

Solution:

Drop the shadowing declaration inside the method and assign to the outer variable directly, so it is actually initialized before the Spark job runs.
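A sketch of the shadowing bug (class and field names are hypothetical, not from the original post):

```scala
class Parser {
  // Field meant to be filled in by init(); starts out null
  var lookup: Map[String, String] = _

  def initBuggy(): Unit = {
    // BUG: "val" declares a *new local* variable that shadows the field,
    // so the field above is never assigned and stays null.
    val lookup = Map("a" -> "1")
  }

  def initFixed(): Unit = {
    // Assign the field directly; no shadowing declaration.
    lookup = Map("a" -> "1")
  }

  // Throws java.lang.NullPointerException if only initBuggy() was called
  def parse(key: String): String = lookup(key)
}
```

When parse is invoked inside a transformation such as map, the NPE is thrown on the executor and surfaces exactly as the "Lost task ... java.lang.NullPointerException" message above, with no useful stack frame pointing at the real culprit: the stray val.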



Reposted from blog.csdn.net/qq_44065303/article/details/105576410