Spark job submitted to YARN fails with: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/java8

Error message:

ERROR Client: Application diagnostics message: User class threw exception: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction2$mcIII$sp
        at com.czxy.WordCount_Spark$.main(WordCount_Spark.scala:29)
        at com.czxy.WordCount_Spark.main(WordCount_Spark.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:685)
Caused by: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction2$mcIII$sp
        ... 7 more
Caused by: java.lang.ClassNotFoundException: scala.runtime.java8.JFunction2$mcIII$sp
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 7 more

Exception in thread "main" org.apache.spark.SparkException: Application application_1616148997795_0008 finished with failed status
        at org.apache.spark.deploy.yarn.Client.run(Client.scala:1150)
        at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1530)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

What the error tells us:
NoClassDefFoundError means a class that was present at compile time (here scala.runtime.java8.JFunction2$mcIII$sp) cannot be found on the classpath at runtime.
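The missing class, scala.runtime.java8.JFunction2$mcIII$sp, is the specialized (Int, Int) => Int function interface that the Scala 2.12 compiler targets when it emits lambdas; it does not exist in a Scala 2.11 runtime. The post does not include the real WordCount_Spark.scala, but line 29 is most likely a reduceByKey over Int counts; a minimal sketch of such a job (the code below is illustrative, not the author's source) looks like this:

    // Minimal WordCount sketch (illustrative; the post does not show the real source).
    // Compiled with Scala 2.12, the (Int, Int) => Int lambda in reduceByKey is wired to
    // scala.runtime.java8.JFunction2$mcIII$sp via invokedynamic; a Scala 2.11 cluster
    // runtime cannot load that class, hence the BootstrapMethodError above.
    package com.czxy

    import org.apache.spark.{SparkConf, SparkContext}

    object WordCount_Spark {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("WordCount_Spark"))
        sc.textFile(args(0))
          .flatMap(_.split(" "))
          .map(word => (word, 1))
          .reduceByKey(_ + _)        // the specialized (Int, Int) => Int function
          .saveAsTextFile(args(1))
        sc.stop()
      }
    }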

Checks:

  1. Check the Scala version on the Spark cluster:
     ll $SPARK_HOME/jars/scala*
     (screenshot: listing of the scala-* jars, whose file names show the cluster's Scala version)
  2. Check the Scala version of the local environment:
     Local Scala version: 2.12.13

Conclusion: the Scala version on the Spark cluster does not match the local Scala version, and this mismatch causes the error.
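Besides listing the jars, the mismatch can be confirmed from both sides with one line of Scala: evaluate it in spark-shell on the cluster and in the local Scala REPL, then compare the output (the version strings in the comments are illustrative, matching the conclusion above):

    // Prints the version of the scala-library actually on the classpath.
    scala.util.Properties.versionNumberString
    // in the cluster's spark-shell  : e.g. "2.11.12" (Spark built against Scala 2.11)
    // in the local project's REPL   : "2.12.13"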

Solution

  1. Go to the official Scala website and download and install Scala 2.11.12:
     https://www.scala-lang.org/

  2. In IDEA, switch the project to the newly installed Scala 2.11.12:

  • Right-click the project name and choose Open Module Settings
    (screenshot: the Project Structure dialog)

  • On the left, click Global Libraries (under Platform Settings)

  • Click + and select the Scala 2.11.12 SDK
    (screenshot: selecting the Scala 2.11.12 SDK)

  3. Delete the project's target directory, rebuild, then upload the new jar to the cluster and run it again (for sbt builds, see the build sketch below).
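The post does not show the project's build file; if the project is built with sbt, the same fix amounts to pinning scalaVersion to the cluster's 2.11 line so that %% resolves the matching _2.11 Spark artifacts (a minimal sketch, assuming sbt and a Spark 2.4.x cluster; adjust the Spark version to your own cluster):

    // build.sbt (minimal sketch, assuming an sbt build and a Spark 2.4.x / Scala 2.11 cluster)
    name := "WordCount_Spark"
    version := "1.0"

    // Must match the Scala version of the jars under $SPARK_HOME/jars on the cluster
    scalaVersion := "2.11.12"

    // "provided": the cluster supplies Spark at runtime; %% appends _2.11 to the artifact name
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5" % "provided"

For a Maven project the equivalent change is to set the Scala version property and use the _2.11 suffix on the Spark artifact IDs in the pom.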

Reposted from blog.csdn.net/zh2475855601/article/details/115014282