Spark: Problem Log

Here is a write-up of the problems I ran into while working on my graduation project, shared so everyone can learn from them!

1

  • Problem:
Caused by: java.net.URISyntaxException: Relative path in absolute URI: file:C:/Users/dx/Desktop/HrtDisDetection/spark-warehouse
    at java.net.URI.checkPath(URI.java:1823)
    at java.net.URI.<init>(URI.java:745)
    at org.apache.hadoop.fs.Path.initialize(Path.java:202)
  • Solution:

Set the spark.sql.warehouse.dir property when constructing the SparkConf:

  SparkConf conf = new SparkConf().setAppName(appName).setMaster(appType).set("spark.sql.warehouse.dir", "file:///");
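
On Spark 2.x with the SparkSession API, the same property can be set through the builder. A minimal sketch in Scala (the app name is only a placeholder):

  import org.apache.spark.sql.SparkSession

  // Pointing spark.sql.warehouse.dir at a well-formed URI avoids the
  // "Relative path in absolute URI" error on Windows.
  val spark = SparkSession.builder()
    .appName("HrtDisDetection")
    .master("local[*]")
    .config("spark.sql.warehouse.dir", "file:///")
    .getOrCreate()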

2

  • Problem:

Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileWithMode0(Ljava/lang/String;JJJI)Ljava/io/FileDescriptor;
  • Solution:
Overwrite hadoop.dll in C:\Windows\System32 with one built for the Hadoop version you are actually running; that is, find the matching Hadoop release and take its native binaries (hadoop.dll, winutils.exe).
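
If overwriting files under C:\Windows\System32 is not an option, another common workaround is to point Hadoop at a local directory that holds the matching native binaries. A minimal sketch; the C:\hadoop path is an assumption, substitute wherever you unpacked winutils.exe and hadoop.dll:

  // Must run before any Spark/Hadoop class is initialized.
  // C:\hadoop is a hypothetical path whose bin\ folder holds
  // winutils.exe and hadoop.dll for your Hadoop version.
  System.setProperty("hadoop.home.dir", "C:\\hadoop")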

3

  • Problem:

Packaging the Scala job and submitting it to the Spark cluster fails with:

Caused by: java.lang.ClassNotFoundException: scala.runtime.java8.JFunction2$mcIII$sp
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  • Solution:

This is caused by a Scala version mismatch: the job was compiled with a different Scala version than the one the cluster's Spark build uses (the scala.runtime.java8.JFunction classes, for instance, only exist from Scala 2.12 on). Recompile the local code with the same Scala version as the cluster, as sketched below.
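
For example, if the cluster's Spark build is compiled against Scala 2.11, pin the local build to the same line. A build.sbt sketch; the exact version numbers are assumptions, read them off your cluster:

  // build.sbt
  scalaVersion := "2.11.8"  // must match the Scala line of the cluster's Spark build

  // "provided": the cluster ships Spark at runtime, so do not bundle it into the jar
  libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"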

4

  • Problem:
Exception in thread "main" org.apache.spark.SparkException: Could not parse Master URL:Spark://localhost:7077
  • Solution:
The master URL scheme is case-sensitive: writing spark:// in lowercase fixes it:

// Note the lowercase "spark://" scheme; "Spark://" cannot be parsed.
val sparkConf: SparkConf = new SparkConf()
  .setAppName("Connect Spark Cluster")
  .setMaster("spark://192.168.31.130:7077")
  .set("spark.ui.port", "7077")
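
If in doubt about the exact master URL, the Spark Master web UI (port 8080 on the master host by default) displays it at the top of the page in the spark://host:port form.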


Reprinted from blog.csdn.net/it_dx/article/details/79901710