Summary of Spark Streaming Problems When Upgrading from Spark 1.6 to Spark 2.1 (2018)

Version note for the upgrade: the target versions can be read from the Spark 2.1 Maven configuration in pom.xml:

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.1.0</version>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.1.0</version>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.11</artifactId>
    <version>1.6.3</version>
  </dependency>

  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-actors</artifactId>
    <version>2.11.8</version>
    <scope>compile</scope>
  </dependency>
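Note that spark-streaming-kafka_2.11 above is still a 1.6.x artifact. Spark 2.1 publishes its Kafka integration under Kafka-versioned artifact IDs instead; as a hedged alternative (assuming the job uses the old 0.8 direct-stream API, and only if the 1.6.3 artifact causes conflicts), the matching 2.1.0 artifact would be:

  <!-- assumption: the job uses the Kafka 0.8 API; this is the corresponding Spark 2.1.0 artifact -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
    <version>2.1.0</version>
  </dependency>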



Problem 1:

More than one scala library found in the build path (/cache_V0.1/lib/spark-assembly-1.6.0-cdh5.8.3-hadoop2.6.0-cdh5.8.3.jar, D:/softinstall/eclipse/plugins/org.scala-lang.scala-library_2.11.8.v20160304-115712-1706a37eb8.jar, /cache_V0.1/lib/spark-assembly-1.6.0-cdh5.8.3-hadoop2.6.0-cdh5.8.3.jar). At least one has an incompatible version. Please update the project build path so it contains only one compatible scala library. cache_V0.1   Unknown Scala Classpath Problem


Problem 1 analysis: the Scala library bundled inside the old Spark 1.6 assembly jar on the build path conflicts with the Scala 2.11.8 library required by Spark 2.1, so more than one Scala version ends up on the classpath.

Problem 1 solution:

1) Remove spark-assembly-1.6.0-cdh5.8.3-hadoop2.6.0-cdh5.8.3.jar from the Eclipse build path (Build Path ----> the jar added via Add JARs), i.e. stop loading this jar from the local project.

2) In Eclipse, select the Scala Maven project ---> right-click ---> Scala ----> Set the Scala Installation... -----> choose the 2.11.8 Scala installation.
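It can also help to let Maven resolve the Scala library explicitly instead of picking it up from a hand-added assembly jar, so that exactly one Scala version reaches the classpath. A minimal pom sketch, assuming all Spark artifacts keep the _2.11 suffix as above:

  <!-- assumption: declare scala-library explicitly so the project resolves a single Scala 2.11.8 -->
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.8</version>
  </dependency>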


Problem 2:

[ERROR] scalac error: bad option: '-make:transitive'

[ERROR] Failed to execute goal org.scala-tools:maven-scala-plugin:2.15.0:compile (default) on project cache_chinalife_amis: 
wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1(Exit value: 1) -> [Help 1]

Problem 2 analysis: '-make:transitive' is no longer supported after the upgrade to Spark 2.1 (the Scala 2.11 compiler does not accept the -make option).

Problem 2 solution: simply comment out this line in pom.xml: <!-- <parameter value="-make:transitive"/> -->
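For context, a sketch of where this flag typically sits in a scala-archetype-style pom; exact element names vary by plugin version (in this project it was a <parameter> element, in other poms it is an <arg> under the maven-scala-plugin configuration), so comment out whichever element carries -make:transitive:

  <plugin>
    <groupId>org.scala-tools</groupId>
    <artifactId>maven-scala-plugin</artifactId>
    <version>2.15.0</version>
    <configuration>
      <args>
        <!-- removed: the Scala 2.11 compiler no longer accepts this option -->
        <!-- <arg>-make:transitive</arg> -->
        <arg>-dependencyfile</arg>
        <arg>${project.build.directory}/.scala_dependencies</arg>
      </args>
    </configuration>
  </plugin>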


Problem 3: after packaging, running the jar on the server fails because the main class cannot be found:

java.lang.ClassNotFoundException: StartCBPS8
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:340)
at org.apache.spark.util.Utils$.classForName(Utils.scala:176)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Problem 3 analysis: caused by mismatched JDK versions; the server runs JDK 1.8 while the jar was packaged locally with JDK 1.7.

Problem 3 solution: switch the local JDK to 1.8 and repackage.
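To make packaging independent of whichever JDK happens to be active on the build machine, the Java level can also be pinned in the pom; a minimal sketch, assuming the standard maven-compiler-plugin is used:

  <!-- assumption: pin the bytecode level to 1.8 so packaging no longer depends on the local default JDK -->
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
      <source>1.8</source>
      <target>1.8</target>
    </configuration>
  </plugin>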


Problem 4: when running on the server, the class org.apache.spark.scheduler.SparkListenerInterface cannot be found:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/scheduler/SparkListenerInterface
at StartCBPS8.main(StartCBPS8.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.scheduler.SparkListenerInterface
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more

Problem 4 analysis: the job was submitted on the server with the spark-submit command, which on this CDH cluster still launches Spark 1.6, while the application was built against Spark 2.1, so it references classes that do not exist in the 1.6 runtime. The wrong launcher was used; the job must be submitted with the spark2-submit command provided by the Spark 2 parcel.

Problem 4 solution: change this submit command:

 nohup /usr/bin/spark-submit --class StartCBPS8  --master yarn --deploy-mode client  --conf spark.port.maxRetries=50  --jars $BASEDIR/dom4j-1.3.jar,$BASEDIR/scala-actors-2.11.8.jar,$BASEDIR/ojdbc5.jar,$BASEDIR/jedis-2.7.3.jar,$BASEDIR/SDK.jar,$BASEDIR/fastjson-1.1.33.jar  --executor-memory 1g --executor-cores 1 ../bin/cache_chinalife_amis-0.0.1.jar $PROVICE $2 $3 > ../logs/cache-cbps8_$PROVICE-start.out 2>&1 &

to this:

 nohup /usr/bin/spark2-submit --class StartCBPS8  --master yarn --deploy-mode client  --conf spark.port.maxRetries=50  --jars $BASEDIR/dom4j-1.3.jar,$BASEDIR/scala-actors-2.11.8.jar,$BASEDIR/ojdbc5.jar,$BASEDIR/jedis-2.7.3.jar,$BASEDIR/SDK.jar,$BASEDIR/fastjson-1.1.33.jar  --executor-memory 1g --executor-cores 1 ../bin/cache_chinalife_amis-0.0.1.jar $PROVICE $2 $3 > ../logs/cache-cbps8_$PROVICE-start.out 2>&1 &


Problem 5: incorrect parameters in the spark submit command.

Problem 5 analysis: the parameters accepted by spark2-submit and spark-submit are not fully compatible.

Problem 5 solution: switch to a submit command that spark2-submit accepts, dropping the two options that were not accepted: --driver-memory 128m and --driver-java-options "-Xms64m -Xmx128m -XX:PermSize=64M -XX:MaxPermSize=128M" (the PermSize flags are obsolete on JDK 8 in any case, since the permanent generation was removed). Change this:


nohup /opt/cloudera/parcels/CDH/lib/spark/bin/spark-submit --class StartCBPS8  --master yarn --deploy-mode client --driver-memory 128m  --driver-java-options "-Xms64m -Xmx128m -XX:PermSize=64M -XX:MaxPermSize=128M"  --conf spark.port.maxRetries=40   --jars $BASEDIR/dom4j-1.3.jar,$BASEDIR/scala-actors-2.10.6.jar,$BASEDIR/ojdbc5.jar,$BASEDIR/jedis-2.7.3.jar,$BASEDIR/SDK.jar,$BASEDIR/fastjson-1.1.33.jar  --executor-memory 1g --executor-cores 1 ../bin/cache_chinalife_amis-0.0.1.jar $PROVICE $2 $3  > ../logs/cache-cbps8_$PROVICE-start.out 2>&1 &


to this:

nohup /usr/bin/spark2-submit --class StartCBPS8  --master yarn --deploy-mode client  --conf spark.port.maxRetries=50  --jars $BASEDIR/dom4j-1.3.jar,$BASEDIR/scala-actors-2.11.8.jar,$BASEDIR/ojdbc5.jar,$BASEDIR/jedis-2.7.3.jar,$BASEDIR/SDK.jar,$BASEDIR/fastjson-1.1.33.jar  --executor-memory 1g --executor-cores 1 ../bin/cache_chinalife_amis-0.0.1.jar $PROVICE $2 $3 > ../logs/cache-cbps8_$PROVICE-start.out 2>&1 &



Reposted from blog.csdn.net/wumiqing1/article/details/78955199