Use spark.{driver,executor}.userClassPathFirst

Hi

I'm on Spark 1.6.1, and we happen to override the YARN classpath in yarn-site.xml. I have a simple job that reads Avro files using the com.databricks.avro library. When I submit it as follows, it works and reports success:

 
  ./bin/spark-submit --class com.test.MyJob --verbose --master yarn-cluster \
    --conf spark.yarn.user.classpath.first=true --num-executors 5 /tmp/fat-app.jar

Then I get a warning that spark.yarn.user.classpath.first is deprecated and that I should use spark.{driver,executor}.userClassPathFirst instead. So I change the command to the following, and the application fails with an exception:

 
  ./bin/spark-submit --class com.test.MyJob --verbose --master yarn-cluster \
    --conf spark.executor.userClassPathFirst=true \
    --conf spark.driver.userClassPathFirst=true \
    --num-executors 5 /tmp/fat-app.jar
 
  Exception in thread "main" org.apache.spark.SparkException: Application application_1467198279864_0129 finished with failed status
          at org.apache.spark.deploy.yarn.Client.run(Client.scala:1034)
          at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
          at org.apache.spark.deploy.yarn.Client.main(Client.scala)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.j
  Max number of executor failures (10) reached

So are those options not interchangeable, or am I doing something wrong here?

The exact exception is here:

 
  16/06/29 15:31:17 ERROR server.TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 7914409773016323976
  java.lang.ClassCastException: cannot assign instance of scala.concurrent.duration.FiniteDuration to field org.apache.spark.rpc.RpcTimeout.duration of type scala.concurrent.duration.FiniteDuration in instance of org.apache.spark.rpc.RpcTimeout
          at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2083)
          at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1261)
          at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1996)
          at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
          at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
          at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
          at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
          at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
          at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
          at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
          at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
          at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
          at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
          at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
          at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
          at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
          at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109)
          at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:258)
          at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
          at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:310)
          at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:257)
          at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
          at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:256)
          at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:588)
          at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:570)
          at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:149)
          at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102)
          at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104)
          at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
          at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
          at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
          at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
          at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
          at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
          at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
          at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
          at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
          at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
          at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:86)
          at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
          at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
          at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
          at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
          at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
          at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
          at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
          at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
          at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
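For what it's worth, this kind of "cannot assign instance of X to field ... of type X" error is the classic signature of the same class being defined by two different class loaders. With spark.{driver,executor}.userClassPathFirst=true, Spark loads classes from the application jar in a child-first loader, so if the fat jar bundles its own copy of Spark or Scala classes (such as scala.concurrent.duration.FiniteDuration), the JVM ends up with two distinct runtime classes of the same name and deserialization fails. The following is a minimal, self-contained sketch of that mechanism (ChildFirstDemo and Payload are hypothetical names, not from Spark); it mimics a child-first loader by re-defining a class from the parent's own bytes:

```scala
import java.io.ByteArrayOutputStream

object ChildFirstDemo {
  // A child-first loader that re-defines Payload from the parent's class bytes,
  // mimicking a fat jar that bundles a class the host JVM already provides.
  class ChildFirstLoader extends ClassLoader(classOf[ChildFirstLoader].getClassLoader) {
    override def loadClass(name: String, resolve: Boolean): Class[_] = {
      if (name == "Payload") {
        val in = getParent.getResourceAsStream("Payload.class")
        val buf = new ByteArrayOutputStream()
        var b = in.read()
        while (b != -1) { buf.write(b); b = in.read() }
        in.close()
        val bytes = buf.toByteArray
        defineClass(name, bytes, 0, bytes.length) // second, distinct copy of Payload
      } else super.loadClass(name, resolve)       // everything else delegates to the parent
    }
  }

  def main(args: Array[String]): Unit = {
    val parentCopy = Class.forName("Payload")                    // loaded by the app loader
    val childCopy  = new ChildFirstLoader().loadClass("Payload") // loaded child-first
    // Same name, different defining loaders => different runtime classes.
    println(parentCopy == childCopy)                             // false
    val obj = childCopy.getDeclaredConstructor().newInstance()
    // Any cast or deserialized field assignment across the two copies
    // fails, just like the RpcTimeout.duration assignment above.
    println(parentCopy.isInstance(obj))                          // false
  }
}

class Payload
```

If this is indeed the cause, a common workaround (instead of userClassPathFirst) is to mark the Spark and Scala dependencies as "provided" in the build so the fat jar does not bundle classes the cluster already supplies, and only shade the libraries that actually conflict.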

Regards


Reposted from blog.csdn.net/fly_time2012/article/details/82810709