Compiling Hive 3.1.2 against Spark 3.1.1

1. Modify the pom file

Update the version properties in Hive's root pom.xml so the build targets Spark 3.1.1, Scala 2.12, Hadoop 3.2.2, and the matching Guava release:

    <spark.version>3.1.1</spark.version>
    <scala.binary.version>2.12</scala.binary.version>
    <scala.version>2.12.10</scala.version>
    <hadoop.version>3.2.2</hadoop.version>
    <guava.version>27.0-jre</guava.version>
    <druid.version>0.12.3</druid.version>
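For orientation, these properties sit in the `<properties>` section of the root pom.xml; only the values above change, everything else stays as shipped (sketch, other properties elided):

```xml
<!-- root pom.xml - only the listed values change; all other properties keep their stock values -->
<properties>
  <spark.version>3.1.1</spark.version>
  <scala.binary.version>2.12</scala.binary.version>
  <scala.version>2.12.10</scala.version>
  <hadoop.version>3.2.2</hadoop.version>
  <guava.version>27.0-jre</guava.version>
  <druid.version>0.12.3</druid.version>
</properties>
```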

2. Source files that need patching

With the versions bumped, the build fails in several modules. The compiler errors below identify each file that needs a change.

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-llap-common: Compilation failure: Compilation failure: 
[ERROR] /opt/workspace/hive/llap-common/src/java/org/apache/hadoop/hive/llap/AsyncPbRpcProxy.java:[173,16] method addCallback in class 
com.google.common.util.concurrent.Futures cannot be applied to given types;

[ERROR] /opt/workspace/hive/llap-common/src/java/org/apache/hadoop/hive/llap/AsyncPbRpcProxy.java:[274,12] method addCallback in class com.google.common.util.concurrent.Futures cannot be applied to given types;
[ERROR]   required: com.google.common.util.concurrent.ListenableFuture<V>,com.google.common.util.concurrent.FutureCallback<? super V>,java.util.concurrent.Executor
[ERROR]   found: com.google.common.util.concurrent.ListenableFuture<java.lang.Void>,<anonymous com.google.common.util.concurrent.FutureCallback<java.lang.Void>>
[ERROR]   reason: cannot infer type-variable(s) V
[ERROR]     (actual and formal argument lists differ in length)
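This failure (and the similar `addCallback` ones in hive-llap-tez, hive-exec, and hive-llap-server further down) all have the same cause: newer Guava removed the two-argument `Futures.addCallback(future, callback)` overload, so each call site must pass an `Executor` explicitly. A minimal sketch of the change, with variable names illustrative rather than taken from the actual Hive code:

```java
// Requires: com.google.common.util.concurrent.{Futures, MoreExecutors}
//
// Before (compiles against Guava 19, not against 27.0-jre):
//   Futures.addCallback(future, callback);
//
// After - supply an Executor; directExecutor() preserves the old
// "run the callback on the completing thread" behaviour:
Futures.addCallback(future, callback, MoreExecutors.directExecutor());
```

The same three-argument rewrite applies at every file and line number the compiler reports.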


[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-llap-tez: Compilation failure: Compilation failure: 
[ERROR] /opt/workspace/hive/llap-tez/src/java/org/apache/hadoop/hive/llap/tezplugins/LlapTaskSchedulerService.java:[747,14] method addCallback in class com.google.common.util.concurrent.Futures cannot be applied to given types;
[ERROR]   required: com.google.common.util.concurrent.ListenableFuture<V>,com.google.common.util.concurrent.FutureCallback<? super V>,java.util.concurrent.Executor
[ERROR]   found: com.google.common.util.concurrent.ListenableFuture<java.lang.Void>,org.apache.hadoop.hive.llap.tezplugins.scheduler.LoggingFutureCallback
[ERROR]   reason: cannot infer type-variable(s) V
[ERROR]     (actual and formal argument lists differ in length)


[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-spark-client: Compilation failure: Compilation failure: 
[ERROR] /opt/workspace/hive/spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java:[22,24] cannot find symbol
[ERROR]   symbol:   class Accumulator
[ERROR]   location: package org.apache.spark
[ERROR] /opt/workspace/hive/spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java:[23,24] cannot find symbol
[ERROR]   symbol:   class AccumulatorParam
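`org.apache.spark.Accumulator` and `AccumulatorParam` were removed in Spark 3, so `SparkCounter` has to be ported to the `AccumulatorV2` API. A rough sketch assuming a long-valued counter (an illustration of the replacement API, not the actual patch):

```java
// Spark 3 replacement for Accumulator<Long>/AccumulatorParam:
// LongAccumulator implements AccumulatorV2<Long, Long>.
import org.apache.spark.util.LongAccumulator;

LongAccumulator accumulator = sparkContext.longAccumulator(name);
accumulator.add(increment);          // accumulate on executors
long value = accumulator.value();    // read the merged value on the driver
```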


[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-spark-client: Compilation failure: Compilation failure: 
[ERROR] /opt/workspace/hive/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java:[50,39] cannot find symbol
[ERROR]   symbol:   method shuffleBytesWritten()
[ERROR]   location: class org.apache.spark.executor.ShuffleWriteMetrics
[ERROR] /opt/workspace/hive/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java:[51,36] cannot find symbol
[ERROR]   symbol:   method shuffleWriteTime()
[ERROR]   location: class org.apache.spark.executor.ShuffleWriteMetrics
[ERROR] -> [Help 1]
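The `shuffleBytesWritten()`/`shuffleWriteTime()` accessors are Spark 1.x names; the `org.apache.spark.executor.ShuffleWriteMetrics` shipped with Spark 3.1.1 exposes `bytesWritten()` and `writeTime()` instead. The fix in Hive's wrapper class is a straight rename (field and variable names illustrative):

```java
// Old (Spark 1.x API, no longer compiles):
//   shuffleBytesWritten = metrics.shuffleWriteMetrics().shuffleBytesWritten();
//   shuffleWriteTime    = metrics.shuffleWriteMetrics().shuffleWriteTime();
//
// New (Spark 2/3 API):
shuffleBytesWritten = metrics.shuffleWriteMetrics().bytesWritten();
shuffleWriteTime    = metrics.shuffleWriteMetrics().writeTime();
```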

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-exec: Compilation failure: Compilation failure: 
[ERROR] /opt/workspace/hive/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/WorkloadManager.java:[1095,12] method addCallback in class com.google.common.util.concurrent.Futures cannot be applied to given types;
[ERROR]   required: com.google.common.util.concurrent.ListenableFuture<V>,com.google.common.util.concurrent.FutureCallback<? super V>,java.util.concurrent.Executor
[ERROR]   found: com.google.common.util.concurrent.ListenableFuture<capture#1 of ?>,com.google.common.util.concurrent.FutureCallback<java.lang.Object>
[ERROR]   reason: cannot infer type-variable(s) V

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:testCompile (default-testCompile) on project hive-exec: Compilation failure
[ERROR] /opt/workspace/hive/ql/src/test/org/apache/hadoop/hive/ql/stats/TestStatsUtils.java:[34,39] package org.spark_project.guava.collect does not exist
[ERROR] 
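Spark 3 no longer publishes the relocated `org.spark_project.guava` packages, so the test import in `TestStatsUtils` must point at plain Guava instead. The fix is an import swap; the class name shown here is illustrative, use whichever class the test actually imports:

```java
// Before: import org.spark_project.guava.collect.Sets;  (shaded package, gone in Spark 3)
// After:
import com.google.common.collect.Sets;
```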

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:testCompile (default-testCompile) on project hive-exec: Compilation failure
[ERROR] /opt/workspace/hive/ql/src/test/org/apache/hadoop/hive/ql/exec/tez/SampleTezSessionState.java:[121,12] method addCallback in class com.google.common.util.concurrent.Futures cannot be applied to given types;
[ERROR]   required: com.google.common.util.concurrent.ListenableFuture<V>,com.google.common.util.concurrent.FutureCallback<? super V>,java.util.concurrent.Executor
[ERROR]   found: com.google.common.util.concurrent.ListenableFuture<java.lang.Boolean>,<anonymous com.google.common.util.concurrent.FutureCallback<java.lang.Boolean>>
[ERROR]   reason: cannot infer type-variable(s) V
[ERROR]     (actual and formal argument lists differ in length)

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-llap-server: Compilation failure: Compilation failure: 
[ERROR] /opt/workspace/hive/llap-server/src/java/org/apache/hadoop/hive/llap/daemon/impl/AMReporter.java:[162,12] method addCallback in class com.google.common.util.concurrent.Futures cannot be applied to given types;
[ERROR]   required: com.google.common.util.concurrent.ListenableFuture<V>,com.google.common.util.concurrent.FutureCallback<? super V>,java.util.concurrent.Executor
[ERROR]   found: com.google.common.util.concurrent.ListenableFuture<java.lang.Void>,<anonymous com.google.common.util.concurrent.FutureCallback<java.lang.Void>>
[ERROR]   reason: cannot infer type-variable(s) V
[ERROR]     (actual and formal argument lists differ in length)
[ERROR] /opt/workspace/hive/llap-server/src/java/org/apache/hadoop/hive/llap/daemon/impl/TaskExecutorService.java:[178,12] method addCallback in class com.google.common.util.concurrent.Futures cannot be applied to given types;
[ERROR]   required: com.google.common.util.concurrent.ListenableFuture<V>,com.google.common.util.concurrent.FutureCallback<? super V>,java.util.concurrent.Executor
[ERROR]   found: com.google.common.util.concurrent.ListenableFuture<capture#1 of ?>,org.apache.hadoop.hive.llap.daemon.impl.TaskExecutorService.WaitQueueWorkerCallback
[ERROR]   reason: cannot infer type-variable(s) V
[ERROR]     (actual and formal argument lists differ in length)
[ERROR] /opt/workspace/hive/llap-server/src/java/org/apache/hadoop/hive/llap/daemon/impl/LlapTaskReporter.java:[131,12] method addCallback in class com.google.common.util.concurrent.Futures cannot be applied to given types;
[ERROR]   required: com.google.common.util.concurrent.ListenableFuture<V>,com.google.common.util.concurrent.FutureCallback<? super V>,java.util.concurrent.Executor
[ERROR]   found: com.google.common.util.concurrent.ListenableFuture<java.lang.Boolean>,org.apache.hadoop.hive.llap.daemon.impl.LlapTaskReporter.HeartbeatCallback
[ERROR]   reason: cannot infer type-variable(s) V
[ERROR]     (actual and formal argument lists differ in length)
[ERROR] -> [Help 1]

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-druid-handler: Compilation failure
[ERROR] /opt/workspace/hive/druid-handler/src/java/org/apache/hadoop/hive/druid/serde/DruidScanQueryRecordReader.java:[46,61] <T>emptyIterator() is not public in com.google.common.collect.Iterators; cannot be accessed from outside package
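Guava made `Iterators.emptyIterator()` non-public, but the JDK has shipped an equivalent since Java 7, so `DruidScanQueryRecordReader` can switch to `java.util.Collections.emptyIterator()`. A self-contained check that the JDK version behaves as expected:

```java
import java.util.Collections;
import java.util.Iterator;

public class EmptyIteratorDemo {
    public static void main(String[] args) {
        // Drop-in replacement for the now package-private
        // com.google.common.collect.Iterators.emptyIterator():
        Iterator<String> it = Collections.emptyIterator();
        System.out.println(it.hasNext()); // prints "false"
    }
}
```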

3. Run the build

From the Hive source root:

mvn clean package -Pdist -DskipTests

4. Hive fails at runtime

With the rebuilt Hive deployed, running a query that launches a Spark job fails:

Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
        at org.apache.spark.deploy.SparkHadoopUtil$.$anonfun$appendSparkHadoopConfigs$6(SparkHadoopUtil.scala:481)
        at org.apache.spark.deploy.SparkHadoopUtil$.$anonfun$appendSparkHadoopConfigs$6$adapted(SparkHadoopUtil.scala:480)
        at scala.collection.TraversableLike$WithFilter.$anonfun$foreach$1(TraversableLike.scala:877)
        at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
        at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
        at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:876)
        at org.apache.spark.deploy.SparkHadoopUtil$.org$apache$spark$deploy$SparkHadoopUtil$$appendSparkHadoopConfigs(SparkHadoopUtil.scala:480)
        at org.apache.spark.deploy.SparkHadoopUtil$.org$apache$spark$deploy$SparkHadoopUtil$$appendS3AndSparkHadoopHiveConfigurations(SparkHadoopUtil.scala:454)
        at org.apache.spark.deploy.SparkHadoopUtil$.newConfiguration(SparkHadoopUtil.scala:427)
        at org.apache.spark.deploy.SparkSubmit.$anonfun$prepareSubmitEnvironment$2(SparkSubmit.scala:342)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:342)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

        at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:211) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hive.spark.client.SparkClientImpl$2.run(SparkClientImpl.java:491) ~[hive-exec-3.1.2.jar:3.1.2]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_202]
2021-05-24T17:20:12,167 ERROR [Thread-14] spark.SparkTask: Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 63df0c54-c8f5-495d-bcb9-135a6b9f4ac9)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session 63df0c54-c8f5-495d-bcb9-135a6b9f4ac9
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:215)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115)
        at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
        at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:76)
Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client '63df0c54-c8f5-495d-bcb9-135a6b9f4ac9'. Error: Child process (spark-submit) exited before connecting back with error log SLF4J: Class path contains multiple SLF4J bindings.
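The `NoSuchMethodError` on `Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V` means an old Guava is first on the classpath: that three-argument overload only exists in newer Guava releases, while Spark's bundled `jars/` directory still ships an old one, which Hadoop 3.2.2's `Configuration.set` then trips over. A common workaround is to swap Spark's Guava jar for the 27.0-jre that Hadoop ships; the paths and jar file names below are illustrative and depend on your distributions, so verify them before deleting anything:

```shell
# Replace Spark's bundled old Guava with Hadoop's guava-27.0-jre.
# Check the actual jar names under each directory first.
rm  $SPARK_HOME/jars/guava-14.0.1.jar
cp  $HADOOP_HOME/share/hadoop/common/lib/guava-27.0-jre.jar  $SPARK_HOME/jars/
```

The "multiple SLF4J bindings" warning in the child-process log is a separate, usually harmless symptom of the same kind of jar overlap between the Hive, Spark, and Hadoop lib directories.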

Reprinted from blog.csdn.net/docsz/article/details/117228175