Spark reading HBase fails with NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword(Ljava/lang/String;

A NoSuchMethodError is generally caused by a JAR (dependency) conflict.

java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword(Ljava/lang/String;)[C

 at org.apache.spark.SSLOptions$$anonfun$8.apply(SSLOptions.scala:188)
 at org.apache.spark.SSLOptions$$anonfun$8.apply(SSLOptions.scala:188)
 at scala.Option.orElse(Option.scala:289)
 at org.apache.spark.SSLOptions$.parse(SSLOptions.scala:188)
 at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:117)
 at org.apache.spark.SparkEnv$.create(SparkEnv.scala:236)
 at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
 at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
 at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
 at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
 at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
 at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
 at scala.Option.getOrElse(Option.scala:121)
 at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
 at com.lcc.source.hbase.staticTable.ReadStaticHbase.readHbase(ReadStaticHbase.scala:33)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:498)
 at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
 at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
 at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
 at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
 at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
 at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
 at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
 at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
 at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
 at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
 at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
 at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
 at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
 at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
 at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
 at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:51)
 at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
 at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
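A quick way to confirm which JAR a conflicting class is actually loaded from is to ask the JVM at runtime. A minimal sketch (the `WhichJar` class name is mine, for illustration only):

```java
import java.security.CodeSource;

// Hypothetical diagnostic: print the JAR a class was actually loaded from,
// to spot the stale Hadoop artifact behind a NoSuchMethodError.
public class WhichJar {
    public static String locate(String className) {
        try {
            Class<?> cls = Class.forName(className);
            CodeSource src = cls.getProtectionDomain().getCodeSource();
            // Bootstrap classes (e.g. java.lang.String) have no code source.
            return src == null ? "(bootstrap classloader)" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "(class not found: " + className + ")";
        }
    }

    public static void main(String[] args) {
        // On a classpath like the one above, this would point at the
        // hadoop-common 2.5.1 JAR that shadows the expected 2.6.0 one.
        System.out.println(locate("org.apache.hadoop.conf.Configuration"));
    }
}
```

Running this inside the failing application pinpoints the offending JAR without guessing from the Maven tree.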

The problem is caused by a JAR conflict, but when I inspected the Maven dependencies I found no conflict at all, which was very confusing.

[Screenshot: Maven dependency view showing no apparent conflict]
Then I clicked into org.apache.hadoop.conf.Configuration.getPassword and found that the class was loaded from a Hadoop 2.5.1 JAR.
However, I had not declared Hadoop globally; I had only added Spark 2.4, Kudu 1.2, and HBase 1.2. Suspecting a version mismatch, I checked the CDH cluster's HDFS version, which was 2.6.0, and added the matching dependency to Maven:

	<dependency>
	    <groupId>org.apache.hadoop</groupId>
	    <artifactId>hadoop-client</artifactId>
	    <version>2.6.0</version>
	</dependency>


Then I excluded the conflicting dependency:

[Screenshot: Maven exclusion of the conflicting Hadoop 2.5.1 artifact]

Even though only the 2.5.1 artifact was excluded here and 2.6.0 was still marked in red, I did not exclude it, and in the end the program ran successfully.
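The exclusion shown in the screenshot can be sketched like this. Note that `hbase-client` as the carrier of the transitive 2.5.1 Hadoop JAR is an assumption for illustration; run `mvn dependency:tree -Dincludes=org.apache.hadoop` to see which of your dependencies actually pulls it in, and adjust accordingly:

```xml
<!-- Hypothetical sketch: exclude the transitive Hadoop 2.5.1 artifact so the
     explicitly declared hadoop-client 2.6.0 wins dependency mediation.
     Replace hbase-client with whichever dependency your tree shows as the
     real source of the 2.5.1 JAR. -->
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.2.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```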


Origin blog.csdn.net/qq_21383435/article/details/93087305