java.lang.NoSuchMethodError: scala.util.Properties$.coloredOutputEnabled()Z

While testing Spark 3.0 today, I had to move the Scala version of my Spark environment up to the 2.12.x line.
At first I set the Scala version to 2.12.0:

<properties>
        <scala.version>2.12.0</scala.version>
        <spark.version>3.0.0</spark.version>
</properties>

<dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
</dependencies>

Then I ran a simple piece of code that reads a table from MySQL:

package com.gw.sparkSql

import org.apache.spark.sql.{DataFrame, SparkSession}

/**
  * Read data from a MySQL table with Spark SQL.
  */
object DataFromMysql {

  def main(args: Array[String]): Unit = {

    // Create the SparkSession
    val sparkSession: SparkSession = SparkSession.builder().master("local[4]").appName("DataFromMysql").getOrCreate()

    // Reading over JDBC requires the url, dbtable and connection properties to be set up front
    val data: DataFrame = sparkSession.read.format("jdbc")
      .option("url", "jdbc:mysql://localhost:3306/test11")
      .option("dbtable", "blood")
      .option("user", "root")
      .option("password", "123")
      .load()

    // Fetch and display the data
    data.show()
    data.printSchema()

    // Release resources
    sparkSession.stop()
  }
}
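(A side note on the POM above: it does not include a MySQL JDBC driver, which the JDBC read needs on the classpath. Something along these lines is assumed to be present as well; the connector version here is only an illustrative choice.)

        <dependency>
            <!-- MySQL JDBC driver used by the spark.read.format("jdbc") call; version is illustrative -->
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.47</version>
        </dependency>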

Running it produced the following error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.util.Properties$.coloredOutputEnabled()Z
	at scala.reflect.internal.TypeDebugging$typeDebug$.<init>(TypeDebugging.scala:69)
	at scala.reflect.internal.SymbolTable.typeDebug$lzycompute$1(SymbolTable.scala:28)
	at scala.reflect.internal.SymbolTable.typeDebug(SymbolTable.scala:28)
	at scala.reflect.runtime.JavaUniverseForce.force(JavaUniverseForce.scala:67)
	at scala.reflect.runtime.JavaUniverseForce.force$(JavaUniverseForce.scala:18)
	at scala.reflect.runtime.JavaUniverse.force(JavaUniverse.scala:30)
	at scala.reflect.runtime.JavaUniverse.init(JavaUniverse.scala:162)
	at scala.reflect.runtime.JavaUniverse.<init>(JavaUniverse.scala:93)
	at scala.reflect.runtime.package$.universe$lzycompute(package.scala:29)
	at scala.reflect.runtime.package$.universe(package.scala:29)
	at org.apache.spark.sql.catalyst.ScalaReflection$.<init>(ScalaReflection.scala:50)
	at org.apache.spark.sql.catalyst.ScalaReflection$.<clinit>(ScalaReflection.scala)
	at org.apache.spark.sql.catalyst.encoders.RowEncoder$.serializerFor(RowEncoder.scala:77)
	at org.apache.spark.sql.catalyst.encoders.RowEncoder$.apply(RowEncoder.scala:66)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$1(Dataset.scala:92)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:763)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:89)
	at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:427)
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:279)
	at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:268)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:268)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:203)
	at com.gw.sparkSql.DataFromMysql$.main(DataFromMysql.scala:20)
	at com.gw.sparkSql.DataFromMysql.main(DataFromMysql.scala)

Searching on Baidu, I found only the following two articles that touch on this problem, and neither offers a solution:
https://stackoverflow.com/questions/60121386/what-dependencies-i-need-to-run-spark-read-csv-in-local-distribution#
https://stackoverflow.com/questions/59935502/java-lang-nosuchmethoderror-boolean-scala-util-properties-coloredoutputenable

So there was nothing left to do but dig into it myself.
From the type of the error, this is clearly a Scala problem. The first thing I thought of was the compatibility between Scala 2.12.x and the Java version (mine is Java 1.8), but a search showed that this is not the cause.

Next I turned my attention to the Scala version itself. First I changed it to:

<scala.version>2.12.1</scala.version>

It still reported the same error.

Then I bumped Scala to the latest release in the 2.12.x line:

<scala.version>2.12.12</scala.version>

Unexpectedly, it worked perfectly.

[Screenshot: the program now runs and data.show() prints the table]

That was pure blind luck, though; what follows is the real root-cause analysis.

Cause Analysis:
There are generally only two causes of a NoSuchMethodError:

1. A dependency is genuinely missing from the classpath; in that case, just see which package is missing and add it;
2. There is a version conflict between dependencies.

As for why a jar conflict produces a NoSuchMethodError, see:
https://blog.csdn.net/ycccsdn/article/details/90549347

Here it is clearly the second cause.
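A quick way to see which scala-library actually wins on the runtime classpath is to print the Scala version and the jar it was loaded from; a minimal sketch (the object name here is arbitrary, not part of the project above):

object ScalaVersionCheck {

  def main(args: Array[String]): Unit = {
    // The standard library reports its own version, e.g. "version 2.12.0"
    println(scala.util.Properties.versionString)

    // Locate the jar that scala.util.Properties was loaded from;
    // this is the scala-library that Maven's conflict resolution actually picked
    val source = scala.util.Properties.getClass.getProtectionDomain.getCodeSource
    if (source != null) println(source.getLocation)
  }
}

With the POM above, this should point at the 2.12.0 jar even though Spark 3.0 itself was built against a newer 2.12.x.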
First, recall the familiar dependency setup from Spark 2.x. I usually use version 2.2.0, and the following snippet has not changed for years, so I never dug into why it is written this way:

<properties>
        <scala.version>2.11.8</scala.version>
        <spark.version>2.2.0</spark.version>
</properties>

In fact, the Scala version and the Spark version have to correspond. With Spark 2.2.0 there is a reason I picked 2.11.8 for Scala: look at what the spark-core dependency itself pulls in (for example with mvn dependency:tree -Dincludes=org.scala-lang:scala-library, or your IDE's Maven dependency view). spark-core 2.2.0 depends on scala-library, and that version is exactly 2.11.8.

[Screenshot: dependency tree of spark-core_2.11 2.2.0, showing scala-library 2.11.8]

So after upgrading to Spark 3.0, the dependency versions cannot be written casually either. Let's check which scala-library version the spark-core package of Spark 3.0 depends on:

[Screenshot: dependency tree of spark-core_2.12 3.0.0, showing scala-library 2.12.10]

Now the problem is obvious: simply match the declared Scala version to the one used inside spark-core.

<properties>
        <scala.version>2.12.10</scala.version>
        <spark.version>3.0.0</spark.version>
</properties>

The declared Scala version just needs to be greater than or equal to 2.12.10, the version spark-core 3.0.0 was built against. With 2.12.0 declared directly in the POM, Maven's "nearest wins" resolution kept the old scala-library on the classpath, while spark-core still pulled in scala-reflect 2.12.10, which calls scala.util.Properties$.coloredOutputEnabled(), a method the old scala-library does not have; that is exactly the NoSuchMethodError shown above.
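An alternative that should also work (a sketch; not the fix used above) is to drop the explicit scala-library dependency entirely and let spark-core pull in the matching version transitively:

<dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <!-- No explicit scala-library: spark-core_2.12 3.0.0 brings in scala-library 2.12.10 itself -->
</dependencies>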


Origin blog.csdn.net/weixin_44455388/article/details/107767760