Today I was using Spark for RDD operations. When I ran the program, the following error appeared: `Caused by: java.lang.ClassNotFoundException: scala.Product$class`.
After investigating, I found that the Scala version the spark-core dependency in my project was built against was inconsistent with the Scala version of the project itself.
For the project, which I built with Maven, the relevant part of the pom.xml is structured as follows:
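The original snippet was not preserved, so here is a minimal sketch of what the relevant dependencies look like; the Spark version (2.4.0) is an assumption for illustration:

```xml
<!-- Sketch only: pin one Scala version and reuse it everywhere -->
<properties>
  <scala.version>2.12.0</scala.version>
</properties>

<dependencies>
  <!-- The project's Scala library: 2.12.0 -->
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
  <!-- The _2.12 suffix must match the Scala version above -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.0</version> <!-- assumed Spark version -->
  </dependency>
</dependencies>
```

The key point is the `_2.12` suffix on the Spark artifactId, which names the Scala binary version the artifact was compiled against.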
The cause of my error: I had written `spark-core_2.11` as the dependency, while my project used Scala 2.12.0, and this mismatch caused the failure at runtime. (Scala 2.11 compiled trait implementations into synthetic `$class` classes such as `scala.Product$class`; Scala 2.12 no longer generates them, so code built against 2.11 cannot find them on a 2.12 classpath.) The `_2.12` suffix in the artifactId, as in `spark-core_2.12`, must match the project's Scala version, 2.12.0.
Note: check that all versions are consistent, i.e. the Scala library version and the `_2.xx` suffix of every Spark artifact in the pom.xml must agree.