While running some simple SQL-related test programs, I hit the exception below, which seemed weird at first. (The project required Spark 1.3.1.)
scala.reflect.internal.MissingRequirementError: class org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror
    at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
    at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
And the program is pretty simple:
val x = sqlCtx.udf.register("cude", (num: Integer) => num * num * num)
sqlCtx.sql("select cude(4) c").show()
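For context, here is a minimal, self-contained sketch of the failing program, assuming a local master and Spark 1.3.1 (the object name CudeTest is mine):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object CudeTest {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("cude-test").setMaster("local[*]"))
    val sqlCtx = new SQLContext(sc)

    // Registering a Scala UDF goes through catalyst's ScalaReflection,
    // which is where the MissingRequirementError is thrown in the IDE.
    sqlCtx.udf.register("cude", (num: Integer) => num * num * num)
    sqlCtx.sql("select cude(4) c").show()

    sc.stop()
  }
}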
Then I ran it again in the spark-shell REPL, and it worked fine. So what is the difference between the REPL and the IDE?
First, the Spark jars were one suspect, so I ran the code again in a new project that references Spark's assembly jar. It succeeded there too, so the Spark jars themselves were ruled out.
But when I looked into the exception stack trace again, I noticed that one of the frames in the call path is RootsBase.getModuleOrClass (shown above). So I dove into the Spark source, and an inspiration flashed through my mind: it's a class-compatibility problem!
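One quick way to sanity-check this hunch (a diagnostic sketch of my own, not from the bug report) is to compare which classloader loaded scala-reflect with the one that loaded Spark's catalyst classes; under an IDE launcher they can differ, so the reflection mirror cannot see the Spark classes:

import org.apache.spark.sql.catalyst.ScalaReflection

object ClassLoaderCheck {
  def main(args: Array[String]): Unit = {
    // The mirror behind typeOf is built from the classloader that loaded scala-reflect...
    val reflectLoader = scala.reflect.runtime.universe.getClass.getClassLoader
    // ...which is not necessarily the loader that can see Spark's classes.
    val sparkLoader = ScalaReflection.getClass.getClassLoader
    println(s"scala-reflect loaded by: $reflectLoader")
    println(s"spark catalyst loaded by: $sparkLoader")
  }
}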
After some googling, I found that this is a known DataFrame bug:
Registering table on RDD is giving MissingRequirementError [SPARK-5281]
i.e. https://github.com/apache/spark/pull/5981
The PR description explains: "Replaced calls to typeOf with typeTag[T].in(mirror). The convenience method assumes all types can be found in the classloader that loaded scala-reflect (the primordial classloader). This assumption is not valid in all contexts (sbt console, Eclipse launchers)."
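As I understand the fix, the idea looks roughly like this (a sketch, not Spark's actual code; the helper name typeInContext is mine): instead of typeOf[T], resolve the TypeTag in a mirror built from a classloader that can actually see your classes, e.g. the current thread's context classloader:

import scala.reflect.runtime.universe._

object MirrorExample {
  // Resolve T's type in a mirror for the context classloader,
  // instead of the primordial one behind plain typeOf[T].
  def typeInContext[T: TypeTag]: Type = {
    val mirror = runtimeMirror(Thread.currentThread().getContextClassLoader)
    typeTag[T].in(mirror).tpe
  }

  def main(args: Array[String]): Unit = {
    println(typeInContext[List[Int]]) // List[Int], resolved via the context classloader
  }
}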
Solutions:
1. Replace all the Scala-related jars installed by the IDE with the ones from a manually installed Scala distribution. I found that all the Scala jars referenced by the project were located in the Eclipse plugins directory.
2. Upgrade to Spark 1.4.1+ (see the build sketch below).
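For option 2, assuming sbt as the build tool, bumping the dependency is a one-liner (use the matching version in pom.xml if you are on Maven):

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.4.1"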
In other words, replacing the Scala-related jars or the Spark jars may do the trick. That's it.