The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH

Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "DBCP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.


Solution 1: I searched a lot without finding a fix, and since posting a reply on Stack Overflow ran into a glitch, I'm recording it here on my own blog.

I met the same problem!
All of the config looked fine, but mysql-connector-java-5.1.17.jar did not seem to be picked up.
Try it this way: specify the jar when you launch spark-shell.


spark-shell --jars /usr/local/hive-0.13.1/lib/mysql-connector-java-5.1.17.jar 
You can try the hive shell in the same way.


In short: add --jars to specify mysql-connector-java-5.1.17.jar.
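The same --jars flag also works with spark-submit. Since the metastore connection is opened from the driver process, it may also help to put the jar on the driver classpath explicitly. A sketch, where the application class com.example.MyApp and my-app.jar are placeholders:

spark-submit --jars /usr/local/hive-0.13.1/lib/mysql-connector-java-5.1.17.jar \
  --driver-class-path /usr/local/hive-0.13.1/lib/mysql-connector-java-5.1.17.jar \
  --class com.example.MyApp my-app.jar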



Solution 2:

Upload the jar to HDFS, then point --jars at the HDFS path.
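A minimal sketch of this approach; the HDFS directory /user/spark/jars is an example path:

hdfs dfs -mkdir -p /user/spark/jars
hdfs dfs -put /usr/local/hive-0.13.1/lib/mysql-connector-java-5.1.17.jar /user/spark/jars/
spark-shell --jars hdfs:///user/spark/jars/mysql-connector-java-5.1.17.jar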


Solution 3:

Copy the jar to every node and list it in /etc/alternatives/spark-conf/classpath.txt.

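A sketch for a single node, assuming a CDH-style setup where classpath.txt lists one jar path per line; /usr/share/java is an example location, and the same steps would be repeated on every node:

sudo cp /usr/local/hive-0.13.1/lib/mysql-connector-java-5.1.17.jar /usr/share/java/
echo "/usr/share/java/mysql-connector-java-5.1.17.jar" | sudo tee -a /etc/alternatives/spark-conf/classpath.txt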



Reposted from blog.csdn.net/mtj66/article/details/51841053