Error dropping a table in Hive

After getting the Hive environment configured, successfully creating a table, and importing local data with load data local inpath '/home/data' into table test, I thought the setup was done. Then trouble quietly arrived... dropping the table threw an error:
Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException
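
For context, the whole session up to the failure looked roughly like this (the table schema is a hypothetical stand-in, since the steps above never show it):

-- hypothetical reproduction; the real schema of test is not shown above
CREATE TABLE test (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';
LOAD DATA LOCAL INPATH '/home/data' INTO TABLE test;  -- this step succeeded
DROP TABLE test;  -- this is the statement that failed
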
What kind of error is this? It looked like a problem with the MySQL metadata connection... A search on Baidu suggested setting the character set of the hive database in MySQL:

 alter database hive character set latin1;
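
The reason this helps: the Hive metastore schema was designed around latin1, and a metastore database created as utf8 can hit problems such as over-long index keys on older MySQL. To confirm the change took effect, a quick check in MySQL looks like this (my addition, not part of the original fix):

SELECT default_character_set_name
FROM information_schema.schemata
WHERE schema_name = 'hive';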

I restarted Hadoop and Hive, ran show tables;, and then tried drop table test; again.
Still the same error. I wondered whether this was leftover mess from an earlier failed Hive startup, so why not delete the hive data in MySQL and create it fresh.
Drop the hive database, recreate it, set the encoding, then restart Hadoop and Hive:

drop database hive; 
create database hive; 
alter database hive character set latin1;
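
A caveat worth adding here (not in the original steps): dropping the database wipes the metastore tables too. Depending on the Hive version they are either auto-created on the next startup or must be initialized by hand with schematool, assuming a release that ships it:

schematool -dbType mysql -initSchema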

Then show tables; and drop table test; again.
The outcome was just as disappointing... the Error showed up again...
So I ran the command below to print debug information to the console; let's see exactly what this bug is:

hive -hiveconf hive.root.logger=DEBUG,console

And there it was, the exception:

InvalidObjectException(message:Role admin already exists.)
    at org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3187)
    ...
    at java.lang.reflect.Method.invoke(Method.java:606)
InvalidObjectException(message:All is already granted by admin)
    at org.apache.hadoop.hive.metastore.ObjectStore.grantPrivileges(ObjectStore.java:3912)
    ...
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

Honestly, at my current level I couldn't tell exactly where it went wrong; I only knew it had something to do with the MySQL connection.
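
For what it's worth, the two exceptions hint that the metastore was trying to re-create its built-in admin role and privileges at startup and finding them already present. The default MySQL layout of the metastore keeps roles in a ROLES table, so something like this shows what is there (table and column names assumed from the standard schema):

USE hive;
SELECT ROLE_NAME, OWNER_NAME FROM ROLES;
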
Back on Baidu, someone suggested replacing the MySQL connector jar, so I gave that a try too.
I deleted mysql-connector-java-5.1.6-bin.jar, downloaded mysql-connector-java-5.1.40-bin.jar from the official site, and put it into Hive's lib directory. After restarting Hive, a different message appeared:

Fri Dec 23 21:13:57 CST 2016 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.

That message printed maybe five times; I was baffled at first sight. But I did get into Hive, so:

show tables;
drop table test;

And then the magic: it returned OK!
So the problem was pinned down to the connector. But what was that wall of warnings? A bit of searching showed that newer connectors want you to state explicitly whether to use SSL, which can be specified in the JDBC URL. I was too lazy to modify hive-site.xml, so I downloaded mysql-connector-java-5.1.32-bin.jar instead, dropped it into lib, restarted Hadoop and Hive, created a test table again, and dropped it. Success... though I still don't know what the real problem was... maybe the old 5.1.6 connector was just too ancient.
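
For completeness, the "specify it in the URL" route that the warning suggests would look something like this in hive-site.xml (host, port, and the createDatabaseIfNotExist option are placeholders; note the & has to be escaped as &amp; inside XML):

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&amp;useSSL=false</value>
</property>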

Reposted from blog.csdn.net/sinat_30333853/article/details/53844306