【hive】Cleaning up a few "not found" messages in the startup log

1、ls: cannot access /export/servers/spark/lib/spark-assembly-*.jar: No such file or directory

Cause:

After the upgrade to Spark 2, the single large JAR under the old lib directory was split into many smaller JARs, so the old spark-assembly-*.jar no longer exists and Hive cannot find it.

Fix:

vim ${HIVE_HOME}/bin/hive

Search for    sparkAssemblyPath=

Change everything to the right of the = sign to    `ls ${SPARK_HOME}/jars/*.jar`
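For reference, this is roughly what the relevant line in ${HIVE_HOME}/bin/hive looks like before and after the edit (the exact wording may differ slightly between Hive versions):

# before: Hive still looks for the pre-Spark-2 assembly JAR, which no longer exists
sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`

# after: pick up the individual JARs that Spark 2 ships under jars/ instead
sparkAssemblyPath=`ls ${SPARK_HOME}/jars/*.jar`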

2、SLF4J: Found binding in [jar:file:/export/servers/hbase-1.1.1/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

Cause:

A JAR conflict: both HBase and Hadoop ship slf4j-log4j12-1.7.5.jar, so SLF4J finds two bindings on the classpath.

Fix:

Delete or rename either one of the two files, for example:

mv /export/servers/hbase-1.1.1/lib/slf4j-log4j12-1.7.5.jar /export/servers/hbase-1.1.1/lib/slf4j-log4j12-1.7.5.jar.bak
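If you are not sure where else an slf4j-log4j12 binding might be hiding, a quick search over the lib directories mentioned in the warning (the paths below are simply the ones from the log above) lists every copy before you decide which one to keep:

find /export/servers/hbase-1.1.1/lib /export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/common/lib -name 'slf4j-log4j12-*.jar'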

3、which: no hbase in (:/export/servers/hadoop-2.6.0-cdh5.14.0/bin:/export/servers/hadoop-2.6.0-cdh5.14.0/sbin::/export/servers/hadoop-2.6.0-cdh5.14.0/bin:/export/servers/hadoop-2.6.0-cdh5.14.0/sbin::/export/servers/hadoop-2.6.0-cdh5.14.0/bin:/export/servers/hadoop-2.6.0-cdh5.14.0/sbin::/export/servers/hadoop-2.6.0-cdh5.14.0/bin:/export/servers/hadoop-2.6.0-cdh5.14.0/sbin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:HCAT_HOME/bin:/export/servers/hive-1.1.0-cdh5.14.0/bin:/export/servers/jdk1.8.0_144/bin:/export/servers/mysql-5.7.31/bin:/export/servers/sqoop/bin:/root/bin:/export/servers/hbase-1.1.1/bin:HCAT_HOME/bin:/export/servers/hive-1.1.0-cdh5.14.0/bin:/export/servers/jdk1.8.0_144/bin:/export/servers/mysql-5.7.31/bin:/export/servers/sqoop/bin:/export/servers/hbase-1.1.1/bin:HCAT_HOME/bin:/export/servers/hive-1.1.0-cdh5.14.0/bin:/export/servers/jdk1.8.0_144/bin:/export/servers/mysql-5.7.31/bin:/export/servers/sqoop/bin:/export/servers/hive-1.1.0-cdh5.14.0/bin:HCAT_HOME/bin:/export/servers/hive-1.1.0-cdh5.14.0/bin:/export/servers/jdk1.8.0_144/bin:/export/servers/mysql-5.7.31/bin:/export/servers/sqoop/bin)

The length of this message depends on how many environment variables you have added to your PATH; for some people the log is much shorter.

Fix:

Install HBase.
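The message is printed because the Hive launcher checks for an hbase command with which, so it disappears once hbase is resolvable from the PATH. A minimal sketch, assuming HBase is installed under /export/servers/hbase-1.1.1 (the location that appears in the log above):

export HBASE_HOME=/export/servers/hbase-1.1.1
export PATH=$PATH:$HBASE_HOME/bin

# should now print /export/servers/hbase-1.1.1/bin/hbase instead of "which: no hbase in (...)"
which hbase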


Reposted from blog.csdn.net/qq_44065303/article/details/112647954