Installing Hive on Mac



1. Download apache-hive-1.2.1-bin.tar.gz

2. Extract it to /Users/xiaoph/Documents/java/hive/apache-hive-1.2.1-bin
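For reference, steps 1 and 2 can be done from the terminal roughly as follows. This is just one way to do it; the Apache release archive URL and working directory are assumptions, so adjust the paths to your setup:

mkdir -p /Users/xiaoph/Documents/java/hive

curl -O https://archive.apache.org/dist/hive/hive-1.2.1/apache-hive-1.2.1-bin.tar.gz

tar -zxvf apache-hive-1.2.1-bin.tar.gz -C /Users/xiaoph/Documents/java/hive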

3. Configure the environment variables:

export HIVE_HOME=/Users/xiaoph/Documents/java/hive/apache-hive-1.2.1-bin

export PATH=$HIVE_HOME/bin:$PATH
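To make these variables persist across terminal sessions, one option (assuming the default bash shell; with zsh the file would be ~/.zshrc instead) is to append them to ~/.bash_profile and reload it:

echo 'export HIVE_HOME=/Users/xiaoph/Documents/java/hive/apache-hive-1.2.1-bin' >> ~/.bash_profile

echo 'export PATH=$HIVE_HOME/bin:$PATH' >> ~/.bash_profile

source ~/.bash_profile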

4. Make sure that in your Hadoop setup the value of dfs.replication in hdfs-site.xml is 1; otherwise you will get a connection refused error.

  <property>

        <name>dfs.replication</name>

        <value>1</value>

    </property>
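If you change this value, restart HDFS afterwards; the value HDFS actually picked up can be checked with:

hdfs getconf -confKey dfs.replication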

5. Download mysql-connector-java-5.1.42-bin.jar and copy it into Hive's lib directory: /Users/xiaoph/Documents/java/hive/apache-hive-1.2.1-bin/lib
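Assuming the jar was downloaded to ~/Downloads (adjust the source path as needed), the copy is just:

cp ~/Downloads/mysql-connector-java-5.1.42-bin.jar /Users/xiaoph/Documents/java/hive/apache-hive-1.2.1-bin/lib/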

6. Change into Hive's home directory and create hive-env.sh from its template:

cp conf/hive-env.sh.template  conf/hive-env.sh

Add the following configuration:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home

export HADOOP_HOME=/Users/xiaoph/Documents/java/hadoop/hadoop-2.6.2

export HIVE_HOME=/Users/xiaoph/Documents/java/hive/apache-hive-1.2.1-bin

:wq! to save and exit

7. Modify the log4j configuration file:

cp conf/hive-log4j.properties.template conf/hive-log4j.properties

Change EventCounter to org.apache.hadoop.log.metrics.EventCounter:

#log4j.appender.EventCounter=org.apache.hadoop.hive.shims.HiveEventCounter

log4j.appender.EventCounter=org.apache.hadoop.log.metrics.EventCounter



8. Create conf/hive-site.xml (touch conf/hive-site.xml) and open it with vi.

Add the following configuration:

<configuration>

        <property>

                <name>javax.jdo.option.ConnectionURL</name>

                <value>jdbc:mysql://localhost:3306/hivedb?createDatabaseIfNotExist=true&amp;useUnicode=true&amp;characterEncoding=UTF-8</value>

                <description>JDBC connection URL for the metastore database</description>

        </property>

        <property>

                <name>javax.jdo.option.ConnectionDriverName</name>

                <value>com.mysql.jdbc.Driver</value>

                <description>JDBC driver class</description>

        </property>

        <property>

                <name>javax.jdo.option.ConnectionUserName</name>

                <value>root</value>

                <description>metastore database username</description>

        </property>

        <property>

                <name>javax.jdo.option.ConnectionPassword</name>

                <value>root</value>

                <description>metastore database password</description>

        </property>

</configuration>

:wq! to save and exit
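A quick way to confirm that the connection settings above will work is to try them directly with the mysql client (this assumes MySQL is already running locally; createDatabaseIfNotExist=true will create hivedb automatically on first use):

mysql -u root -proot -e "SELECT VERSION();"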

9. Log in with mysql -u root -p and grant the user privileges so that it can connect to the database remotely:

GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY 'root' WITH GRANT OPTION;

FLUSH PRIVILEGES;
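To confirm the grant took effect, list the accounts MySQL knows about; 'root'@'%' should now be present:

mysql -u root -p -e "SELECT user, host FROM mysql.user;"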

10. If this is the first time Hive is started, run the schema initialization command first:

   schematool -dbType mysql -initSchema
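When initialization succeeds, the metastore tables appear in hivedb. Either of the following checks (the second uses schematool's -info option) can confirm it:

mysql -u root -p -e "USE hivedb; SHOW TABLES;"

schematool -dbType mysql -info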

11. Start Hive:

hive
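As a quick smoke test once the hive> prompt comes up, a throwaway table can be created and dropped (hive_smoke_test is just an example name):

create table hive_smoke_test (id int);

show tables;

drop table hive_smoke_test;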

----------------------------------------------------------------------------------------------------------------

Note: if Hive fails to start, you may see one of the following errors.

(1) Found class jline.Terminal, but interface was expected

  This means that Hadoop ships an older version of jline; copy the jline jar from Hive into Hadoop:

cp /Users/xiaoph/Documents/java/hive/apache-hive-1.2.1-bin/lib/jline-2.12.jar /Users/xiaoph/Documents/java/hadoop/hadoop-2.6.2/share/hadoop/yarn/lib/
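Instead of copying the jar, a commonly used alternative workaround for this jline conflict is to put the user classpath first before launching Hive:

export HADOOP_USER_CLASSPATH_FIRST=true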

(2) Duplicate key name 'PCS_STATS_IDX'

This error occurs because the hivedb database already contains tables when schematool -initSchema -dbType mysql is run. Drop all of those tables (or the whole database) and run the command again, as shown below.
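One way to do that, assuming nothing in the metastore needs to be preserved, is to drop and recreate the database and rerun the initialization:

mysql -u root -p -e "DROP DATABASE hivedb; CREATE DATABASE hivedb;"

schematool -dbType mysql -initSchema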

(3)

org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="jing":jing:supergroup:rwxr-xr-x

This happens because HDFS permission checking is enabled and Hive is being run as a user (root here) that does not own the target HDFS directory (owned by jing). Either run Hive as that user, or disable permission checking in hdfs-site.xml:

 <property>

        <name>dfs.permissions.enabled</name>

        <value>false</value>

    </property>

then restart Hadoop, as shown below.
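With the standard Hadoop 2.x sbin scripts, a restart looks like this (paths assume HADOOP_HOME is set as in step 6):

$HADOOP_HOME/sbin/stop-yarn.sh

$HADOOP_HOME/sbin/stop-dfs.sh

$HADOOP_HOME/sbin/start-dfs.sh

$HADOOP_HOME/sbin/start-yarn.sh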


Reposted from blog.csdn.net/XiWangDeFengChe/article/details/71270263