hive-1.1.0-cdh5.15.1 Deployment

1. Extract the source tarball

[hadoop@hadoop001 software]$ tar -zxvf apache-hive-1.1.0-cdh5.15.1-bin.tar.gz -C /home/hadoop/app/
# Create a symlink to the extracted directory
[hadoop@hadoop001 app]$ ln -s apache-hive-1.1.0-cdh5.15.1-bin hive
# Configure the environment (e.g. in /etc/profile)
export HIVE_HOME=/home/hadoop/app/hive
export PATH=$PATH:$HIVE_HOME/bin:$HIVE_HOME/conf
# Run source /etc/profile to make the changes take effect
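A quick sanity check after sourcing the profile (a minimal sketch assuming the paths above):

[hadoop@hadoop001 app]$ source /etc/profile
[hadoop@hadoop001 app]$ ls -l $HIVE_HOME      # should point at apache-hive-1.1.0-cdh5.15.1-bin
[hadoop@hadoop001 app]$ which hive            # should print /home/hadoop/app/hive/bin/hive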

2. Edit the configuration files

  • Copy hive-default.xml.template to hive-site.xml and hive-env.sh.template to
    hive-env.sh (Hive only reads hive-site.xml; a hive-default.xml on the classpath is ignored as deprecated)
-rw-r--r-- 1 hadoop hadoop 177608 Oct  6 16:37 hive-default.xml.template
-rw-rw-r-- 1 hadoop hadoop   2378 Aug  9  2018 hive-env.sh.template
  • Edit hive-env.sh
# Append the following environment variables at the end
export HADOOP_HOME=/root/apps/hadoop
export JAVA_HOME=/root/apps/jdk1.8.0_221
export HIVE_HOME=/home/hadoop/app/hive
  • Edit hive-site.xml
# Add the MySQL connection settings below. Note: the corresponding database, user, and privileges must be created in MySQL first (see the SQL sketch after the property list).
<property>
        <name>hive.exec.scratchdir</name>
        <value>hdfs://hadoop001:9000/tmp/hive</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://hadoop001:3306/hivedb?createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>123456</value>
</property>
<property>
        <name>hive.metastore.warehouse.dir</name>
      <value>hdfs://hadoop001:9000/hive/warehouse</value>
        <description>location of default database for the warehouse</description>
</property>
<property>
        <name>javax.jdo.option.Multithreaded</name>
        <value>true</value>
</property>
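The note at the top of this step assumes the metastore database and user already exist in MySQL. A minimal sketch, run as a MySQL admin user, with the database/user/password matching the hive-site.xml values above (hivedb / hive / 123456); since the connection URL carries createDatabaseIfNotExist=true, the CREATE DATABASE statement is optional:

mysql -uroot -p
mysql> CREATE DATABASE IF NOT EXISTS hivedb DEFAULT CHARACTER SET utf8;
mysql> CREATE USER 'hive'@'%' IDENTIFIED BY '123456';
mysql> GRANT ALL PRIVILEGES ON hivedb.* TO 'hive'@'%';
mysql> FLUSH PRIVILEGES;
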
# Replace every path containing ${system:java.io.tmpdir}/${system:user.name} with a custom local path
 <property>
    <name>hive.exec.local.scratchdir</name>
    <value>${system:java.io.tmpdir}/${system:user.name}</value>
    <description>Local scratch space for Hive jobs</description>
  </property>
# For example
<property>
    <name>hive.exec.local.scratchdir</name>
    <value>/home/hadoop/app/hive/iotmp</value>
    <description>Local scratch space for Hive jobs</description>
  </property>
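The directories referenced in these properties are not always created automatically; a minimal sketch, assuming the example paths used above:

# Local scratch directory for hive.exec.local.scratchdir
[hadoop@hadoop001 hive]$ mkdir -p /home/hadoop/app/hive/iotmp
# HDFS directories for hive.exec.scratchdir and hive.metastore.warehouse.dir
[hadoop@hadoop001 hive]$ hdfs dfs -mkdir -p /tmp/hive /hive/warehouse
[hadoop@hadoop001 hive]$ hdfs dfs -chmod g+w /tmp/hive /hive/warehouse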

3. Add the MySQL JDBC driver

[hadoop@hadoop001 conf]$ cp mysql-connector-java-5.1.45.jar $HIVE_HOME/lib
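A quick optional check that the driver is now on Hive's classpath:

[hadoop@hadoop001 conf]$ ls $HIVE_HOME/lib | grep mysql-connector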

4. Start Hive

[hadoop@hadoop001 hive]$ hive
ls: cannot access /home/hadoop/app/spark-2.4.4-bin-2.6.0-cdh5.15.1/lib/spark-assembly-*.jar: No such file or directory
which: no hbase in (/root/apps/hadoop/bin:/root/apps/hadoop/sbin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/root/apps/jdk1.8.0_221/bin:/root/apps/apache-maven-3.6.1/bin:/home/hadoop/app/zookeeper-3.4.5-cdh5.15.1/bin:/root/apps/scala-2.13.0/bin:/home/hadoop/app/flume/bin:/home/hadoop/app/hive/bin:/home/hadoop/app/hive/conf:/usr/local/nginx/sbin:/home/hadoop/app/azkaban-solo-server/bin:/home/hadoop/app/spark-2.4.4-bin-2.6.0-cdh5.15.1/bin:/home/hadoop/app/spark-2.4.4-bin-2.6.0-cdh5.15.1/sbin:/home/hadoop/.local/bin:/home/hadoop/bin)

Logging initialized using configuration in jar:file:/home/hadoop/app/apache-hive-1.1.0-cdh5.15.1-bin/lib/hive-common-1.1.0-cdh5.15.1.jar!/hive-log4j.properties
WARNING: Hive CLI is deprecated and migration to Beeline is recommended.
hive> 
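
As a quick smoke test of the MySQL metastore, any trivial DDL will do (smoke_test is an arbitrary throwaway name):

hive> show databases;
hive> create database smoke_test;
hive> drop database smoke_test;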

Reposted from blog.csdn.net/weixin_44131414/article/details/102395666