Setting up a Hadoop + Spark + Hive environment (configuring and installing Hive)

I. Download and extract Hive

# Download Hive
wget http://apache.claz.org/hive/hive-2.3.6/apache-hive-2.3.6-bin.tar.gz

# Extract the archive
tar zxf apache-hive-2.3.6-bin.tar.gz

# Move it into the Hadoop directory (the path must match HIVE_HOME below)
mv apache-hive-2.3.6-bin /usr/local/hadoop/hive

# Configure the system environment variables
vim /etc/profile
# Add the following three lines

export HIVE_HOME=/usr/local/hadoop/hive
export HIVE_CONF_DIR=$HIVE_HOME/conf
export PATH=$PATH:$HIVE_HOME/bin
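
After saving /etc/profile, apply the new variables to the current shell and confirm that the hive command resolves. A minimal check, assuming the paths above:

# Reload the profile and verify that Hive is on the PATH
source /etc/profile
hive --version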

 

II. Install the other dependencies

1. The mysql-connector driver

# Download the mysql-connector driver
wget https://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-5.1.48.tar.gz

# Extract the archive
tar zxf mysql-connector-java-5.1.48.tar.gz

# Enter the extracted directory and copy mysql-connector-java-5.1.48-bin.jar and mysql-connector-java-5.1.48.jar into hive/lib
cd mysql-connector-java-5.1.48
cp mysql-connector-java-5.1.48-bin.jar /usr/local/hadoop/hive/lib/
cp mysql-connector-java-5.1.48.jar /usr/local/hadoop/hive/lib/
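
The connector is needed because Hive keeps its metastore in MySQL. Once hive-site.xml has been pointed at the MySQL instance (the metastore configuration itself is not covered in this section), the schema can be initialized with schematool; a minimal sketch, assuming a reachable MySQL database:

# Initialize the Hive metastore schema in MySQL (run only after the JDBC URL,
# user and password have been configured in hive-site.xml)
$HIVE_HOME/bin/schematool -dbType mysql -initSchema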

2. Jars required for Hive on Spark

# From the jars directory of the existing Spark installation, copy the scala-library,
# spark-core and spark-network-common jars into hive/lib; the version suffixes in the
# file names depend on the installed Spark version
cd /usr/local/hadoop/spark/jars
cp scala-library-2.11.12.jar spark-core_2.11-2.4.4.jar spark-network-common_2.11-2.4.4.jar /usr/local/hadoop/hive/lib/
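
These jars let Hive hand query execution off to Spark, which is selected through the hive.execution.engine property. A minimal sketch of switching the engine for one session, assuming the rest of the Hive on Spark configuration (for example spark.master) is already in place; the property can also be set permanently in hive-site.xml:

# Use Spark instead of MapReduce as the execution engine for this session
hive -e "set hive.execution.engine=spark;"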



Reprinted from www.cnblogs.com/gambler/p/11869614.html