2.4-2.5: Hive integration (with Spark, with HBase) and connections: CLI, HiveServer2 and the Hive metastore, SQuirreL SQL Client, etc.

2.4 Other Integration

2.4.1 Hive integration with Spark

To integrate Spark with Hive, copy hive-site.xml from $HIVE_HOME/conf into $SPARK_HOME/conf (do the same configuration on all 3 servers):
[root@bigdata2 spark-2.3.0-bin-hadoop2.7]# cd $HIVE_HOME/conf
[root@bigdata2 conf]# cp hive-site.xml $SPARK_HOME/conf

If you want to run ./spark-sql (for example on YARN), you also need to put mysql-connector-java-5.1.38.jar under $SPARK_HOME/jars.
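As a minimal sketch of this step (the source location of the connector jar and the test query are assumptions, not from the original post), the copy and a quick smoke test on YARN could look like:

[root@bigdata2 ~]# cp /root/mysql-connector-java-5.1.38.jar $SPARK_HOME/jars/
[root@bigdata2 ~]# cd $SPARK_HOME/bin
# start the Spark SQL shell against YARN and check that the Hive databases are visible
[root@bigdata2 bin]# ./spark-sql --master yarn
spark-sql> show databases;
spark-sql> exit;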

2.4.2 Hive integration with HBase

(1) Modify the hive-site.xml file and add the following property (the ZooKeeper quorum address):

<property>
     <name>hbase.zookeeper.quorum</name>           
     <value>bigdata2:2181,bigdata3:2181,bigdata4:2181,bigdata5:2181,bigdata6:2181</value>
</property>

(2) Introduce the HBase dependencies
Add the jar packages under the lib directory of the HBase installation to Hive's classpath by appending the following line to hive-env.sh:

export HIVE_CLASSPATH=$HIVE_CLASSPATH:$HBASE_HOME/lib

Synchronize the above configuration to the other two machines.
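To verify the integration, you can create an HBase-backed table from the Hive shell using the HBaseStorageHandler; the table, column family and column names below are purely illustrative:

hive> CREATE TABLE hbase_demo (key int, value string)
    > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
    > TBLPROPERTIES ("hbase.table.name" = "hbase_demo");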

2.5 Connection

2.5.1 CLI connection

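The original post shows a screenshot of a CLI session here. Roughly, starting the classic Hive CLI and running a quick query looks like the following (the table name is a placeholder):

[root@hadoop1 apache-hive-2.3.4-bin]# bin/hive
hive> show databases;
hive> use default;
hive> select * from demo_table limit 10;
hive> quit;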

2.5.2 HiveServer2 / beeline
For more about using beeline, refer to https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients (which describes HiveServer2 / beeline configuration in more detail).

The latest Hive version used here is hive-2.3.5; the following changes must be made to the Hadoop cluster, otherwise HiveServer2 cannot be used.

2.5.2.1 Modify the Hadoop cluster's core-site.xml as follows

<property>
    <name>hadoop.proxyuser.root.hosts</name>
    <!-- <value>master</value> -->
    <value>*</value>
</property>

<property>
    <name>hadoop.proxyuser.root.groups</name>
    <!-- <value>hadoop</value> -->
    <value>*</value>
</property>


Configuration notes:
1. Setting hadoop.proxyuser.root.hosts to * means the proxy user root can be used from any host in the Hadoop cluster to access HDFS.
2. hadoop.proxyuser.root.groups specifies which groups the users being proxied may belong to; * allows any group.
3. The "root" in hadoop.proxyuser.root.hosts is the user that runs Hadoop and HiveServer2 (the owner of the Hadoop installation).

The changes above take effect after restarting the Hadoop cluster.
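Depending on the Hadoop version, a full restart may not be strictly necessary: after distributing the new core-site.xml to every node, the proxy-user settings can usually be reloaded at runtime (treat this as an optional shortcut, not something from the original post):

[root@hadoop1 ~]# hdfs dfsadmin -refreshSuperUserGroupsConfiguration
[root@hadoop1 ~]# yarn rmadmin -refreshSuperUserGroupsConfiguration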

2.5.2.2 Modify hive-site.xml as follows:

    <property>
        <name>dfs.webhdfs.enabled</name>
        <value>true</value>
    </property>

    <property>
        <name>hive.server2.thrift.client.user</name>
        <value>root</value>
        <description>Username to use against thrift client</description>
    </property>
    <property>
        <name>hive.server2.thrift.client.password</name>
        <value>123456</value>
        <description>Password to use against thrift client</description>
    </property>

    <property>
        <name>hive.server2.thrift.bind.host</name>
        <value>hadoop1</value>
        <description>Bind host on which to run the HiveServer2 Thrift service.</description>
    </property>

    <property>
        <name>hive.server2.thrift.port</name>
        <value>10000</value>
    <description>Port number of HiveServer2 Thrift interface when hive.server2.transport.mode is 'binary'.</description>
    </property>

    <property>
        <name>hive.server2.thrift.http.port</name>
        <value>10001</value>
        <description>Port number of HiveServer2 Thrift interface when hive.server2.transport.mode is 'http'.</description>
    </property>

Note that the username (root) and password here are the operating system login username and password.

Then execute:

nohup  hive --service hiveserver2 &

Or run it in one of the following equivalent ways:

nohup hiveserver2 1>/home/hadoop/hiveserver.log 2>/home/hadoop/hiveserver.err &
Or: nohup hiveserver2 1>/dev/null 2>/dev/null &
Or: nohup hiveserver2 >/dev/null 2>&1 &
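Whichever form you use, it is worth checking that HiveServer2 actually came up and is listening on the Thrift port configured above before connecting (port 10000 here):

[root@hadoop1 ~]# jps | grep RunJar           # HiveServer2 shows up as a RunJar process
[root@hadoop1 ~]# netstat -nltp | grep 10000  # Thrift binary port from hive-site.xml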

Logging in with beeline:

beeline -u jdbc:hive2://hadoop1:10000 -n root
-u : specifies the JDBC connection URL
-n : specifies the username (the password can be supplied with -p)
There is another way to connect as well:
run beeline first, then enter: !connect jdbc:hive2://hadoop1:10000
[root@hadoop1 apache-hive-2.3.4-bin]# bin/beeline 
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/installed/apache-hive-2.3.4-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/installed/hadoop-2.8.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Beeline version 2.3.4 by Apache Hive
beeline> !connect jdbc:hive2://hadoop1:10000
Connecting to jdbc:hive2://hadoop1:10000
Enter username for jdbc:hive2://hadoop1:10000: 
Enter password for jdbc:hive2://hadoop1:10000: 
Connected to: Apache Hive (version 2.3.4)
Driver: Hive JDBC (version 2.3.4)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://hadoop1:10000>

Another example:

[root@hadoop1 apache-hive-2.3.4-bin]# beeline -u jdbc:hive2://hadoop1:10000 -n root
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/installed/apache-hive-2.3.4-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/installed/hadoop-2.8.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://hadoop1:10000
Connected to: Apache Hive (version 2.3.4)
Driver: Hive JDBC (version 2.3.4)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 2.3.4 by Apache Hive
0: jdbc:hive2://hadoop1:10000>
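beeline can also be used non-interactively, which is convenient in scripts: -e runs a single statement and -f runs a script file (the query and file path below are placeholders):

[root@hadoop1 ~]# beeline -u jdbc:hive2://hadoop1:10000 -n root -e "show databases;"
[root@hadoop1 ~]# beeline -u jdbc:hive2://hadoop1:10000 -n root -f /root/query.hql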

2.5.3 Hive Web UI

Omitted for now.

2.5.4 Integration with SQuirreL SQL Client

See the bottom of https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients, which contains an introduction.

2.5.5 Integration with Oracle SQL Developer

Hive can also be integrated with Oracle SQL Developer; see:
https://community.hortonworks.com/articles/1887/connect-oracle-sql-developer-to-hive.html
