Hive API Operations

A simple example of calling the Hadoop-based Hive data warehouse through its Java API; an introduction to Hive itself is omitted here. Hive provides three user interfaces: the CLI, JDBC/ODBC, and a Web UI.

  1. CLI: the shell command line
  2. JDBC/ODBC: Hive's Java interface, used much like JDBC against a traditional database
  3. Web GUI: access to Hive through a browser

This article covers the second interface, so let's get straight to it.

1. Installing Hive:

1) For installing Hive itself, refer to the many guides available online; for this test it is enough to install Hive on a single Hadoop node.

2) The test data file `data`, with fields separated by '\t':

1 zhangsan

2 lisi

3 wangwu

3) Upload the test data file `data` to a directory on Linux; here it is placed at /home/hadoop01/data.
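As a quick sanity check on the file format, each row must split into the (key int, value string) pair that the table schema used later expects. The standalone sketch below (class name `DataFileCheck` is hypothetical, not part of the Hive example) parses rows shaped like the test data:

```java
// Standalone sketch: parse tab-separated rows like the test data above.
// The key/value split mirrors the (key int, value string) Hive table schema.
public class DataFileCheck {

    // Extracts the integer key from a "key<TAB>value" row.
    static int parseKey(String line) {
        return Integer.parseInt(line.split("\t", 2)[0]);
    }

    // Extracts the string value from a "key<TAB>value" row.
    static String parseValue(String line) {
        return line.split("\t", 2)[1];
    }

    public static void main(String[] args) {
        String[] rows = { "1\tzhangsan", "2\tlisi", "3\twangwu" };
        for (String row : rows) {
            System.out.println(parseKey(row) + " -> " + parseValue(row));
        }
    }
}
```

If a row fails to parse here, `load data` would still succeed, but Hive would surface NULLs for the malformed columns at query time.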

2. When developing a Hive program over JDBC, you must first start Hive's remote service. Start it with the command below:

```shell
hive --service hiveserver >/dev/null 2>/dev/null &
```

(Note: this is the legacy HiveServer1 service, which was removed in Hive 1.0; on later releases the equivalent is `hive --service hiveserver2`, with driver class `org.apache.hive.jdbc.HiveDriver` and URL prefix `jdbc:hive2://`.)

3. Test code:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

import org.apache.log4j.Logger;

public class HiveJdbcCli {
    private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";
    private static String url = "jdbc:hive://hadoop3:10000/default";
    private static String user = "hive";
    private static String password = "mysql";
    private static String sql = "";
    private static ResultSet res;
    private static final Logger log = Logger.getLogger(HiveJdbcCli.class);

    public static void main(String[] args) {
        Connection conn = null;
        Statement stmt = null;
        try {
            conn = getConn();
            stmt = conn.createStatement();

            // Step 1: drop the table if it already exists
            String tableName = dropTable(stmt);
            // Step 2: create the table
            createTable(stmt, tableName);
            // Step 3: verify that the table was created
            showTables(stmt, tableName);
            // Run "describe table"
            describeTables(stmt, tableName);
            // Run "load data into table"
            loadData(stmt, tableName);
            // Run "select *"
            selectData(stmt, tableName);
            // Run a regular Hive aggregation query
            countData(stmt, tableName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            log.error(driverName + " not found!", e);
            System.exit(1);
        } catch (SQLException e) {
            e.printStackTrace();
            log.error("Connection error!", e);
            System.exit(1);
        } finally {
            try {
                if (stmt != null) {
                    stmt.close();
                    stmt = null;
                }
                if (conn != null) {
                    conn.close();
                    conn = null;
                }
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    }

    private static void countData(Statement stmt, String tableName)
            throws SQLException {
        sql = "select count(1) from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        System.out.println("Result of the regular Hive query:");
        while (res.next()) {
            System.out.println("count ------> " + res.getString(1));
        }
    }

    private static void selectData(Statement stmt, String tableName)
            throws SQLException {
        sql = "select * from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        System.out.println("Result of the select * query:");
        while (res.next()) {
            System.out.println(res.getInt(1) + "\t" + res.getString(2));
        }
    }

    private static void loadData(Statement stmt, String tableName)
            throws SQLException {
        String filepath = "/home/hadoop01/data";
        sql = "load data local inpath '" + filepath + "' into table " + tableName;
        System.out.println("Running: " + sql);
        // load data returns no result set, so use execute rather than executeQuery
        stmt.execute(sql);
    }

    private static void describeTables(Statement stmt, String tableName)
            throws SQLException {
        sql = "describe " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        System.out.println("Result of describe table:");
        while (res.next()) {
            System.out.println(res.getString(1) + "\t" + res.getString(2));
        }
    }

    private static void showTables(Statement stmt, String tableName)
            throws SQLException {
        sql = "show tables '" + tableName + "'";
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        System.out.println("Result of show tables:");
        if (res.next()) {
            System.out.println(res.getString(1));
        }
    }

    private static void createTable(Statement stmt, String tableName)
            throws SQLException {
        sql = "create table " + tableName
                + " (key int, value string) row format delimited fields terminated by '\t'";
        stmt.execute(sql);
    }

    private static String dropTable(Statement stmt) throws SQLException {
        // Name of the table to create
        String tableName = "testHive";
        sql = "drop table " + tableName;
        stmt.execute(sql);
        return tableName;
    }

    private static Connection getConn() throws ClassNotFoundException,
            SQLException {
        Class.forName(driverName);
        return DriverManager.getConnection(url, user, password);
    }
}
```
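Because every statement above is assembled by plain string concatenation, the exact HiveQL the program sends can be inspected without a cluster. This standalone sketch (class name `HiveSqlStrings` is hypothetical) rebuilds representative DDL/DML strings with the same table name and path as in the example:

```java
// Standalone sketch: rebuild the HiveQL strings the example program sends,
// so they can be inspected offline without a running Hive service.
public class HiveSqlStrings {

    static String dropSql(String table) {
        return "drop table " + table;
    }

    static String createSql(String table) {
        return "create table " + table
                + " (key int, value string) row format delimited fields terminated by '\t'";
    }

    static String loadSql(String table, String path) {
        return "load data local inpath '" + path + "' into table " + table;
    }

    public static void main(String[] args) {
        System.out.println(dropSql("testHive"));
        System.out.println(createSql("testHive"));
        System.out.println(loadSql("testHive", "/home/hadoop01/data"));
    }
}
```

Printing the statements before executing them, as the example does, makes it easy to paste a failing statement into the Hive CLI for debugging.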

4. Test results

5. Query results in the terminal:

```
hive> select * from testHive;
OK
1	zhangsan
2	lisi
3	wangwu
Time taken: 11.232 seconds
hive>
```

Reposted from blog.csdn.net/qq_35703919/article/details/77683613