Notes on Connecting to Hive via JDBC

Connecting to Hive over JDBC makes it very easy to use Hadoop for data mining and greatly lowers the barrier to entry. The connection itself is simple, but first-time users often run into some puzzling errors. Below are a few notes that I hope will help beginners.

First, my environment: hadoop 2.4.0 + hive 0.14.0.

Required jars (listed here as Eclipse .classpath entries):

    <classpathentry kind="lib" path="lib/commons-collections-3.2.1.jar"/>
    <classpathentry kind="lib" path="lib/commons-logging-1.1.3.jar"/>
    <classpathentry kind="lib" path="lib/hadoop-common-2.4.0.jar"/>
    <classpathentry kind="lib" path="lib/libfb303-0.9.0.jar"/>
    <classpathentry kind="lib" path="lib/httpclient-4.2.5.jar"/>
    <classpathentry kind="lib" path="lib/httpcore-4.2.5.jar"/>
    <classpathentry kind="lib" path="lib/log4j-1.2.16.jar"/>
    <classpathentry kind="lib" path="lib/slf4j-api-1.6.1.jar"/>
    <classpathentry kind="lib" path="lib/slf4j-log4j12-1.6.1.jar"/>
    <classpathentry kind="lib" path="lib/hive-exec-0.14.0.jar"/>
    <classpathentry kind="lib" path="lib/hive-jdbc-0.14.0.jar"/>
    <classpathentry kind="lib" path="lib/hive-metastore-0.14.0.jar"/>
    <classpathentry kind="lib" path="lib/hive-service-0.14.0.jar"/>
    <classpathentry kind="lib" path="lib/hadoop-mapreduce-client-core-2.4.0.jar"/>
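
As a quick sanity check (a sketch, not from the original post), you can verify that the driver class is actually on the classpath before attempting a connection. The class name below assumes the HiveServer2 driver shipped in hive-jdbc:

```java
public class DriverCheck {
    // Returns true if the given JDBC driver class can be loaded
    // from the current classpath.
    static boolean driverAvailable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Assumption: the HiveServer2 driver class from hive-jdbc
        String driver = "org.apache.hive.jdbc.HiveDriver";
        System.out.println(driverAvailable(driver)
                ? "driver found"
                : "driver missing: add the hive-jdbc jars (and their dependencies above) to the classpath");
    }
}
```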

If you see the following errors when running the program:

java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.

java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

Solutions:

1. Call System.setProperty("hadoop.home.dir", "D:/hadoop-2.4.0"); before any Hadoop class is loaded (adjust the path to your installation).

2. Download winutils.exe from https://github.com/srccodes/hadoop-common-2.2.0-bin/blob/master/bin/winutils.exe and place it under D:/hadoop-2.4.0/bin.
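The first fix can be sketched as a small helper that only sets the property when neither it nor HADOOP_HOME is already configured (the install path is an assumption — substitute your own):

```java
public class HadoopHomeFix {
    // Sets hadoop.home.dir to defaultDir if neither it nor the HADOOP_HOME
    // environment variable is already set; returns the effective value.
    static String ensureHadoopHome(String defaultDir) {
        if (System.getProperty("hadoop.home.dir") == null
                && System.getenv("HADOOP_HOME") == null) {
            System.setProperty("hadoop.home.dir", defaultDir);
        }
        String v = System.getProperty("hadoop.home.dir");
        return v != null ? v : System.getenv("HADOOP_HOME");
    }

    public static void main(String[] args) {
        // Call this before any Hadoop class is loaded; on Windows, make sure
        // winutils.exe sits in <dir>/bin. The path here is an example.
        System.out.println(ensureHadoopHome("D:/hadoop-2.4.0"));
    }
}
```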

Test code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

import org.apache.log4j.BasicConfigurator;

public class HiveJdbcClient2 {
    // HiveServer2 driver class, matching the hive-jdbc-0.14.0 jar above
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";

    /**
     * @param args
     * @throws SQLException
     */
    public static void main(String[] args) throws SQLException {

        // Avoids "HADOOP_HOME or hadoop.home.dir are not set" on Windows
        System.setProperty("hadoop.home.dir", "D:/hadoop-2.4.0");

        // Simple console logging for log4j
        BasicConfigurator.configure();

        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }

        // HiveServer2 uses the jdbc:hive2:// URL scheme
        Connection con = DriverManager.getConnection("jdbc:hive2://127.0.0.1:10000/default", "", "");
        Statement stmt = con.createStatement();

        //stmt.execute("drop table test");
        // DDL returns no ResultSet, so use execute() rather than executeQuery()
        stmt.execute("create table if not exists test(amount DOUBLE, st_name string) " +
                "ROW FORMAT DELIMITED " +
                "FIELDS TERMINATED BY '\t' " +
                "STORED AS TEXTFILE");

        //stmt.execute("load data inpath '/user/hive_data/test_data.txt' into table test");

        long st = System.currentTimeMillis();
        ResultSet res = stmt.executeQuery("select st_name, sum(amount) c from test group by st_name sort by c");
        int i = 0;
        while (res.next()) {
            i++;
            System.out.println(res.getString(1) + " - " + res.getString(2));
        }
        long en = System.currentTimeMillis();

        System.out.println("Total time: " + (en - st) + " ms, record count: " + i);

        res.close();
        stmt.close();
        con.close();
    }
}
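A note on cleanup: the closes at the end of the test code are skipped if a query throws. With Java 7+, try-with-resources closes JDBC resources automatically. A minimal sketch of the pattern, using a stand-in AutoCloseable so it compiles and runs without a HiveServer:

```java
public class CloseDemo {
    static final StringBuilder LOG = new StringBuilder();

    // Stand-in for a real java.sql.Statement; records what happens to it.
    static class FakeStatement implements AutoCloseable {
        void execute(String sql) { LOG.append("ran:").append(sql).append(';'); }
        @Override public void close() { LOG.append("closed;"); }
    }

    public static void main(String[] args) {
        try (FakeStatement stmt = new FakeStatement()) {
            stmt.execute("select 1");
        } // close() runs here automatically, even if execute() had thrown
        System.out.println(LOG);
    }
}
```

In the real client, Connection, Statement, and ResultSet all implement AutoCloseable, so the same pattern applies directly to them.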



Reposted from just2do.iteye.com/blog/2198361