Hive Daily Notes

1. SQL
select ROUND(RAND()*1000, 0) from dual;    -- generates a random number
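Hive itself has no built-in dual table (the query above is the Oracle form); on Hive 0.13+ the FROM clause can simply be omitted. A minimal sketch:

```sql
-- random integer in [0, 1000); no FROM clause needed on Hive 0.13+
select round(rand() * 1000, 0);
```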
 
2. Creating a temporary table in Hive

create TEMPORARY table up_speed as
select a.place, a.quart, a.year, a.kwh / b.kwh - 1 as elec_speed_up
from (select c.place, c.year, c.quart, sum(c.kwh) as kwh
      from user_h_growth c
      group by place, quart, year) a
JOIN (select d.place, d.year, d.quart, sum(d.kwh) as kwh
      from user_h_growth d
      group by place, quart, year) b
-- the original snippet is truncated here; the ON clause for the join is missing
 
3.
drop database aaa cascade;    -- I can never remember this command
 
4. Running Hive statements from Python also requires installing bison and the packages Thrift depends on:

yum -y install automake libtool flex bison pkgconfig gcc-c++ boost-devel libevent-devel zlib-devel python-devel ruby-devel crypto-utils openssl openssl-devel
 
8. Hive reports that a JAR it needs does not exist, even though it is present on the machine. Kill the process holding the port and restart with hive --service hiveserver. The cause: Ambari automatically starts a listener on port 10086 at boot, and that port cannot be used.
 

 
10.
ORA-12899: value too large for column. The column is too narrow; widen it.
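The usual fix is to widen the column with ALTER TABLE ... MODIFY. A sketch with hypothetical table and column names:

```sql
-- my_table / my_col are placeholders; choose a width large enough for the data
ALTER TABLE my_table MODIFY (my_col VARCHAR2(200));
```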
 
11. Hive join conditions must go in the ON clause. If you put them in WHERE, an outer join stops working and behaves like an inner join.
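The difference can be sketched with two hypothetical tables a and b:

```sql
-- Filter on b in the ON clause: rows of a without a match survive with NULLs,
-- so the LEFT OUTER JOIN really behaves as an outer join.
select a.id, b.val
from a left outer join b
  on a.id = b.id and b.flag = 1;

-- Same filter in WHERE: rows where b.* is NULL are thrown away afterwards,
-- so the result is effectively an inner join.
select a.id, b.val
from a left outer join b on a.id = b.id
where b.flag = 1;
```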
 
 

 
 
 
17.
Hive has no temporary tables in the relational-database sense; you have to create an ordinary table and drop it after use.
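The create-use-drop pattern, using the user_h_growth table from item 2 (the tmp_ table name is a placeholder):

```sql
-- materialize the intermediate result as an ordinary table
create table tmp_up_speed as
select place, quart, year, sum(kwh) as kwh
from user_h_growth
group by place, quart, year;

-- ... use tmp_up_speed in later queries ...

-- clean up when done
drop table tmp_up_speed;
```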
 
18. In Oracle, inserted data does not take effect until you COMMIT.
 
 

20.

Today a simple join kept failing with:

org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row

The error even pointed to a specific row, yet querying both tables showed the field exists in each.

The fix: put the smaller table into memory as a map join. With the parameter below set, Hive automatically chooses between a common join and a map join based on the SQL:

set hive.auto.convert.join = true;

hive.mapjoin.smalltable.filesize sets the small-table threshold; the default is 25 MB.
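The relevant settings, plus the older explicit hint form, can be sketched as follows (big_t and small_t are placeholder names):

```sql
set hive.auto.convert.join = true;               -- let Hive convert to map join automatically
set hive.mapjoin.smalltable.filesize = 25000000; -- "small table" threshold, ~25 MB (the default)

-- On older Hive versions the map join can also be forced with a hint:
select /*+ MAPJOIN(s) */ b.id, s.name
from big_t b join small_t s on b.id = s.id;
```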

 

21. Hive and Oracle differ in many functions and their parameters: Hive has no to_char, Oracle needs trunc and the like. The biggest difference is identifiers: in Hive every FROM-clause subquery must be given an alias, and computed columns need AS; in Oracle an inline view does not require an alias.
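For example, using the user_h_growth table from item 2 (in Hive the subquery alias t and the AS on the computed column are required):

```sql
select t.total
from (select sum(kwh) as total from user_h_growth) t;
```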

22.
Cannot query data from a table and store it into a CSV file:
 
2015-05-22 14:23:44,354 WARN org.apache.hadoop.mapred.Child: Error running child
java.lang.RuntimeException: java.lang.AbstractMethodError: com.bizo.hive.serde.csv.CSVSerde.getSerDeStats()Lorg/apache/hadoop/hive/serde2/SerDeStats;
    at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:161)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.AbstractMethodError: com.bizo.hive.serde.csv.CSVSerde.getSerDeStats()Lorg/apache/hadoop/hive/serde2/SerDeStats;
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:574)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
    at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:83)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:529)
    at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:143)
    ... 8 more
 
 
23. Caused by a libthrift JAR conflict; replace libthrift-0.9.0 with libthrift-0.8.0:
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
... 7 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
... 12 more
Caused by: java.lang.NoSuchMethodError: org.apache.thrift.EncodingUtils.setBit(BIZ)B
at org.apache.hadoop.hive.metastore.api.PrivilegeGrantInfo.setCreateTimeIsSet(PrivilegeGrantInfo.java:245)
at org.apache.hadoop.hive.metastore.api.PrivilegeGrantInfo.<init>(PrivilegeGrantInfo.java:163)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:563)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:398)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
 
24.

csv-serde download: http://ogrodnek.github.io/csv-serde/

Usage:

add jar path/to/csv-serde.jar;

create table my_table(a string, b string,...)
 row format serde 'com.bizo.hive.serde.csv.CSVSerde'
 stored as textfile
;

Custom delimiters:

add jar path/to/csv-serde.jar;

create table my_table(a string, b string,...)
 row format serde 'com.bizo.hive.serde.csv.CSVSerde'
 with serdeproperties ("separatorChar"="\t", "quoteChar"="'", "escapeChar"="\\")
 stored as textfile
;
 
      

Reposted from 719607746.iteye.com/blog/2397524