Hive: multiple outputs from a single pass over the same data (a time-saving pattern)

Hive provides a distinctive syntax that lets you produce multiple outputs or aggregations from a single data source without rescanning it once per output. For large input datasets, the time saved can be very significant.


Example:

    hive> FROM table1
        > INSERT OVERWRITE TABLE table2 SELECT * WHERE action = 'xx1'
        > INSERT OVERWRITE TABLE table3 SELECT * WHERE action = 'xx2';


This way table1 is scanned only once.

Verified firsthand: a single statement can insert into different partitions of the same table at the same time.
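
A minimal sketch of that pattern, assuming a hypothetical target table2 partitioned by dt (the column and partition names here are made up, not from the original post):

    hive> FROM table1
        > INSERT OVERWRITE TABLE table2 PARTITION (dt = '20140301') SELECT col1, col2 WHERE action = 'xx1'
        > INSERT OVERWRITE TABLE table2 PARTITION (dt = '20140302') SELECT col1, col2 WHERE action = 'xx2';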


   Limitation: a single statement cannot insert into the same table twice, for example:

    

    hive> FROM table1
        > INSERT INTO TABLE table2 SELECT * WHERE action = 'xx1'
        > INSERT INTO TABLE table2 SELECT * WHERE action = 'xx2';

This raises an error.


   Use case:

      Especially suited to user-count calculations that would otherwise require a Cartesian-style join, e.g. computing, for each of 15 days, that day's monthly (trailing 30-day) active user count.

       Original SQL (MySQL syntax):

    SELECT b.day_id, a.col1, a.col2, COUNT(DISTINCT col4) AS num
      FROM TABLE1 a, TABLE2 b
     WHERE a.op_day_id >= date_format(date_add('20140310', INTERVAL -43 DAY), '%Y%m%d')
       AND a.op_day_id <= b.day_id
       AND a.op_day_id >= b.month_day_id
     GROUP BY b.day_id, a.col1, a.col2;

   -- Note: TABLE2 is a date table; day_id is the given day, month_day_id is the date 30 days earlier, and the table has 15 rows in total.

   Execution time: over 10,000 seconds

  

   Optimized version (Hive multi-insert): each INSERT fixes one target day and its trailing 30-day window as constants, so the join to TABLE2 is no longer needed and TABLE1 is scanned only once for all 15 days:

FROM TABLE1
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table1.txt' SELECT 20140303, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140303 AND col3 >= 20140201 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table2.txt' SELECT 20140302, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140302 AND col3 >= 20140131 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table3.txt' SELECT 20140301, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140301 AND col3 >= 20140130 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table4.txt' SELECT 20140228, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140228 AND col3 >= 20140129 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table5.txt' SELECT 20140227, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140227 AND col3 >= 20140128 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table6.txt' SELECT 20140226, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140226 AND col3 >= 20140127 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table7.txt' SELECT 20140225, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140225 AND col3 >= 20140126 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table8.txt' SELECT 20140224, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140224 AND col3 >= 20140125 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table9.txt' SELECT 20140223, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140223 AND col3 >= 20140124 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table10.txt' SELECT 20140222, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140222 AND col3 >= 20140123 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table11.txt' SELECT 20140221, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140221 AND col3 >= 20140122 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table12.txt' SELECT 20140220, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140220 AND col3 >= 20140121 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table13.txt' SELECT 20140219, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140219 AND col3 >= 20140120 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table14.txt' SELECT 20140218, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140218 AND col3 >= 20140119 GROUP BY col1, col2
  INSERT OVERWRITE LOCAL DIRECTORY '/data/data_table/data_table15.txt' SELECT 20140217, col1, col2, 2160701, COUNT(DISTINCT col) WHERE col3 <= 20140217 AND col3 >= 20140118 GROUP BY col1, col2;

Time taken: 685.172 seconds
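
For reference, that is roughly a 15x improvement over the original query's 10,000+ seconds, because TABLE1 is scanned only once for all 15 output days instead of being joined against every row of the date table.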


Reposted from blog.csdn.net/zhuiqiuuuu/article/details/79170694