See the original author's blog post:
https://blog.csdn.net/qq_36743482/article/details/78418343
What I am adding here is partitioning an external table:
create external table t2(
    id int
    ,name string
    ,hobby array<string>
    ,add map<string,string>
)
partitioned by (pt_d string)
row format delimited
fields terminated by ','
collection items terminated by '-'
map keys terminated by ':'
location '/user/d4t2';
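The post does not show the contents of t2.txt, but the three delimiter clauses fully determine how one text line maps onto the table's columns. A minimal Python sketch (the sample row is assumed, not from the original) of that mapping:

```python
# One hypothetical line of t2.txt, matching the declared delimiters:
#   fields ','   collection items '-'   map keys ':'
sample_line = "1,zhangsan,reading-coding,addr:beijing"

def parse_row(line):
    """Split one delimited text line into the four columns of table t2."""
    id_, name, hobby, add = line.split(",")
    return {
        "id": int(id_),
        "name": name,
        # array<string>: elements separated by '-'
        "hobby": hobby.split("-"),
        # map<string,string>: entries separated by '-', key and value by ':'
        "add": dict(kv.split(":", 1) for kv in add.split("-")),
    }

print(parse_row(sample_line))
```

This is only an illustration of the serialization format; Hive's own SerDe does the equivalent splitting when the table is queried.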
Load data into one partition:
load data local inpath '/home/hadoop/apps/hiveData/t2.txt' overwrite into table t2 partition (pt_d = '201701');
Load the same file again, into a different partition:
load data local inpath '/home/hadoop/apps/hiveData/t2.txt' overwrite into table t2 partition (pt_d = '201801');
If you now list the external table's location in HDFS (/user/d4t2), you will see two partition subdirectories, pt_d=201701 and pt_d=201801, each containing its own copy of the loaded file.