Sqoop import from MySQL to Hive

After installing Sqoop, you may need to copy hive-common-3.1.2.jar from the Hive installation directory into Sqoop's lib directory (otherwise Sqoop may fail to resolve the Hive installation directory variable from its configuration).
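
For example, on a typical setup (a sketch; /opt/hive and /opt/sqoop are assumed install paths, adjust them to your environment):

# /opt/hive and /opt/sqoop are assumed install paths; adjust as needed
cp /opt/hive/lib/hive-common-3.1.2.jar /opt/sqoop/lib/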
1. ## View the data on HDFS
hadoop fs -ls /sqphdfsdata/dh_call_info2/datajob1004

2. ## Sqoop import from MySQL to HDFS
sqoop import --connect jdbc:mysql://localhost:3306/bgdmysqldb --username root --password '2019_Mysql' --table dh_call_info2 --target-dir /sqphdfsdata/dh_call_info2/datajob1004
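
To spot-check the imported rows on HDFS (a sketch; Sqoop map tasks normally write part-m-NNNNN files under the target directory):

# peek at the first mapper's output file
hadoop fs -cat /sqphdfsdata/dh_call_info2/datajob1004/part-m-00000 | head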

3. ## Create a table in Hive with the same structure as the MySQL table

sqoop create-hive-table --connect jdbc:mysql://192.168.91.112:3306/bgdmysqldb --username root --password '2019_Mysql' --table dh_call_info2
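
By default this creates the Hive table under the current database with the MySQL table's name. To place it in a specific database and match the '\t' delimiter used in step 4, the shared Hive arguments can be added (a sketch; --hive-table and --fields-terminated-by are standard Sqoop options, and --hive-database is the same flag used in step 5, though support may depend on your Sqoop version):

sqoop create-hive-table \
--connect jdbc:mysql://192.168.91.112:3306/bgdmysqldb \
--username root \
--password '2019_Mysql' \
--table dh_call_info2 \
--hive-database rdw \
--hive-table dh_call_info2 \
--fields-terminated-by '\t'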

4. ## Create the table on the Hive side
CREATE TABLE IF NOT EXISTS `rdw.dh_call_info2`
(id BIGINT, telephone string, name string, create_time int, update_time int)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' STORED AS TEXTFILE;
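
The definition can then be checked from the shell (a sketch, assuming the hive CLI is on the PATH):

# show the columns and types of the newly created table
hive -e "DESCRIBE rdw.dh_call_info2;"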

5. ## Sqoop import into Hive
sqoop import \
--connect jdbc:mysql://192.168.91.112:3306/bgdmysqldb \
--username root \
--password '2019_Mysql' \
--table dh_call_info2 \
--fields-terminated-by '\t' \
--num-mappers 1 \
--hive-import \
--hive-database default \
--hive-table dh_call_info2 \
--delete-target-dir
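
Here --delete-target-dir removes any leftover HDFS target directory before the job runs, so the import can be re-run without failing. Afterwards the rows can be sampled (a sketch, assuming the hive CLI is on the PATH; note that this import loads default.dh_call_info2, not the rdw table from step 4):

# sample a few rows from the imported Hive table
hive -e "SELECT * FROM default.dh_call_info2 LIMIT 5;"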
