Sqoop commands

1. List the databases

 sqoop list-databases --connect jdbc:mysql://192.168.1.9:3306/ -username root -password root;
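As a side note, passing the password on the command line leaves it in the shell history; Sqoop's -P flag prompts for it interactively instead. A minimal sketch of the same listing call, assuming the same MySQL host:

sqoop list-databases --connect jdbc:mysql://192.168.1.9:3306/ --username root -P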

2. List the tables in one database

sqoop list-tables --connect jdbc:mysql://192.168.1.9:3306/test -username root -password root;
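Before importing, sqoop eval can run a quick query against the source database to preview the data. A small sketch, assuming the userinfo table used in the later steps:

sqoop eval --connect jdbc:mysql://192.168.1.9:3306/test --username root --password root --query "SELECT * FROM userinfo LIMIT 5";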

3. Import only the table structure from MySQL to Hive

sqoop create-hive-table --connect jdbc:mysql://192.168.1.9:3306/test --table userinfo -username root -password root --hive-table hive_userinfo --fields-terminated-by "," --lines-terminated-by "\n";

It is better to specify the HDFS location of the table data explicitly when creating the table in Hive:

create table hive_userinfo(id int, name string, age int, address string) row format delimited fields terminated by ',' location '/user/hivetest/userinfo';
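To check that the structure landed in Hive as expected, the table definition can be inspected from the Hive shell; a quick sketch:

show create table hive_userinfo;

describe formatted hive_userinfo;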


4. Import data (the table records) from MySQL to Hive

sqoop import --connect jdbc:mysql://192.168.1.9:3306/test -username root -password root --table userinfo --hive-import --hive-table hive_userinfo -m 2 --fields-terminated-by ",";

-m 2 means two map tasks are used to run the import in parallel
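When more than one map task is used, Sqoop splits the work on the table's primary key by default; if the table has no suitable primary key, a split column can be named explicitly with --split-by. A sketch of the same import, assuming the id column from the table above:

sqoop import --connect jdbc:mysql://192.168.1.9:3306/test -username root -password root --table userinfo --hive-import --hive-table hive_userinfo -m 2 --split-by id --fields-terminated-by ",";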

5. Export data from HDFS to MySQL

sqoop export --connect jdbc:mysql://192.168.1.9:3306/test -username root -password root --table userinfo1 --export-dir /user/hivetest/userinfo/part-m-00000 --input-fields-terminated-by ',';
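Note that sqoop export does not create the target table; userinfo1 must already exist in MySQL with a compatible layout. A minimal sketch, assuming the same columns as hive_userinfo:

create table userinfo1 (id int, name varchar(100), age int, address varchar(255));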

Reference: http://www.jb51.net/LINUXjishu/43356.html


Reposted from fypop.iteye.com/blog/2217562