Sqoop import and export

1. List the available databases
sqoop list-databases \
--connect jdbc:mysql://192.168.85.3:3306/hadoop \
--username root \
--password Caofeng2012@
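Passing the password on the command line leaves it in the shell history. A minimal alternative sketch using --password-file; the /home/sqoop1.4.7/.mysql.pwd path is just an assumed location, and the file:// prefix assumes a local path is acceptable in this setup:

# store the password in a file readable only by the current user (assumed path)
echo -n 'Caofeng2012@' > /home/sqoop1.4.7/.mysql.pwd
chmod 400 /home/sqoop1.4.7/.mysql.pwd

sqoop list-databases \
--connect jdbc:mysql://192.168.85.3:3306/hadoop \
--username root \
--password-file file:///home/sqoop1.4.7/.mysql.pwd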

2. Simple import into HDFS
sqoop import \
--connect jdbc:mysql://192.168.85.3:3306/hadoop \
--username root \
--password Caofeng2012@ \
--table t_student \
-m 1 \
--target-dir /sqo/01 --delete-target-dir;

Checking the result on HDFS:
[root@localhost conf]# hadoop fs -ls /sqo/01;
Found 2 items
-rw-r--r-- 1 root supergroup 0 2018-07-18 20:31 /sqo/01/_SUCCESS
-rw-r--r-- 1 root supergroup 39 2018-07-18 20:31 /sqo/01/part-m-00000
[root@localhost conf]# hadoop fs -cat /sqo/01/part-m-00000;
11,111,1111
11,11,11
222,222,222
3,3,3
[root@localhost conf]#
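With -m 1 a single mapper writes everything into one part-m-00000 file, using the default comma delimiter. A sketch of the same import with two mappers and a tab delimiter; /sqo/02 is an assumed target directory, and --split-by id assumes the table has a numeric id column (the query examples below suggest it does):

sqoop import \
--connect jdbc:mysql://192.168.85.3:3306/hadoop \
--username root \
--password Caofeng2012@ \
--table t_student \
--split-by id \
--fields-terminated-by '\t' \
-m 2 \
--target-dir /sqo/02 --delete-target-dir;
# the target directory should now contain part-m-00000 and part-m-00001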

3. Import the result of a MySQL query into HDFS
sqoop import \
--connect jdbc:mysql://192.168.85.3:3306/hadoop \
--username root \
--password Caofeng2012@ \
--target-dir /sqo/03 \
--query 'select id,name from t_student where $CONDITIONS and id="3"' \
--split-by id \
--fields-terminated-by '\t' \
-m 4;
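$CONDITIONS is a placeholder that Sqoop replaces with a range predicate on the --split-by column, so each of the 4 mappers pulls a disjoint slice of id values. Since the query also filters on id="3", only one slice actually contains data. A quick check of the output:

hadoop fs -ls /sqo/03
hadoop fs -cat /sqo/03/part-m-*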
Besides setting up the environment, you also need to copy a Hive jar into Sqoop's lib directory:
cp /home/hive/lib/hive-common-2.3.3.jar /home/sqoop1.4.7/lib
Otherwise the import fails with:
Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
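Copying hive-common works; another workaround that is sometimes used instead is to expose the Hive libraries on the classpath before running the import (paths assume the same /home/hive layout as above):

export HIVE_HOME=/home/hive
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*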

4. Import into Hive
sqoop import \
--connect jdbc:mysql://192.168.85.3:3306/hadoop \
--username root \
--password Caofeng2012@ \
--query 'select id,name from t_student where $CONDITIONS and id="3"' \
--split-by id \
--fields-terminated-by '\t' \
--create-hive-table --hive-import --hive-overwrite \
--target-dir /temp3/ \
--hive-table hivetest.student16 --delete-target-dir ;
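To confirm the import, query the new table from the Hive CLI (a quick check, assuming the hive command is on the PATH):

hive -e 'select * from hivetest.student16;'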

The examples below are adapted from another reference.
Import MySQL data into HDFS with Sqoop

sqoop import --connect jdbc:mysql://localhost:3306/hadoop \
--username root --password Caofeng2012@ \
--table user --columns 'uid,uname' -m 1 \
--target-dir '/sqoop/user';

Import MySQL data into Hive with Sqoop

sqoop import --hive-import --connect jdbc:mysql://localhost:3306/hadoop \
--username root --password Caofeng2012@ \
--table user \
--columns 'uid,uname' -m 1 ;
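Without --hive-table, the Hive table name defaults to the source table name, here user. A quick check (user is backticked in the sketch because USER is a reserved word in newer Hive versions):

hive -e 'show tables;'
hive -e 'select uid, uname from `user` limit 10;'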

Import MySQL data into Hive with Sqoop, using a query
sqoop import --hive-import --connect jdbc:mysql://localhost:3306/hadoop \
--username root --password Caofeng2012@ -m 1 --hive-table user6 \
--query 'select * from user where uid<10 and $CONDITIONS' \
--target-dir /sqoop/user5;

Export data from Hive to MySQL with Sqoop

sqoop export --connect jdbc:mysql://localhost:3306/hadoop \
--username root --password Caofeng2012@ -m 1 \
--table user5 --export-dir /user/hive/warehouse/user6 ;
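sqoop export does not create the target table, so user5 has to exist in MySQL with a compatible schema beforehand; the uid/uname columns below are assumptions based on the earlier import. Also, if the Hive table was stored with Hive's default ^A field delimiter instead of commas, the export needs --input-fields-terminated-by. A hedged sketch:

# create the target table first (column names and types are assumed)
mysql -uroot -pCaofeng2012@ hadoop -e "create table if not exists user5 (uid int, uname varchar(100));"

sqoop export --connect jdbc:mysql://localhost:3306/hadoop \
--username root --password Caofeng2012@ -m 1 \
--table user5 --export-dir /user/hive/warehouse/user6 \
--input-fields-terminated-by '\001';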

Reposted from blog.csdn.net/lkpklpk/article/details/81107768