Sqoop: Moving Data Between MySQL and Hive

Part 1: Importing data from MySQL into Hive

1. The figure below showed the MySQL database; the goal is to import the data in the testhive table into Hive. (Screenshot not reproduced here.)
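Since the screenshot is missing, here is a hypothetical reconstruction of the source table, inferred from the Hive schema returned later in this post (a single int column, testhive.id, holding the values 0, 1, and 2):

```sql
-- Hypothetical DDL; the actual definition was in the omitted screenshot.
-- Column name and type are inferred from the Hive query output below.
CREATE TABLE testhive (
  id INT
);
INSERT INTO testhive (id) VALUES (0), (1), (2);
```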


2. [hadoop@master bin]$  sqoop import --connect jdbc:mysql://192.168.145.128:3306/test --username root --password 123456 --table testhive  -m 1 --hive-import
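In the command above, --connect gives the JDBC URL of the source database, --table names the MySQL table to read, -m 1 runs a single map task (no split column is needed), and --hive-import loads the result into Hive's default database under a table of the same name. As a sketch (the database and table names mydb and mytable here are hypothetical, and this assumes a Sqoop 1.4.x installation on the cluster), the target can be redirected explicitly:

```shell
# Sketch only: mydb and mytable are hypothetical names, not from the original post.
sqoop import \
  --connect jdbc:mysql://192.168.145.128:3306/test \
  --username root --password 123456 \
  --table testhive \
  -m 1 \
  --hive-import \
  --hive-database mydb \
  --hive-table mytable \
  --hive-overwrite   # replace existing table data on re-runs
```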

3. The table and its data are now visible in Hive, confirming that the import succeeded:

0: jdbc:hive2://192.168.145.128:10000> select * from testhive;
INFO  : Compiling command(queryId=hadoop_20180307155656_2a6f16c9-93ab-4e2c-a5b5-f776006dd4e6): select * from testhive
INFO  : Semantic Analysis Completed
INFO  : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:testhive.id, type:int, comment:null)], properties:null)
INFO  : Completed compiling command(queryId=hadoop_20180307155656_2a6f16c9-93ab-4e2c-a5b5-f776006dd4e6); Time taken: 1.613 seconds
INFO  : Concurrency mode is disabled, not creating a lock manager
INFO  : Executing command(queryId=hadoop_20180307155656_2a6f16c9-93ab-4e2c-a5b5-f776006dd4e6): select * from testhive
INFO  : Completed executing command(queryId=hadoop_20180307155656_2a6f16c9-93ab-4e2c-a5b5-f776006dd4e6); Time taken: 0.0 seconds
INFO  : OK
+--------------+--+
| testhive.id  |
+--------------+--+
| 0            |
| 1            |
| 2            |
+--------------+--+
3 rows selected (1.989 seconds)

0: jdbc:hive2://192.168.145.128:10000> 

Part 2: Exporting data from Hive to MySQL

1. First, create the table flwordcount in MySQL to hold the exported data. (Screenshot of the table definition not reproduced here.)
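The screenshot of the table definition is missing; since the Hive source directory is a word-count result, the MySQL table presumably has a word column and a count column. A hypothetical definition (column names and types are assumptions, not from the original post):

```sql
-- Hypothetical schema; the real one was in the omitted screenshot.
CREATE TABLE flwordcount (
  word  VARCHAR(100),
  count INT
);
```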


2. sqoop export --connect jdbc:mysql://192.168.145.128:3306/test --username root --password 123456 --table flwordcount --export-dir /user/hive/warehouse/flwordcount   --input-null-string '\\N' --input-null-non-string '\\N' --fields-terminated-by '\t'
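In the export command, --export-dir points at the Hive table's warehouse directory in HDFS, --fields-terminated-by '\t' must match the field delimiter the Hive table was stored with, and the two --input-null-* options map Hive's NULL encoding (the literal \N) back to SQL NULL for string and non-string columns respectively. Afterwards the result can be checked back in MySQL (a sketch; the actual rows depend on the word-count data):

```shell
# Verify the export from the MySQL side (password on the command line is for
# illustration only, matching the style of the original post).
mysql -h 192.168.145.128 -u root -p123456 \
  -e "SELECT * FROM test.flwordcount LIMIT 10;"
```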



Reposted from blog.csdn.net/feilong2483/article/details/79472916