Sqoop (5): exporting from HBase to MySQL

I. Goal

1. Purpose: export data from HBase into MySQL using Sqoop. Sqoop has no direct HBase-to-MySQL export, so Hive is used as an intermediate step.

2. Environment: Hadoop 2.7.3, Hive 1.2.1, HBase 0.98.6, Sqoop 1.4.7 (binary distribution).

II. Steps

1. Start the required services

HDFS
YARN (this one must be running; Sqoop submits MapReduce jobs to it)
ZooKeeper
Hive metastore
HBase
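On a typical single-node setup, starting these looks roughly like the following. Treat it as a sketch: the `*_HOME` paths and the way the metastore is launched depend on your install.

```shell
$HADOOP_HOME/sbin/start-dfs.sh          # HDFS
$HADOOP_HOME/sbin/start-yarn.sh         # YARN (Sqoop's MapReduce jobs run here)
$ZOOKEEPER_HOME/bin/zkServer.sh start   # ZooKeeper
nohup hive --service metastore &        # Hive metastore, in the background
$HBASE_HOME/bin/start-hbase.sh          # HBase
```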

2. Create the HBase table

create 'course_clickcount_to_mysql','info'

Insert some rows:

put 'course_clickcount_to_mysql',20181117_1,'info:click_count',100
put 'course_clickcount_to_mysql',20181117_2,'info:click_count',2000

Note: the row keys and values above are unquoted, so the HBase shell (JRuby) reads 20181117_1 as the integer 201811171 (the underscore is a Ruby digit separator). That is why the keys in the output further down have no underscore; quote them as '20181117_1' if you want the underscore preserved.
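To sanity-check the inserts, scan the table from inside the HBase shell (same table name as created above):

```shell
# run inside `hbase shell`
scan 'course_clickcount_to_mysql'
```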

3. Create an external table in Hive

Purpose: map a Hive table onto the HBase table so its data can be queried from Hive.

CREATE EXTERNAL TABLE course_clickcount_ext(
  key string,
  click_count int
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,info:click_count")
TBLPROPERTIES ("hbase.table.name" = "course_clickcount_to_mysql");

Note: hbase.table.name must match the HBase table created in step 2, and hbase.columns.mapping must not contain whitespace.

4. Create an internal staging table in Hive, to be exported to MySQL

CREATE TABLE course_clickcount_export(
  key string,
  click_count INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054';
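'\054' is the octal escape for ASCII 44, a comma, so the staging table is stored as plain comma-delimited text, which lines up with Sqoop's default input field delimiter. You can confirm the octal value in any shell:

```shell
# octal 054 == decimal 44 == ','
printf '\054\n'
```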

Load the external table's rows into the staging table:

insert overwrite table course_clickcount_export select * from course_clickcount_ext;

Check:

select * from course_clickcount_export;
Result:
course_clickcount_export.key    course_clickcount_export.click_count
201811171       100
201811172       2000

5. Create the target table in MySQL

create table course_clickcount_sqoop(
rowkey varchar(40) not null,
id int(11) not null
)charset=utf8;

6. Export from Hive to MySQL

sqoop export \
  --connect 'jdbc:mysql://bigdata.ibeifeng.com:3306/imooc_spark?useSSL=false' \
  --username root --password 123456 \
  --table course_clickcount_sqoop \
  --export-dir /user/hive/warehouse/imooc.db/course_clickcount_export

(The JDBC URL is quoted because the unquoted ? would otherwise be treated as a shell glob character.)
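The export works with Sqoop's defaults because the staging table is comma-delimited, but it can be worth spelling the delimiter and parallelism out. A variant of the command with explicit flags (same cluster-specific host, credentials, and paths as above):

```shell
sqoop export \
  --connect 'jdbc:mysql://bigdata.ibeifeng.com:3306/imooc_spark?useSSL=false' \
  --username root --password 123456 \
  --table course_clickcount_sqoop \
  --export-dir /user/hive/warehouse/imooc.db/course_clickcount_export \
  --input-fields-terminated-by ',' \
  -m 1   # one mapper is plenty for two rows
```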
 

Verified in MySQL:

mysql> select * from course_clickcount_sqoop;
+-----------+------+
| rowkey    | id   |
+-----------+------+
| 201811171 |  100 |
| 201811172 | 2000 |
+-----------+------+
2 rows in set (0.00 sec)

Reprinted from blog.csdn.net/u010886217/article/details/84194719