Using Sqoop to import and export data between Hive and Oracle

Recently I needed to move data between Hive and Oracle at work, so I've written up the steps below for reference.


1. Import Oracle data into Hive


/usr/bin/sqoop import \
--connect jdbc:oracle:thin:@72.*.*.185:1521:dbcxj2 \
--username name --password pw \
--table tableName \
-m 1 \
--fields-terminated-by "," \
--hive-database hiveName \
--hive-table hiveTable \
--hive-import \
--hive-overwrite

## Explanation
72.*.*.185: IP address of the server hosting the Oracle database;
dbcxj2: the Oracle SID;
name: the username for logging in to the Oracle database;
pw: the password for logging in to the Oracle database;
tableName: the name of the Oracle table whose data is being imported;
hiveName: the Hive database to import into (create it in advance);
hiveTable: the Hive table to import into (create it in advance).
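
Since the Hive database and table are expected to exist before the import, here is a minimal sketch of creating them from the command line. The two-column schema is purely illustrative, and hiveName/hiveTable are the same placeholders used above, so adjust both to your real table:

# Pre-create the Hive database and a matching table (illustrative schema)
hive -e "CREATE DATABASE IF NOT EXISTS hiveName;"
hive -e "CREATE TABLE IF NOT EXISTS hiveName.hiveTable (
           id    STRING,
           value STRING
         )
         ROW FORMAT DELIMITED
         FIELDS TERMINATED BY ','
         STORED AS TEXTFILE;"

The comma field terminator here matches the --fields-terminated-by "," used in the import command above.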

2. Export Hive data to Oracle


/usr/bin/sqoop export \
--connect jdbc:oracle:thin:@72.*.*.185:1521:dbcxj2 \
--username name \
--password pw \
--table tableName \
--export-dir dir \
--fields-terminated-by ',' \
-m 1

## Explanation
72.*.*.185: IP address of the server hosting the Oracle database;
dbcxj2: the Oracle SID;
name: the username for logging in to the Oracle database;
pw: the password for logging in to the Oracle database;
tableName: the name of the Oracle table that receives the data (it must already exist in Oracle);
dir: the absolute HDFS path of the Hive table's data.
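
If you are unsure what to pass as --export-dir, the table's storage location can be read from Hive itself. A minimal sketch follows; the warehouse path shown is only the common default, so treat it as an assumption and check the Location value reported on your cluster:

# Print the table's HDFS location (look for the "Location:" line)
hive -e "DESCRIBE FORMATTED hiveName.hiveTable;" | grep -i "Location"

# The default warehouse layout usually looks like this; confirm the data files exist
hdfs dfs -ls /user/hive/warehouse/hiveName.db/hiveTable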

If you ran into any problems along the way, feel free to leave a comment and share what you encountered.
