Commands

export (hive --> mysql):
sqoop export --connect jdbc:mysql://localhost:3306/test --username root --password root --table test --export-dir '/user/hive/warehouse/test' --input-fields-terminated-by '\001' --input-null-string '\\N' --input-null-non-string '\\N' --input-lines-terminated-by '\n' -m 1
import (mysql --> hdfs):
sqoop import --connect jdbc:mysql://localhost:3306/test --username root --password root --table test --as-textfile --delete-target-dir --fields-terminated-by '\001' --target-dir /user/hive/warehouse/test
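Both commands use '\001' (Ctrl-A, Hive's default field delimiter for text tables) and the literal string '\N' for NULL values, which is why the delimiter flags must match on both sides. A minimal Python sketch of how one row in such a warehouse text file is parsed (the sample row and function name are illustrative, not part of the commands above):

```python
# Hive's default text layout: fields separated by \x01 (Ctrl-A),
# rows terminated by \n, NULL stored as the literal string "\N".
FIELD_SEP = "\x01"
NULL_TOKEN = "\\N"

def parse_hive_line(line: str):
    """Split one warehouse text row into values, mapping "\\N" to None."""
    return [None if f == NULL_TOKEN else f
            for f in line.rstrip("\n").split(FIELD_SEP)]

# Hypothetical row: id=1, name=NULL, city='beijing'
row = parse_hive_line("1\x01\\N\x01beijing\n")
print(row)  # ['1', None, 'beijing']
```

If the delimiters in the sqoop command do not match the file layout, every line ends up as a single malformed field, which is the most common cause of export failures.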
hdfs --> hive data loading:
LOAD DATA LOCAL INPATH 'dim_csl_rule_config.txt' OVERWRITE INTO TABLE dim.dim_csl_rule_config;
where LOCAL means the file is read from the local filesystem of the machine running the Hive client.
To load from HDFS instead, simply omit LOCAL:
LOAD DATA INPATH 'dim_csl_rule_config.txt' OVERWRITE INTO TABLE dim.dim_csl_rule_config;
LOAD DATA INPATH '/user/hdfs/user/hive/warehouse/myTable' OVERWRITE INTO TABLE myTable;
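Note the behavioral difference between the two forms: a LOCAL load copies the source file into the table's warehouse directory, while a plain (HDFS) load moves it, and OVERWRITE first clears the directory. A small Python sketch of that semantics, using a temporary sandbox in place of HDFS (paths and file names are illustrative):

```python
import pathlib
import shutil
import tempfile

def load_data(src: pathlib.Path, table_dir: pathlib.Path,
              local: bool, overwrite: bool) -> None:
    """Mimic Hive's LOAD DATA: OVERWRITE clears the table directory;
    a LOCAL load copies the file, a non-LOCAL (HDFS) load moves it."""
    table_dir.mkdir(parents=True, exist_ok=True)
    if overwrite:
        for old in table_dir.iterdir():
            old.unlink()
    if local:
        shutil.copy(src, table_dir / src.name)       # source file stays
    else:
        shutil.move(str(src), table_dir / src.name)  # source file is gone

# Demo in a temporary directory standing in for the warehouse.
base = pathlib.Path(tempfile.mkdtemp())
src = base / "dim_csl_rule_config.txt"
src.write_text("1\x01rule_a\n")
tbl = base / "warehouse" / "dim_csl_rule_config"
load_data(src, tbl, local=False, overwrite=True)
print(sorted(p.name for p in tbl.iterdir()))  # ['dim_csl_rule_config.txt']
print(src.exists())  # False: a non-LOCAL load moves the source file
```

This is why a plain LOAD DATA makes the original HDFS file disappear from its old location after the load completes.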