DataX (5): read HDFS data and write to MySQL

1 Rename the file written in the previous case

DataX's hdfswriter appends a random suffix to the file it writes, so a wildcard is needed to match the actual name:

hadoop fs -mv /handsome.txt* /handsome.txt

 

2 View the official template

python /opt/module/datax/bin/datax.py -r hdfsreader -w mysqlwriter

 

3 Create the configuration file job/hdfs2mysql.json

{
	"job": {
		"content": [{
			"reader": {
				"name": "hdfsreader",
				"parameter": {
					"column": ["*"],
					"defaultFS": "hdfs://hadoop138.com:9000",
					"encoding": "UTF-8",
					"fieldDelimiter": "\t",
					"fileType": "text",
					"path": "/handsome.txt"
				}
			},
			"writer": {
				"name": "mysqlwriter",
				"parameter": {
					"column": [
						"id",
						"name"
					],
					"connection": [{
						"jdbcUrl": "jdbc:mysql://192.168.222.132:3306/datax",
						"table": ["handsome2"]
					}],
					"password": "123456",
					"username": "root",
					"writeMode": "insert"
				}
			}
		}],
		"setting": {
			"speed": {
				"channel": "1"
			}
		}
	}
}
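Before submitting the job, it can be worth sanity-checking the JSON structure, since DataX only reports malformed configs at runtime. A minimal sketch in plain Python (the job definition is inlined here so the check runs standalone; in practice you would read job/hdfs2mysql.json from disk):

```python
import json

# Job definition from step 3, inlined for a self-contained check.
JOB = """
{
  "job": {
    "content": [{
      "reader": {
        "name": "hdfsreader",
        "parameter": {
          "column": ["*"],
          "defaultFS": "hdfs://hadoop138.com:9000",
          "encoding": "UTF-8",
          "fieldDelimiter": "\\t",
          "fileType": "text",
          "path": "/handsome.txt"
        }
      },
      "writer": {
        "name": "mysqlwriter",
        "parameter": {
          "column": ["id", "name"],
          "connection": [{
            "jdbcUrl": "jdbc:mysql://192.168.222.132:3306/datax",
            "table": ["handsome2"]
          }],
          "password": "123456",
          "username": "root",
          "writeMode": "insert"
        }
      }
    }],
    "setting": {"speed": {"channel": "1"}}
  }
}
"""

job = json.loads(JOB)
content = job["job"]["content"][0]
reader, writer = content["reader"], content["writer"]

# Plugin names must match the template names from step 2.
assert reader["name"] == "hdfsreader"
assert writer["name"] == "mysqlwriter"

# hdfsreader reads all fields ("*"); the writer must name each target column.
assert writer["parameter"]["column"] == ["id", "name"]

# Each connection needs a MySQL JDBC URL and at least one target table.
for conn in writer["parameter"]["connection"]:
    assert conn["jdbcUrl"].startswith("jdbc:mysql://")
    assert conn["table"], "at least one table required"

print("job config looks structurally sound")
```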

4 Create the handsome2 table in MySQL's datax database

The table name and columns must match the writer configuration above:

mysql> use datax;
mysql> create table handsome2(id int,name varchar(20));

5 Execute the task

Run from the DataX home directory (/opt/module/datax):

bin/datax.py job/hdfs2mysql.json

 

6 View the handsome2 table
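A quick query confirms the rows landed; the values will be whatever handsome.txt contained:

mysql> select * from handsome2;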

 


Origin blog.csdn.net/u013938578/article/details/130041107