Notes on a problem: importing tens of millions of rows of test data into MySQL is too slow!

The database uses a stored procedure to insert 10 million rows of test data, without any other optimization.

DELIMITER $$
CREATE PROCEDURE addmaxdata(IN n INT)
BEGIN
    DECLARE i INT DEFAULT 1;
    -- open one explicit transaction for the whole batch
    START TRANSACTION;
    WHILE (i <= n) DO
        INSERT INTO maxdata (name, createdate)
        VALUES (CONCAT('loser超', LAST_INSERT_ID()), NOW());
        SET i = i + 1;
    END WHILE;
    -- commit once, after all rows are inserted
    COMMIT;
END $$
DELIMITER ;

-- insert 10 million rows
CALL addmaxdata(10000000);

The key is to manage the transaction manually: open it outside the loop and commit once after all the data has been inserted, which is a qualitative leap in speed. Otherwise, each INSERT in the loop implicitly opens and commits its own transaction, and that per-row overhead dominates the run time.
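The same principle can be illustrated outside MySQL. A minimal Python sketch, using the stdlib sqlite3 module with a hypothetical maxdata table and a small row count for brevity, compares committing once per row against committing once for the whole batch:

```python
import sqlite3

def bulk_insert(conn, n, single_transaction=True):
    """Insert n rows; commit once at the end, or after every row."""
    cur = conn.cursor()
    for i in range(1, n + 1):
        cur.execute(
            "INSERT INTO maxdata (name, createdate) VALUES (?, datetime('now'))",
            (f"loser{i}",),
        )
        if not single_transaction:
            conn.commit()   # one transaction per row: slow
    if single_transaction:
        conn.commit()       # one commit for the whole batch: fast

# hypothetical schema mirroring the maxdata table from the procedure above
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE maxdata (id INTEGER PRIMARY KEY, name TEXT, createdate TEXT)"
)
bulk_insert(conn, 1000)
print(conn.execute("SELECT COUNT(*) FROM maxdata").fetchone()[0])  # 1000
```

With a file-backed database and a large n, the per-row-commit variant is dramatically slower, because every commit forces its own durable write, which is exactly the cost the stored procedure avoids.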
