MySQL pt-archiver

--where 'id<3000'      specifies the condition that selects the rows to operate on
--limit 10000          fetch rows from the source in batches of 10,000 per SELECT
--txn-size 1000        commit the transaction every 1,000 rows
--progress 5000        print progress information every 5,000 rows processed
--statistics           print statistics at the end: start time, end time, rows selected, rows inserted, rows deleted, total runtime, and the share of time spent in each phase, to make tuning easier. If --quiet is not given, pt-archiver prints its progress output by default
--charset=UTF8         set the connection character set to UTF8; it must match the character set of the database being operated on
--no-delete            do not delete the source data. Note: without this option, the source rows are deleted once processing completes
--bulk-delete          delete the old data from the source in batches
--bulk-insert          insert the data into the dest host in batches (the dest host's general log shows the rows being loaded with LOAD DATA LOCAL INFILE)
--purge                delete the matching rows from the source database
--local                do not write the ANALYZE/OPTIMIZE statements to the binlog (to avoid causing heavy replication delay)
--analyze=ds           run ANALYZE TABLE when the operation finishes ('d' means dest, 's' means source)
By default, pt-archiver does not run ANALYZE or OPTIMIZE on the source or dest table after it finishes, because these operations take time, and you need to estimate in advance whether there is enough disk space for the table copy. The general recommendation is to run this maintenance manually during an off-peak business window after pt-archiver completes: OPTIMIZE TABLE is what actually reclaims table space, while ANALYZE TABLE refreshes the index statistics.
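
A minimal sketch of that manual maintenance step, using the test123.c1 table from the examples below (OPTIMIZE copies the table, so confirm there is enough free disk space first):

    ANALYZE TABLE test123.c1;    -- refreshes index statistics only
    OPTIMIZE TABLE test123.c1;   -- rebuilds the table and reclaims the space freed by archiving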

1) Note that no spaces may appear in the DSN after --source, or an error occurs. The value of --where is a condition string, so wrap it in quotes.
2) --limit sets how many rows each batch processes (note: if the job runs for a long time, a larger value can be set to reduce the number of loop iterations); the actual deletes are still issued per primary key (WHERE pk=xx).
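
To make note 1) concrete, a small wrong-versus-right sketch (host, user, and table names are the placeholder values from the examples below):

    # wrong: spaces inside the --source DSN, and an unquoted --where condition
    pt-archiver --source h = 172.16.1.10, P = 3306, D = test123, t = c1 --purge --where id<3000
    # right: no spaces in the DSN, and the condition quoted
    pt-archiver --source h=172.16.1.10,P=3306,u=backup_user,p='xxx',D=test123,t=c1 --purge --where 'id<3000'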

Bulk insert versus non-bulk insert:
    non-bulk delete: DELETE FROM TABLE WHERE ..., one row deleted at a time
    non-bulk insert: INSERT INTO TABLE VALUES('...'), one row inserted at a time
    bulk delete:     DELETE FROM TABLE WHERE ... LIMIT 10000
    bulk insert:     LOAD DATA LOCAL INFILE 'file' INTO TABLE ...
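
As noted in the option list, the bulk-insert LOAD DATA statements can be observed in the dest host's general log. A minimal way to switch it on temporarily (standard MySQL; the log path is only an example):

    SET GLOBAL general_log_file = '/tmp/mysql-general.log';  -- example path, must be writable by mysqld
    SET GLOBAL general_log = ON;
    -- run the archive job, then turn the log back off:
    SET GLOBAL general_log = OFF;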

1. Archive the whole table without deleting the source data: non-bulk insert / bulk insert
pt-archiver --source h=172.16.1.10,P=3306,u=backup_user,p='xxx',D=test123,t=c1 --dest h=172.16.1.10,P=3306,u=backup_user,p='xxx',D=test123,t=c1 --charset=UTF8 --where '1=1' --progress 10000 --limit=10000 --txn-size 10000 --statistics --no-delete

pt-archiver --source h=172.16.1.10,P=3306,u=backup_user,p='xxx',D=test123,t=c1 --dest h=172.16.1.10,P=3306,u=backup_user,p='xxx',D=test123,t=c1 --charset=UTF8 --where '1=1' --progress 10000 --limit=10000 --txn-size 10000 --bulk-insert --bulk-delete --statistics --no-delete

2. Archive the data and delete it from the source table: non-bulk insert and delete / bulk insert and delete
pt-archiver --source h=172.16.1.10,P=3306,u=backup_user,p='xxx',D=test123,t=c1 --dest h=172.16.1.10,P=3306,u=backup_user,p='xxx',D=test123,t=c1 --charset=UTF8 --where '1=1' --progress 10000 --limit=10000 --txn-size 10000 --statistics --purge

pt-archiver --source h=172.16.1.10,P=3306,u=backup_user,p='xxx',D=test123,t=c1 --dest h=172.16.1.10,P=3306,u=backup_user,p='xxx',D=test123,t=c1 --charset=UTF8 --where '1=1' --progress 10000 --limit=10000 --txn-size 10000 --bulk-insert --bulk-delete --statistics --purge


3. Export the data to a file without deleting the source table's data
pt-archiver --source h=127.0.1.1,P=3306,D=test,t=test --charset=UTF8 --where 'itemID>100' --progress 1000 --file "/tmp/aa.txt" --limit=10000 --no-delete
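
The file written by --file is in the same format that SELECT ... INTO OUTFILE produces, so it can be reloaded later. A minimal re-import sketch for the file and table above (assuming local_infile is enabled on the server):

    LOAD DATA LOCAL INFILE '/tmp/aa.txt' INTO TABLE test.test;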

4. Force a specific index. The index name is given with the DSN option i; by default pt-archiver walks the PRIMARY key, which can be very slow when the table is large.

pt-archiver --source h='xx',P='3306',u='xx',p='xx',D='db_order',t='xx',i=index_createTime --dest h='xx',P='3306',u='xx',p='xx',D='xx',t='xx' --charset=utf8mb4 --where 'createTime<20180201000000' --progress 10000 --limit 10000 --statistics
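
Before forcing an index, it is worth confirming it exists on the source table; a quick standard check (db_order and index_createTime come from the example above, 'xx' is the placeholder table name):

    SHOW INDEX FROM db_order.xx;   -- confirm index_createTime appears in the Key_name column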

5. If archiving hits primary key conflicts, use --replace to resolve them: INSERTs into --dest are written as REPLACE statements

pt-archiver --source h='xx',P='3306',u='xx',p='xx',D='db_order',t='xx' --dest h='xx',P='3306',u='xx',p='xx',D='xx',t='xx' --charset=utf8mb4 --replace --where 'createTime<20180201000000' --progress 10000 --limit 10000 --statistics
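
For reference, MySQL's REPLACE resolves a duplicate key by deleting the old row and inserting the new one; a standalone illustration with a hypothetical table t whose primary key is id:

    REPLACE INTO t (id, val) VALUES (1, 'new');   -- if a row with id=1 exists it is deleted first, then the new row is inserted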

6. Use --dry-run to see the SQL pt-archiver would run and check which index the data query will use

pt-archiver --source h='xx',P='3306',u='xx',p='xx',D='db_order',t='xx',i=index_createTime --dest h='xx',P='3306',u='xx',p='xx',D='xx',t='xx' --charset=utf8mb4 --replace --where 'createTime<20180201000000' --progress 10000 --limit 10000 --statistics --dry-run
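
--dry-run prints the SELECT/INSERT/DELETE statements without executing them. The printed SELECT can then be fed to EXPLAIN to confirm the index choice; a sketch (this SELECT is illustrative, not pt-archiver's exact output):

    EXPLAIN SELECT * FROM db_order.xx FORCE INDEX(index_createTime) WHERE createTime < 20180201000000;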

7. Solution when the source and dest character sets are inconsistent: set the character set per connection with the DSN option A

pt-archiver --source h='xx',P='3306',u='xx',p='xx',D='db_order',t='xx',A=utf8mb4 --dest h='xx',P='3306',u='xx',p='xx',D='xx',t='xx' --charset=UTF8 --where 'createTime<20180201000000' --progress 10000 --limit 10000 --statistics
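
To see which character set each side actually needs, check the table definition first; a standard check (names from the example above, 'xx' is a placeholder):

    SHOW CREATE TABLE db_order.xx\G   -- the CHARSET= clause at the end shows the table's character set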


 
