Project proposal: re-cleaning old data

1. Background

  After the initial cleaning, data that is put to work on tasks continues to decay. However, when decayed data is cleaned again, some of it turns out to be usable (a successful task run is the signal that a piece of data is still available). The inference is that some data has a cooling-off period: once that period has passed, the data can perform tasks again. This observation motivates the design of a periodic re-cleaning function for old data.

2. Design

2.1. Maintain one history table per 10 days of data; for example, his_cookie_t2019082 and his_cookie_t2019083 hold the data for 2019-08-11 to 2019-08-20 and 2019-08-21 to 2019-08-31 respectively;
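As a rough sketch of the naming scheme implied by the two example tables above (the period index 1/2/3 for days 1-10, 11-20, and 21 to month end is an assumption inferred from those examples), the history table for a given date could be resolved like this:

```python
from datetime import date

def history_table_for(d: date) -> str:
    """Return the 10-day history table assumed to hold data for date d.

    Assumes the naming scheme his_cookie_t<YYYY><MM><N>, where N is the
    10-day period within the month: 1 = days 1-10, 2 = days 11-20,
    3 = days 21 to month end (matching his_cookie_t2019082 / ...3 above).
    """
    period = min((d.day - 1) // 10 + 1, 3)   # day 31 still falls in period 3
    return f"his_cookie_t{d.year:04d}{d.month:02d}{period}"

print(history_table_for(date(2019, 8, 15)))  # his_cookie_t2019082
print(history_table_for(date(2019, 8, 31)))  # his_cookie_t2019083
```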

2.2. Every 10 days (or at a longer interval), insert the keys from all of the history tables into the first-wash staging table cookie_clean_t, with duplicates removed;
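A minimal sketch of the deduplicated insert, assuming each history table exposes its key in a column called cookie_key (a hypothetical name) and that cookie_clean_t has a unique index on that column so INSERT IGNORE drops duplicates:

```python
import pymysql

# Hypothetical connection parameters; adjust for the real database.
conn = pymysql.connect(host="localhost", user="app", password="***", database="cookies")

history_tables = ["his_cookie_t2019082", "his_cookie_t2019083"]

with conn.cursor() as cur:
    for table in history_tables:
        # A unique index on cookie_clean_t.cookie_key makes INSERT IGNORE
        # skip keys that were already collected from an earlier table.
        cur.execute(
            f"INSERT IGNORE INTO cookie_clean_t (cookie_key) "
            f"SELECT DISTINCT cookie_key FROM {table}"
        )
conn.commit()
```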

2.3. Data that passes cleaning is stored in cookie_succ_t;

2.4. Match the cleaned data (cookie_succ_t) against the data the system is currently running with (cookie_used_t), and delete from cookie_succ_t any entries that already exist in cookie_used_t, so the same data is not allocated twice;
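Step 2.4 could be expressed as a single multi-table DELETE; the column name cookie_key and the connection details are assumptions carried over from the sketch above:

```python
import pymysql

conn = pymysql.connect(host="localhost", user="app", password="***", database="cookies")

with conn.cursor() as cur:
    # Remove from the freshly cleaned data any cookie the running system
    # already uses, so the same cookie is never handed out twice.
    cur.execute(
        "DELETE s FROM cookie_succ_t s "
        "JOIN cookie_used_t u ON u.cookie_key = s.cookie_key"
    )
conn.commit()
```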

2.5. The data in cookie_succ_t is then put online and allocated for tasks;

2.6. When a robot completes a task successfully, the uuid of the data it used is stored in redis;
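A sketch of step 2.6 using redis-py, assuming the successful uuids are collected in a Redis set (the key name cookie:succ_uuids is hypothetical):

```python
import redis

r = redis.Redis(host="localhost", port=6379, db=0)

def report_task_success(uuid: str) -> None:
    """Called after a robot completes a task with the given piece of data."""
    # A set keeps each uuid at most once, even if several tasks succeed
    # with the same cookie.
    r.sadd("cookie:succ_uuids", uuid)
```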

2.7. The system pulls the uuids out of redis and inserts them into the normal-operation data table (cookie_used_t), so the data goes back into regular use;
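And a matching sketch of step 2.7, draining that set and inserting the uuids into cookie_used_t (same assumed names as above):

```python
import pymysql
import redis

r = redis.Redis(host="localhost", port=6379, db=0)
conn = pymysql.connect(host="localhost", user="app", password="***", database="cookies")

with conn.cursor() as cur:
    while True:
        uuid = r.spop("cookie:succ_uuids")   # returns None once the set is empty
        if uuid is None:
            break
        # INSERT IGNORE assumes a unique index on cookie_used_t.uuid,
        # so a uuid that is already in use is simply skipped.
        cur.execute("INSERT IGNORE INTO cookie_used_t (uuid) VALUES (%s)", (uuid,))
conn.commit()
```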
