Frightened: using phpMyAdmin to recover accidentally deleted data

By the way, what a day of panic. I've been working intensively on the site http://www.meimei199.com/star/ recently, and I accidentally deleted the entire thing (sob), about 3 TB of files in total. There are backups, but restoring from them is slow; the estimate was a whole week, which made me want to hit somebody. That much was survivable, but I had deleted the database as well. Sigh. Time to find a remedy.

In desperation I googled and found that plenty of people had been just as careless as me, but I saw almost no concrete solutions (some said to use disk-recovery software, some said to recover from the binary files). I don't know much about server internals myself, so I felt completely stuck, while people kept pressing me about why the data was gone and insisting it had to be retrieved.
By now I was frantic, not even in the mood for dinner.
 
At this point I found some technicians who specialize in helping people recover data.
One asked about the amount of data and the database engine; when I said 'MyISAM', he replied: then there's no solution.
I was stunned.
 
In desperation I remembered that there used to be a replica database on the test server, but that data was stale: the latest month was missing entirely.
That alone wouldn't save me!
 
Then I spotted this in phpMyAdmin:

[screenshot]
The "binary recovery" mentioned in my Google results came to mind, so I clicked in and took a look. What a result! OMG, the server had binary logging enabled!

[screenshot]

There were database operation logs, and in SQL format!!!
I had never touched this feature before, but at this point it was my lifesaver.
I browsed through it: nearly a month's worth of update records, which was enough. One problem, though: phpMyAdmin displays only a small part of the content at a time, and fishing the deleted table's rows out of a hundred thousand or more log entries one by one would be far too painful.
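If you prefer the command line to phpMyAdmin, the same check can be made with the mysql client. A minimal sketch; the credentials are placeholders for your own server's:

```shell
# Is binary logging enabled? (The value should be ON.)
mysql -u root -p -e "SHOW VARIABLES LIKE 'log_bin';"

# Which binary log files does the server currently keep?
mysql -u root -p -e "SHOW BINARY LOGS;"
```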
 
At this point I decided to download these files from the server and extract the data from them.
No sooner said than done: I logged into the server and located the binary log files:

[screenshot]

The files circled in red in the screenshot above (the binary logs) are all that's needed,
 
and then export these files one by one into readable SQL:

[screenshot]

In this way each binary file is dumped out as a normal SQL file.
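The export step is done with the mysqlbinlog tool that ships with MySQL. A minimal sketch, assuming the logs live in /var/lib/mysql under the common default basename mysql-bin (check your server's log_bin setting for the real name and path):

```shell
# Dump each binary log to a readable .txt file of SQL statements.
# Statement-based logs come out as plain SQL; for row-based logs
# you would also need --base64-output=decode-rows -v.
for n in 000015 000016 000017 000018 000019; do
    mysqlbinlog /var/lib/mysql/mysql-bin.$n > $n.txt
done
```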
 
Now the records touching the deleted table had to be fished out of these files, so I wrote a small Java program to do it:
package com.nerve.sql.reload;

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.util.ArrayList;
import java.util.List;

import org.nerve.util.NumberUtil;

/**
 * @project: cloudOffice_swing
 * @file: ReloadWorker.java
 * @package: com.nerve.sql.reload
 * @description:
 * Pull the operation records for a given table out of the files
 * exported from the binary log
 * @author:  http://www.meimei199.com
 * @date&time: Jan 23, 2014
 * @change log:
 */
public class ReloadWorker {

    public void read(List<String> orgF, String targetF, String table) throws Exception{
        BufferedWriter bw = new BufferedWriter(new FileWriter(targetF, true));
        table = table.toUpperCase();

        for(String or : orgF){
            BufferedReader br = new BufferedReader(new FileReader(or));
            String t;
            String t2;
            while((t = br.readLine()) != null){
                t2 = t.toUpperCase();
                /*
                 * If it is an UPDATE on the target table, copy it out directly
                 */
                if(t2.startsWith("UPDATE " + table)){
                    bw.append(t + ";\n");
                }
                /*
                 * If it is an INSERT: some of those rows already exist (they
                 * came back with the old server's data), so emit a DELETE for
                 * that id first. NumberUtil.toDigital is my own helper that
                 * extracts the numeric id from the last comma-separated value.
                 */
                else if(t2.startsWith("INSERT INTO " + table)){
                    String ids = t2.substring(t2.lastIndexOf(","));
                    bw.append("delete from " + table + " where id=" + NumberUtil.toDigital(ids) + ";\n");
                    bw.append(t + ";\n");
                }
                /*
                 * A ';' must be appended after each statement, because the
                 * exported log has none; without it the import fails.
                 */
            }
            br.close();
        }

        bw.flush();
        bw.close();
    }

    public static void main(String[] args) throws Exception{
        long sd = System.currentTimeMillis();
        ReloadWorker w = new ReloadWorker();
        List<String> orgs = new ArrayList<String>();
        orgs.add("C:/Users/IBM_ADMIN/Desktop/000015.txt");
        orgs.add("C:/Users/IBM_ADMIN/Desktop/000016.txt");
        orgs.add("C:/Users/IBM_ADMIN/Desktop/000017.txt");
        orgs.add("C:/Users/IBM_ADMIN/Desktop/000018.txt");
        orgs.add("C:/Users/IBM_ADMIN/Desktop/000019.txt");

        String targetS = "C:/Users/IBM_ADMIN/Desktop/000017_sql.txt";
        w.read(orgs, targetS, "task");

        System.out.println("DONE, in " + (System.currentTimeMillis() - sd)/1000 + " s");
    }
}

  
After obtaining the consolidated SQL file, import it into the database.
Finally: success!
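The import itself is a one-liner with the mysql client. A sketch; the database name is a placeholder, and the file is the one produced by the Java program above:

```shell
# Replay the recovered statements into the target database
mysql -u root -p my_database < 000017_sql.txt
```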

Um... let me remind everyone of

the importance of backups

the importance of backups

the importance of backups

...
