In our environment there were two constraints: we run the official MySQL build and did not want to modify it, and the usual approach of having a script run `show processlist` every second to collect SQL cannot capture statements that finish within the polling interval. So after studying the MySQL protocol we implemented the function in Python by capturing and analyzing packets. The tool has been running on our production systems with essentially no problems, so we are open-sourcing it. The code itself is quite simple; the key is understanding the MySQL protocol.
Principle:
1. Packets are captured with pypcap and unpacked with dpkt
2. Request and response packets are reassembled into sessions according to the MySQL protocol
3. Execution time is computed from the session start and end times (under Python 2 the timestamps returned by pypcap are only accurate to about ten milliseconds, so the current time is taken in code instead; some deviation is possible)
4. The executing user name is obtained in two ways: for a new connection it is parsed from the handshake packets; for an existing long-lived connection it is fetched via PROCESSLIST on the backend database (this requires that the captured traffic flows between the tool's host and the backend MySQL)
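To illustrate steps 1 and 2, here is a minimal sketch of splitting a captured TCP payload into MySQL wire packets and recognizing a COM_QUERY. The function name is hypothetical and the real tool's code may differ; the live pypcap/dpkt capture loop is shown only as a comment because it needs root privileges and real traffic.

```python
import struct

COM_QUERY = 0x03  # MySQL command byte for a plain-text query


def parse_mysql_packet(data):
    """Parse one MySQL wire packet: 3-byte little-endian payload
    length, 1-byte sequence id, then the payload itself."""
    if len(data) < 5:
        return None
    length = struct.unpack('<I', data[:3] + b'\x00')[0]
    seq = data[3]
    payload = data[4:4 + length]
    if payload[0] == COM_QUERY:
        return {'seq': seq, 'command': 'COM_QUERY',
                'sql': payload[1:].decode('utf8', 'replace')}
    return {'seq': seq, 'command': payload[0], 'sql': None}


# A live capture loop (requires root) would look roughly like:
#   import pcap, dpkt
#   for ts, buf in pcap.pcap('eth0'):
#       tcp = dpkt.ethernet.Ethernet(buf).data.data
#       if tcp.dport == 3306 and tcp.data:
#           print(ts, parse_mysql_packet(tcp.data))
```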
Usage:
Run python tcp_audit.py -h for a description of all parameters.
Examples:
1. Listening to traffic destined for port 3306, e.g. on the middleware layer:
python tcp_audit.py -e eth0 -p 3306 -t src -u username -P password
2. Listening on a local port, e.g. our middleware's port 6001:
python tcp_audit.py -e eth0 -p 6001 -t des (no user name or password is given here, because the middleware rewrites the source information; since I am capturing the middleware's application-layer traffic, the connecting user cannot be fetched directly and only the default handshake parsing is used)
3. The printed output looks like this:
2019-08-06 08:52:22,984 INFO log.py : INFO source_host: 10.1.11.59 source_port: 59272 destination_host: 10.1.1.46 destination_port: 3306 user_name: test01 sql: INSERT INTO proxy_heart_beat.tb_heartbeat (p_id, p_ts) VALUES('?', '?') values: None execute_time:0.0001 status:#42000INSERT, UPDATE command denied to user 'test01'@'10.1.11.59' for table 'tb_heartbeat'
Special notes:
1. If the traffic volume is very large, the tool will consume a fair amount of CPU
2. Logs are written to the log directory; by default they are rotated every 10 minutes and one hour of data is retained. Modify log.py yourself if you need different settings
Required packages:
dpkt, psutil, pypcap, pymysql
The logs collected by this tool can be shipped to ELK or another log-analysis platform for storage, or saved directly to files for your own filtering and analysis
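As an example of the "filter it yourself" route, a line in the format shown above can be parsed with a small, hypothetical helper. The field names follow the sample output; adjust the regex if the tool's format changes.

```python
import re

# Field pattern based on the sample output line printed by the tool.
LINE_RE = re.compile(
    r'user_name:\s*(?P<user>\S+)\s+sql:\s*(?P<sql>.*?)\s+values:'
    r'.*?execute_time:(?P<t>[\d.]+)')


def slow_queries(lines, threshold=0.01):
    """Yield (user, sql, seconds) for log entries at or above threshold."""
    for line in lines:
        m = LINE_RE.search(line)
        if m and float(m.group('t')) >= threshold:
            yield m.group('user'), m.group('sql'), float(m.group('t'))
```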
GitHub address: https://github.com/wwwbjqcom/mysql_audit
For any questions or problems, join QQ group 479472450 to discuss