
System Logging with Redis

When building a system, we often need to record what is currently happening, track how often specific messages occur, and rank messages by how frequently they appear so that the important information stands out.

There are two common methods of logging:

Writing logs to a file. Log lines are appended to the file over time, and a new log file is created after a while. Because every service produces its own logs and rotates them with a different mechanism, there is no common way to easily aggregate all the logs and process them.
Using the syslog service. This service listens on TCP and UDP port 514 on almost all Linux and Unix servers. syslog accepts log messages from other programs, routes them to various log files stored on disk, and takes care of rotating and deleting old logs. Log messages can even be forwarded to other services for further processing.

Because syslog can forward messages, different logs can be stored in multiple files on the same server, which is very helpful for long-term logging. We can use Redis to store time-related logs, functionally replacing syslog for messages that only need to be kept for a short period of time.

1. Latest log

We use a Redis list to store the most recent log messages: the LPUSH command pushes new log messages onto the front of the list, and the LRANGE command pulls messages back out when we want to view them later.

We'll also give each kind of log message its own named queue and grade the logs according to the severity of the problem.
'''
import time
import logging
import unittest
import redis
from datetime import datetime

# Set up a dictionary that maps logging severity levels to strings

SEVERITY = {
    logging.DEBUG: 'debug',
    logging.INFO: 'info',
    logging.WARNING: 'warning',
    logging.ERROR: 'error',
    logging.CRITICAL: 'critical',
}

# Also allow a severity to be passed in by its string name
SEVERITY.update((name, name) for name in SEVERITY.values())

"""
Store the latest log files, name different log message queues, and grade the logs according to the severity of the problem

@param {object}
@param {string} name message queue name
@param {string} message message
@param {string} severity security level
@param {object} pip pipline

"""
def logRecent(conn, name, message, severity=logging.INFO, pip=None):
    # Convert the log's severity level to a simple string
    severity = str(SEVERITY.get(severity, severity)).lower()
    # Create the key that the Redis list will be stored under
    destination = 'recent:%s:%s' % (name, severity)
    # Add the current time to the message to record when it was sent
    message = time.asctime() + ' ' + message
    # Use a pipeline to reduce the number of communication round trips to one
    pipe = pip or conn.pipeline()
    # Add the message to the front of the list
    pipe.lpush(destination, message)
    # Trim the log list so that it only contains the latest 100 messages
    pipe.ltrim(destination, 0, 99)
    pipe.execute()

'''
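As a quick illustration, here is a minimal usage sketch (not part of the original text): it assumes a Redis server running locally and uses database 15 like the tests further down; the queue name 'app' and the message text are placeholders.

'''
import logging
import redis

conn = redis.Redis(db=15)  # assumes a local Redis server; adjust host/port/db as needed

# Write one warning-level message to the hypothetical 'app' queue
logRecent(conn, 'app', 'disk usage above 90%', severity=logging.WARNING)

# Read back the most recent messages; the newest entries are at the front of the list
for entry in conn.lrange('recent:app:warning', 0, 9):
    print(entry)
'''

Because LPUSH is always followed by LTRIM, the list stays capped at 100 entries, so LRANGE never has to scan an unbounded amount of data.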

2. Common logs

For messages that occur frequently, we use a "sorted set": each message is a member of the set, and the number of times the message has appeared is that member's score.

To make sure the common messages we see are up to date, we rotate the record every hour, keeping the common messages recorded during the previous hour when we rotate, so that we are never left without any message history.
'''
"""
Record the log that occurs more frequently, rotate the message every hour, and keep the common message recorded in the previous hour when rotating the log

@param {object}
@param {string} name message queue name
@param {string} message message
@param {string} severity security level
@param {int} timeout execution timeout

"""
def logCommon(conn, name, message, severity=logging.INFO, timeout=5):
    # Convert the log's severity level to a simple string
    severity = str(SEVERITY.get(severity, severity)).lower()
    # The key that common log messages will be stored under
    destination = 'common:%s:%s' % (name, severity)
    # The log is rotated every hour, so we record which hour we are currently in
    start_key = destination + ':start'
    pipe = conn.pipeline()
    end = time.time() + timeout
    while time.time() < end:
        try:
            # Watch the key that records the current hour so that the
            # rotation can be performed safely
            pipe.watch(start_key)
            # The current time
            now = datetime.utcnow().timetuple()
            # Get the start of the current hour
            hour_start = datetime(*now[:4]).isoformat()

            existing = pipe.get(start_key)
            # Start the transaction
            pipe.multi()
            # If this common log record covers the previous hour
            if existing and existing < hour_start:
                # Archive the old common logs
                pipe.rename(destination, destination + ':last')
                pipe.rename(start_key, destination + ':pstart')
                # Update the hour that we are currently in
                pipe.set(start_key, hour_start)
            elif not existing:
                pipe.set(start_key, hour_start)

            # Increment the count for this message
            pipe.zincrby(destination, message)
            # Also add the message to the recent log; logRecent calls execute()
            logRecent(pipe, name, message, severity, pipe)
            return
        except redis.exceptions.WatchError:
            # Another client modified the hour key; retry
            continue

'''
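To see which messages occur most often during the current hour, the sorted set can be read back with ZREVRANGE so that the highest scores come first. A minimal sketch, again assuming a local Redis server on database 15 and that the functions above are defined in the same module; the queue name 'app' and the messages are placeholders.

'''
import redis

conn = redis.Redis(db=15)  # assumes a local Redis server; adjust as needed

# Record a few messages with different frequencies
for _ in range(3):
    logCommon(conn, 'app', 'cache miss')
logCommon(conn, 'app', 'slow query')

# Messages ranked by how often they occurred, most frequent first
for message, count in conn.zrevrange('common:app:info', 0, -1, withscores=True):
    print('%s -> %s' % (message, count))
'''

After a rotation, the previous hour's counts remain available under the 'common:app:info:last' key, so the most recent complete hour can still be inspected.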

3. Test

The test code is as follows:

'''
class TestLog(unittest.TestCase):
    def setUp(self):
        self.conn = redis.Redis(db=15)
        self.conn.flushdb()

    def tearDown(self):
        self.conn.flushdb()
        del self.conn
        print
        print

    def testLogRecent(self):
        import pprint
        conn = self.conn

        print "Let's write a few logs to the recent log"
        for msg in xrange(5):
            logRecent(conn, 'test', 'this is message %s' % msg)

        recent = conn.lrange('recent:test:info', 0, -1)
        print 'The current recent message log has this many messages:', len(recent)
        print 'Those messages include:'
        pprint.pprint(recent[:10])
        self.assertTrue(len(recent) >= 5)

    def testLogCommon(self):
        import pprint
        conn = self.conn

        print "Let's write a few logs to the common log"
        for count in xrange(1, 6):
            for i in xrange(count):
                logCommon(conn, 'test', 'message-%s' % count)

        common = conn.zrevrange('common:test:info', 0, -1, withscores=True)
        print 'The current common message log has this many messages:', len(common)
        print 'Those common messages include:'
        pprint.pprint(common)
        self.assertTrue(len(common) >= 5)

if __name__ == '__main__':
    unittest.main()

'''
