Scrapy crawler framework: configuring and using log output

Reposted from: https://blog.csdn.net/weixin_41666747/article/details/82716688

1. Set the log level and the log output file name in the configuration file

1. Why use the date as the file name?

Because it makes it convenient to view each day's log information during development, and it also keeps a single log file from accumulating more and more entries: each day's log information is saved to that day's log file.

2. Configure the log level with the LOG_LEVEL option and the log file path with the LOG_FILE option; here I set the level to WARNING. For example:
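A minimal sketch of the relevant lines in settings.py; the date-based file name pattern is an assumption about what the original configuration looked like:

```python
# settings.py -- minimal sketch; the exact file name pattern is an assumption
import datetime

# One log file per day, e.g. "log_2018_9_14.log", so each day's information
# is easy to find and no single file accumulates indefinitely
today = datetime.datetime.now()
LOG_FILE = "log_{}_{}_{}.log".format(today.year, today.month, today.day)

# Only messages at WARNING level or above are written to the log file
LOG_LEVEL = "WARNING"
```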

2. Import the logging module in the program, and use it to write the content that needs to be logged.

Since I configured the level as WARNING, a message logged at WARNING level or higher (for example with logging.warning) is written to my log, while a message below that level is not.

In the following example, both messages are at the WARNING level, so both are written to the log file:
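A sketch of what such calls might look like; the message text is illustrative:

```python
import logging

# Both calls are at WARNING level, so both lines reach the log file
logging.warning("first warning message")
logging.warning("second warning message")
```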

In the following example, one message is at the INFO level and one at the WARNING level; only the WARNING message is written to the log file:
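Again a sketch with illustrative message text:

```python
import logging

# INFO is below the configured WARNING threshold, so it is filtered out;
# only the WARNING line appears in the log file
logging.info("this info message is not written")
logging.warning("this warning message is written")
```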

3. Extension

Scrapy offers five logging levels:

CRITICAL - critical errors

ERROR - general errors

WARNING - warning messages

INFO - general information

DEBUG - debugging information
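Each level corresponds to a call on Python's standard logging module, which Scrapy's logging is built on:

```python
import logging

# One call per level, from most to least severe
logging.critical("critical error")
logging.error("general error")
logging.warning("warning message")
logging.info("general information")
logging.debug("debug information")
```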

Logging settings: logging can be configured in settings.py with the following settings:

LOG_ENABLED (default: True): whether logging is enabled

LOG_ENCODING (default: 'utf-8'): the encoding used for logging

LOG_FILE (default: None): file name to use for the log output, created relative to the current directory

LOG_LEVEL (default: 'DEBUG'): the minimum level to log
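Put together in settings.py, these look as follows; the values shown are just the documented defaults made explicit:

```python
# settings.py -- the four logging settings with their default values spelled out
LOG_ENABLED = True      # turn logging on or off
LOG_ENCODING = "utf-8"  # encoding used for the log output
LOG_FILE = None         # None logs to standard error; a name writes to that file
LOG_LEVEL = "DEBUG"     # minimum severity that gets logged
```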

My experience is limited; please correct any inadequacies.


Origin: https://blog.csdn.net/chenxijie1985/article/details/92615717