Scrapy custom commands

Custom command

  • Create a new directory at the same level as the spiders directory, e.g. commands
  • Inside it, create a crawlall.py file (the filename becomes the name of the custom command)
  • crawlall.py:

        from scrapy.commands import ScrapyCommand


        class Command(ScrapyCommand):

            requires_project = True

            def syntax(self):
                return '[options]'

            def short_desc(self):
                return 'Runs all of the spiders'

            def run(self, args, opts):
                # spider_loader.list() returns the names of every spider in
                # the project (the older `spiders` attribute is deprecated)
                spider_list = self.crawler_process.spider_loader.list()
                for name in spider_list:
                    self.crawler_process.crawl(name, **opts.__dict__)
                # start() blocks until all scheduled crawls have finished
                self.crawler_process.start()
  • In settings.py, add the setting COMMANDS_MODULE = '<project name>.<directory name>'
  • In the project directory execute the command: scrapy crawlall
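For the steps above, assuming the project package is named myproject and the new directory is named commands (both placeholder names), the settings entry would look like:

```python
# settings.py -- "myproject" and "commands" are placeholder names for
# the actual project package and the directory created above
COMMANDS_MODULE = "myproject.commands"
```

With this in place, Scrapy discovers every ScrapyCommand subclass in that module, so `scrapy crawlall` appears alongside the built-in commands.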

Running a single spider:

from scrapy.cmdline import execute

if __name__ == '__main__':
    # equivalent to running `scrapy crawl chouti --nolog` from the shell
    execute(["scrapy", "crawl", "chouti", "--nolog"])
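As an alternative to routing through the command line, a single spider can also be started programmatically with Scrapy's CrawlerProcess. A minimal sketch, assuming the script lives in the project root and chouti is a spider registered in the project:

```python
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

if __name__ == '__main__':
    # load the project's settings.py so pipelines and middlewares apply
    process = CrawlerProcess(get_project_settings())
    process.crawl("chouti")  # spider name, as defined in the project
    process.start()          # blocks until the crawl finishes
```

This is the same mechanism the crawlall command uses internally, just limited to one spider.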


Origin www.cnblogs.com/ganxiang/p/11029003.html