Scrapy: setting different pipelines for each spider

import scrapy


class ExceptionspiderSpider(scrapy.Spider):
    name = 'exceptionSpider'
    # allowed_domains = ['baidu.com']
    start_urls = ['http://baidu.com/']

    # Per-spider settings: this ITEM_PIPELINES dict applies only to this
    # spider and overrides the project-wide ITEM_PIPELINES in settings.py.
    # The values (300, 400) are priorities; lower numbers run first.
    custom_settings = {
        'ITEM_PIPELINES': {
            'TestExceptionSpider.pipelines.TestexceptionspiderPipeline': 300,
            'TestExceptionSpider.exceptionPipeline.ExceptionPipeline': 400,
        }
    }

    def start_requests(self):
        for url in self.start_urls:
            yield scrapy.Request(url=url, callback=self.parse)

    def parse(self, response):
        pass
 

The key is to declare ITEM_PIPELINES inside the spider's custom_settings attribute. These per-spider settings take precedence over the project-wide settings.py, so each spider can enable its own pipeline chain.
http://www.waitingfy.com/archives/3833
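For reference, here is a minimal sketch of what the two pipeline classes named in ITEM_PIPELINES above might look like. The class bodies are assumptions for illustration (the real implementations live in the TestExceptionSpider project); a Scrapy pipeline only needs a process_item(item, spider) method that returns the item.

```python
class TestexceptionspiderPipeline:
    """Sketch of the priority-300 pipeline: runs first."""

    def process_item(self, item, spider):
        # Hypothetical normal processing, e.g. mark the item as handled.
        item['processed'] = True
        return item


class ExceptionPipeline:
    """Sketch of the priority-400 pipeline: runs after the one above."""

    def process_item(self, item, spider):
        # Hypothetical error handling: log items that carry an 'error' flag,
        # then pass everything through unchanged.
        if item.get('error'):
            spider.logger.warning('item flagged with error: %r', item)
        return item
```

Because each pipeline returns the item, Scrapy passes it on to the next pipeline in priority order (300 before 400).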


Reposted from blog.csdn.net/fox64194167/article/details/80500289