Connecting the Scrapy Framework to a MongoDB Database

Disclaimer: This is an original article by the blogger and may not be reproduced without permission: https://blog.csdn.net/g_optimistic/article/details/90202071

Table of Contents

1. Install the pymongo module

2. Write the database code

(1) Connect to the database

(2) Create a database

(3) Create a collection (table)

(4) Insert data

3. Open Robo 3T and view the data


1. Install the pymongo module

pip install pymongo
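To confirm the module works and a local MongoDB server is reachable, a quick sanity check can be run first (this assumes MongoDB is already running on localhost:27017, which this post does not cover):

import pymongo

client = pymongo.MongoClient('localhost', 27017)
# Prints the MongoDB server version if the connection succeeds
print(client.server_info()['version'])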

2. Write the database code

Linking the Scrapy framework to a database is mainly about saving the crawled data into that database, so the code for this operation is written in the pipelines.py file of the Scrapy project.
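The item handled by the pipeline is defined in the project's items.py. As a point of reference, a minimal hypothetical item could look like this (the class and field names are assumptions, not from the original project):

import scrapy

class YaoItem(scrapy.Item):
    # One example field; a real project declares one Field per piece of crawled data
    title = scrapy.Field()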

(1) Connect to the database

(2) Create a database

(3) Create a collection (table)

(4) Insert data

All four steps are handled in the pipeline below:

import pymongo


class YaoPipeline(object):
    def __init__(self):
        # Connect to the database
        self.client = pymongo.MongoClient('localhost')
        # Create the database
        self.db = self.client['yaoyao']
        # Create the collection (table)
        self.table = self.db['xiaobin']

    def process_item(self, item, spider):
        # Insert the item as a document
        self.table.insert_one(dict(item))
        return item
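Note that Scrapy only calls this pipeline if it is enabled in the project's settings.py. A minimal sketch, assuming the project package is named yao (adjust the dotted path to your own project):

# settings.py
ITEM_PIPELINES = {
    # Lower numbers run earlier; 300 is a common default priority
    'yao.pipelines.YaoPipeline': 300,
}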

Run the Scrapy project:

scrapy crawl s_111
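Here s_111 is the spider's name attribute. For context, a minimal hypothetical spider that yields items to the pipeline above might look like this (the URL, parsing logic, and imported names are placeholders, not from the original post):

import scrapy
from yao.items import YaoItem  # assumed project and item names

class S111Spider(scrapy.Spider):
    name = 's_111'
    start_urls = ['https://example.com']  # placeholder URL

    def parse(self, response):
        item = YaoItem()
        item['title'] = response.css('title::text').get()
        # Each yielded item is passed to YaoPipeline.process_item
        yield item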

3. Open Robo 3T and view the data

Create a connection:

File ---> Connect ---> Create ---> Save

View the databases, collections, and records:
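If Robo 3T is not available, the same check can be done from Python with pymongo (the database and collection names follow the pipeline above):

import pymongo

client = pymongo.MongoClient('localhost')
# Print the first few stored documents from the yaoyao database, xiaobin collection
for doc in client['yaoyao']['xiaobin'].find().limit(5):
    print(doc)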
