Python Crawler Series 1: Scrapy Installation and Use

Installing Scrapy on Linux

  • Install scrapy
# Install Scrapy
pip install scrapy

# If the installation does not go smoothly, you may see an error like:
# error: command 'gcc' failed with exit status 1
# Install the build dependencies:
yum install gcc libffi-devel python-devel openssl-devel
# Then re-run the install, which should succeed.

# Under a Python 3 environment, you may instead see:
# *********************************************************************************
# Could not find function xmlCheckVersion in library libxml2. Is libxml2 installed?
# *********************************************************************************
# error: command 'gcc' failed with exit status 1
# In that case, install the libxslt-devel dependency:
yum install libxslt-devel
# Then install again:
pip3 install scrapy
# If no errors are reported, the installation is complete.
  • Using Scrapy
     
    After installing Scrapy, the scrapy command becomes available. The next step is to create a crawler project from the built-in template:
# Create a working directory
mkdir Spider
cd Spider

# Create a Scrapy project
scrapy startproject mySpider
# The Scrapy template project is now created

scrapy create

To be continued. . . .
