Python 3.7: error when running scrapy crawl MySpider

Copyright notice: please credit the source when reposting: https://blog.csdn.net/easy_purple/article/details/82805345

Solution:

Python 3.7 turned async into a reserved keyword, so Twisted 18.7.0's manhole.py, which still uses async as a parameter name, can no longer be parsed. Open python\Lib\site-packages\twisted\conch\manhole.py under your Python installation directory and replace every occurrence of async with async1 (or any other name that is not a keyword), and the error goes away.
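For reference, the edit looks roughly like this around line 154 of manhole.py (a sketch, with the function body abbreviated; only the parameter name changes, and it must be renamed everywhere it is used in the file):

# twisted/conch/manhole.py, Twisted 18.7.0 -- before (SyntaxError on Python 3.7):
def write(self, data, async=False):
    ...

# after renaming the parameter consistently throughout the file:
def write(self, data, async1=False):
    ...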

Error output:

>scrapy crawl MySpider
2018-09-21 18:03:27 [scrapy.utils.log] INFO: Scrapy 1.5.1 started (bot: tutorial)
2018-09-21 18:03:27 [scrapy.utils.log] INFO: Versions: lxml 4.2.5.0, libxml2 2.9.5, cssselect 1.0.3, parsel 1.5.0, w3lib 1.19.0, Twisted 18.7.0, Python 3.7.0 (v3.7.0:1bf9cc5093, Jun 27 2018, 04:59:51) [MSC v.1914 64 bit (AMD64)], pyOpenSSL 18.0.0 (OpenSSL 1.1.0i  14 Aug 2018), cryptography 2.3.1, Platform Windows-10-10.0.17134-SP0
2018-09-21 18:03:27 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'tutorial', 'NEWSPIDER_MODULE': 'tutorial.spiders', 'ROBOTSTXT_OBEY': True, 'SPIDER_MODULES': ['tutorial.spiders']}
Traceback (most recent call last):
  File "d:\app\python\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "d:\app\python\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "D:\app\python\Scripts\scrapy.exe\__main__.py", line 9, in <module>
  File "d:\app\python\lib\site-packages\scrapy\cmdline.py", line 150, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "d:\app\python\lib\site-packages\scrapy\cmdline.py", line 90, in _run_print_help
    func(*a, **kw)
  File "d:\app\python\lib\site-packages\scrapy\cmdline.py", line 157, in _run_command
    cmd.run(args, opts)
  File "d:\app\python\lib\site-packages\scrapy\commands\crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "d:\app\python\lib\site-packages\scrapy\crawler.py", line 170, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "d:\app\python\lib\site-packages\scrapy\crawler.py", line 198, in create_crawler
    return self._create_crawler(crawler_or_spidercls)
  File "d:\app\python\lib\site-packages\scrapy\crawler.py", line 203, in _create_crawler
    return Crawler(spidercls, self.settings)
  File "d:\app\python\lib\site-packages\scrapy\crawler.py", line 55, in __init__
    self.extensions = ExtensionManager.from_crawler(self)
  File "d:\app\python\lib\site-packages\scrapy\middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "d:\app\python\lib\site-packages\scrapy\middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "d:\app\python\lib\site-packages\scrapy\utils\misc.py", line 44, in load_object
    mod = import_module(module)
  File "d:\app\python\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "d:\app\python\lib\site-packages\scrapy\extensions\telnet.py", line 12, in <module>
    from twisted.conch import manhole, telnet
  File "d:\app\python\lib\site-packages\twisted\conch\manhole.py", line 154
    def write(self, data, async=False):
                              ^
SyntaxError: invalid syntax
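
A quick way to confirm the root cause (not from the original post; just a sanity check using the standard keyword module in a Python 3.7 interpreter):

>>> import keyword
>>> keyword.iskeyword("async")  # False on Python 3.6, True on 3.7+, which is why the old Twisted code stops parsing
True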
