
Error downloading images with Scrapy

[Closed question] Closed on 2016-06-22 11:04

Downloading images with Scrapy's ImagesPipeline fails with the following error:

Unhandled error in Deferred:
2016-06-22 09:51:29 [twisted] CRITICAL: Unhandled error in Deferred:

Traceback (most recent call last):
  File "C:\Python27\lib\site-packages\scrapy\commands\crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 163, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 167, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "C:\Python27\lib\site-packages\twisted\internet\defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "C:\Python27\lib\site-packages\twisted\internet\defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 90, in crawl
    six.reraise(*exc_info)
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 72, in crawl
    self.engine = self._create_engine()
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 97, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "C:\Python27\lib\site-packages\scrapy\core\engine.py", line 69, in __init__
    self.scraper = Scraper(crawler)
  File "C:\Python27\lib\site-packages\scrapy\core\scraper.py", line 71, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "C:\Python27\lib\site-packages\scrapy\middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "C:\Python27\lib\site-packages\scrapy\middleware.py", line 36, in from_settings
    mw = mwcls.from_crawler(crawler)
  File "C:\Python27\lib\site-packages\scrapy\pipelines\media.py", line 33, in from_crawler
    pipe = cls.from_settings(crawler.settings)
  File "C:\Python27\lib\site-packages\scrapy\pipelines\images.py", line 61, in from_settings
    return cls(store_uri, settings=settings)
  File "C:\Python27\lib\site-packages\scrapy\pipelines\images.py", line 42, in __init__
    super(ImagesPipeline, self).__init__(store_uri, settings=settings, download_func=download_func)
  File "C:\Python27\lib\site-packages\scrapy\pipelines\files.py", line 230, in __init__
    self.store = self._get_store(store_uri)
  File "C:\Python27\lib\site-packages\scrapy\pipelines\files.py", line 252, in _get_store
    store_cls = self.STORE_SCHEMES[scheme]
exceptions.KeyError: 'e'
2016-06-22 09:51:29 [twisted] CRITICAL:
Traceback (most recent call last):
  ... (the same traceback is logged a second time) ...
KeyError: 'e'

Everything works fine when ImagesPipeline is not enabled.
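
The settings that trigger this are not shown in the question, but the KeyError: 'e' raised at self.STORE_SCHEMES[scheme] points at the IMAGES_STORE value: the pipeline picks its storage backend from the scheme of that string, and a Windows drive path such as E:\... can be parsed as a URI whose scheme is 'e'. A minimal check of that parsing behaviour, using purely hypothetical IMAGES_STORE values (Python 2.7, matching the traceback):

# Where the 'e' in KeyError: 'e' can come from.  The values below are
# guesses for illustration; the question does not show the real settings.py.
from __future__ import print_function
from urlparse import urlparse  # Python 2.7 stdlib, as in the traceback

for images_store in ('E:images', 'E:\\images\\full'):
    # urlparse() treats the drive letter as a URI scheme and lowercases it,
    # so the pipeline ends up looking up STORE_SCHEMES['e'], which has no
    # such key.
    print('%r -> scheme %r' % (images_store, urlparse(images_store).scheme))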

Dranched | Beginner Level 1 | Reputation (园豆): 23
Asked: 2016-06-22 09:57
All answers (1)

It was a problem with the storage path: the IMAGES_STORE value was not recognized as a valid local path, so Scrapy parsed it as a URI with scheme 'e', which is what raised the KeyError: 'e'.
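
For anyone landing here with the same traceback: the usual fix is to make IMAGES_STORE an unambiguous absolute local path (or a URI with a scheme the pipeline actually supports). A minimal sketch of a corrected settings.py, assuming the images should go to a folder on the E: drive (the real directory is not given in this thread):

# settings.py (sketch; the directory name is an assumption, not the asker's real path)
ITEM_PIPELINES = {
    'scrapy.pipelines.images.ImagesPipeline': 1,
}

# Write the drive path as a raw string (or with doubled backslashes, or with
# forward slashes) so backslash escapes cannot mangle it and it stays an
# absolute local path instead of being parsed as a URI with scheme 'e':
IMAGES_STORE = r'E:\scrapy_images'
# Equivalent alternatives:
# IMAGES_STORE = 'E:\\scrapy_images'
# IMAGES_STORE = 'E:/scrapy_images'

With a path like this, ImagesPipeline stores the downloaded files under IMAGES_STORE\full\ with SHA1-based filenames by default.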

Dranched | Reputation (园豆): 23 (Beginner Level 1) | 2016-06-22 11:04