AttributeError: 'FeedExporter' object has no attribute 'slot'
I have this simple spider. When I call scrapy crawl dataspider, it works fine and prints the item in the output:
import json

from scrapy.spiders import Spider


class dataspider(Spider):
    name = 'dataspider'
    start_urls = ('https://www.google.com/finance/match?matchtype=matchall&ei=UVlPWNmDEYm_U7SqgvAH&q=AAPL',)

    def parse(self, response):
        j = json.loads(response.body.decode('utf-8'))
        yield j['matches'][1]
Output:
{'t': 'AAPL', 'n': 'Apple Inc.', 'e': 'NASDAQ', 'id': '22144'}
However, as soon as I try to save the item to a file using scrapy crawl dataspider -o out.json, I get this error:
AttributeError: 'FeedExporter' object has no attribute 'slot'
The full traceback is:
$ scrapy crawl dataspider -o ./test.json
2017-01-30 14:32:06 [scrapy.utils.log] INFO: Scrapy 1.3.0 started (bot: googlefinance)
2017-01-30 14:32:06 [scrapy.utils.log] INFO: Overridden settings: {'BOT_NAME': 'googlefinance', 'CONCURRENT_REQUESTS': 100, 'CONCURRENT_REQUESTS_PER_DOMAIN': 100, 'DNS_TIMEOUT': 30, 'DOWNLOAD_TIMEOUT': 30, 'FEED_FORMAT': 'json', 'FEED_URI': './test.json', 'NEWSPIDER_MODULE': 'googlefinance.spiders', 'RETRY_HTTP_CODES': [500, 502, 503, 504, 400, 403, 404, 408], 'RETRY_TIMES': 30, 'SPIDER_MODULES': ['googlefinance.spiders'], 'USER_AGENT': 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; FSL 7.0.6.01001)'}
2017-01-30 14:32:06 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.feedexport.FeedExporter',
'scrapy.extensions.logstats.LogStats']
2017-01-30 14:32:06 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
'scrapy.downloadermiddlewares.retry.RetryMiddleware',
'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
'scrapy.downloadermiddlewares.stats.DownloaderStats']
2017-01-30 14:32:06 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
'scrapy.spidermiddlewares.referer.RefererMiddleware',
'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
'scrapy.spidermiddlewares.depth.DepthMiddleware']
2017-01-30 14:32:06 [scrapy.middleware] INFO: Enabled item pipelines:
[]
2017-01-30 14:32:06 [scrapy.core.engine] INFO: Spider opened
2017-01-30 14:32:06 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.open_spider of <scrapy.extensions.feedexport.FeedExporter object at 0x7ff68de97ef0>>
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/twisted/internet/defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "/usr/lib/python3.6/site-packages/pydispatch/robustapply.py", line 55, in robustApply
return receiver(*arguments, **named)
File "/usr/lib/python3.6/site-packages/scrapy/extensions/feedexport.py", line 187, in open_spider
uri = self.urifmt % self._get_uri_params(spider)
File "/usr/lib/python3.6/site-packages/scrapy/extensions/feedexport.py", line 262, in _get_uri_params
params[k] = getattr(spider, k)
File "/usr/lib/python3.6/site-packages/scrapy/spiders/__init__.py", line 36, in logger
logger = logging.getLogger(self.name)
File "/usr/lib/python3.6/logging/__init__.py", line 1813, in getLogger
return Logger.manager.getLogger(name)
File "/usr/lib/python3.6/logging/__init__.py", line 1167, in getLogger
raise TypeError('A logger name must be a string')
TypeError: A logger name must be a string
2017-01-30 14:32:06 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2017-01-30 14:32:06 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2017-01-30 14:32:07 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.google.com/finance/match?matchtype=matchall&ei=UVlPWNmDEYm_U7SqgvAH&q=AAPL> (referer: None)
2017-01-30 14:32:07 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.google.com/finance/match?matchtype=matchall&ei=UVlPWNmDEYm_U7SqgvAH&q=AAPL>
{'t': 'AAPL', 'n': 'Apple Inc.', 'e': 'NASDAQ', 'id': '22144'}
2017-01-30 14:32:07 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x7ff68de97ef0>>
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/twisted/internet/defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "/usr/lib/python3.6/site-packages/pydispatch/robustapply.py", line 55, in robustApply
return receiver(*arguments, **named)
File "/usr/lib/python3.6/site-packages/scrapy/extensions/feedexport.py", line 217, in item_scraped
slot = self.slot
AttributeError: 'FeedExporter' object has no attribute 'slot'
2017-01-30 14:32:07 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.google.com/finance/match?matchtype=matchall&ei=UVlPWNmDEYm_U7SqgvAH&q=AAPL>
{'t': 'AAPL', 'n': 'APPLE INC CEDEAR(REPR 1/10 SHR)', 'e': 'BCBA', 'id': '640373807586235'}
[the same item_scraped traceback, ending in AttributeError: 'FeedExporter' object has no attribute 'slot', is logged after each of the remaining scraped items]
2017-01-30 14:32:07 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.google.com/finance/match?matchtype=matchall&ei=UVlPWNmDEYm_U7SqgvAH&q=AAPL>
{'t': 'AAPL', 'n': 'Apple', 'e': 'SWX', 'id': '268194557752272'}
2017-01-30 14:32:07 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.google.com/finance/match?matchtype=matchall&ei=UVlPWNmDEYm_U7SqgvAH&q=AAPL>
{'t': 'AVSPY', 'n': 'NASDAQ OMX Alpha AAPL vs. SPY Index', 'e': 'INDEXNASDAQ', 'id': '3139928'}
2017-01-30 14:32:07 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.google.com/finance/match?matchtype=matchall&ei=UVlPWNmDEYm_U7SqgvAH&q=AAPL>
{'t': 'AAPL34', 'n': 'APPLE DRN', 'e': 'BVMF', 'id': '486420404817650'}
2017-01-30 14:32:07 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.google.com/finance/match?matchtype=matchall&ei=UVlPWNmDEYm_U7SqgvAH&q=AAPL>
{'t': 'AAPL', 'n': 'APPLE COMPUTER INC', 'e': 'BMV', 'id': '119565461895124'}
2017-01-30 14:32:07 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.google.com/finance/match?matchtype=matchall&ei=UVlPWNmDEYm_U7SqgvAH&q=AAPL>
{'t': 'AAPL-EUR', 'n': 'Apple', 'e': 'SWX', 'id': '706336206708362'}
2017-01-30 14:32:07 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.google.com/finance/match?matchtype=matchall&ei=UVlPWNmDEYm_U7SqgvAH&q=AAPL>
{'t': 'AAPL-USD', 'n': 'Apple', 'e': 'SWX', 'id': '1009743014824088'}
2017-01-30 14:32:07 [scrapy.core.engine] INFO: Closing spider (finished)
2017-01-30 14:32:07 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.close_spider of <scrapy.extensions.feedexport.FeedExporter object at 0x7ff68de97ef0>>
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/twisted/internet/defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "/usr/lib/python3.6/site-packages/pydispatch/robustapply.py", line 55, in robustApply
return receiver(*arguments, **named)
File "/usr/lib/python3.6/site-packages/scrapy/extensions/feedexport.py", line 198, in close_spider
slot = self.slot
AttributeError: 'FeedExporter' object has no attribute 'slot'
2017-01-30 14:32:07 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 309,
'downloader/request_count': 1,
'downloader/request_method_count/GET': 1,
'downloader/response_bytes': 761,
'downloader/response_count': 1,
'downloader/response_status_count/200': 1,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2017, 1, 30, 13, 32, 7, 192220),
'item_scraped_count': 8,
'log_count/DEBUG': 10,
'log_count/ERROR': 10,
'log_count/INFO': 7,
'response_received_count': 1,
'scheduler/dequeued': 1,
'scheduler/dequeued/memory': 1,
'scheduler/enqueued': 1,
'scheduler/enqueued/memory': 1,
'start_time': datetime.datetime(2017, 1, 30, 13, 32, 6, 846350)}
2017-01-30 14:32:07 [scrapy.core.engine] INFO: Spider closed (finished)
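Reading the log as a whole, the AttributeError is a secondary symptom: open_spider raised TypeError while formatting the feed URI, so FeedExporter.slot was never assigned, and every later signal handler that reads self.slot then fails. A minimal sketch of that failure chain (illustrative only, not Scrapy's actual FeedExporter code; the class and method names here are invented for the sketch):

```python
# Illustrative sketch only -- not Scrapy's real FeedExporter.
class FeedExporterSketch:
    def open_spider(self, spider):
        # Scrapy formats the feed URI at this step; in the log above it
        # raised TypeError before self.slot was ever assigned.
        uri = self._format_uri(spider)
        self.slot = object()  # never reached when _format_uri raises

    def _format_uri(self, spider):
        raise TypeError('A logger name must be a string')  # simulated failure

    def item_scraped(self, item, spider):
        slot = self.slot  # AttributeError: the attribute was never set
        return item


exporter = FeedExporterSketch()
try:
    exporter.open_spider(None)
except TypeError:
    pass  # Scrapy's signal dispatcher logs and swallows handler errors

try:
    exporter.item_scraped({}, None)
except AttributeError as e:
    print(e)  # 'FeedExporterSketch' object has no attribute 'slot'
```

This is why one TypeError at spider-open time fans out into an AttributeError per scraped item: each item_scraped call independently dereferences the attribute that open_spider failed to create.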
Any idea what the problem is?
Issue Analytics
- Created: 7 years ago
- Comments: 7 (4 by maintainers)
Top Results From Across the Web
- "'FeedExporter' object has no attribute 'slot'" exception? - Stack Overflow
  It happens when the feed exporter is not able to write the file; it happened to me when I had my previously exported csv file open...
- AttributeError: 'FeedExporter' object has no attribute 'slot' - Zyte
  I have a spider that suddenly throws this exception on about half of my requests: [scrapy.utils.signal] Error caught on signal handler: ...
- how to fix it? - scrapy-users@googlegroups.com
  FeedExporter object at 0x8a7cecc>> ... slot.exporter.export_item(item) ... AttributeError: 'NoneType' object has no attribute 'iterkeys'.
- Why doesn't my spider find body text? - Python Forum
  ERROR:RISJbot.pipelines.checkcontent:No bodytext: ... slot = self.slot AttributeError: 'FeedExporter' object has no attribute 'slot'
- AttributeError: 'FeedExporter' object has no attribute 'slot' (translated from Chinese)
  When using Scrapy you get AttributeError: 'FeedExporter' object has no attribute 'slot' because the file to be written to is currently in use, so it cannot be written!
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I've seen this error when the CSV file pointed to by FEED_URI was already open in another program (Excel).
Thanks for this. @yadalik
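A quick way to rule out the locked-file cause described in the comment above is to check that the feed path is writable before starting the crawl. A hedged sketch (out.json is a hypothetical path; note that a file merely being open in another program usually only blocks the open() on Windows, where programs like Excel take an exclusive lock):

```python
# Hedged sketch: verify the feed output path is writable before crawling.
# 'out.json' is a hypothetical path; on Windows, a file held open by another
# program (e.g. Excel) makes this open() fail, surfacing the problem early.
import sys

feed_path = 'out.json'
try:
    # Append mode creates the file if missing and fails if it is locked.
    with open(feed_path, 'a'):
        pass
except OSError as e:
    sys.exit(f'Feed file not writable: {e}')
print('feed path is writable')
```

On Linux or macOS this check mainly catches permission and path errors rather than locks, but it is still a cheap guard to run before a long crawl.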