Is it possible to close the spider at spider_opened signal?
Hello,
I'm working on a middleware that loads some resources in its spider_opened handler. If those resources can't be loaded, I need the spider to be closed. I tried raising the CloseSpider exception and also calling crawler.engine.close_spider(...), but neither works.
Is there a way to do that?
Thanks!
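For reference, the pattern described above looks roughly like the sketch below. This is not the code from the issue; `load_resources()` is a hypothetical placeholder for whatever loading fails, and the two close attempts mirror the ones mentioned in the question.

```python
# Minimal sketch of the middleware described above -- not the actual code from
# the issue. load_resources() is a hypothetical placeholder.
from scrapy import signals
from scrapy.exceptions import CloseSpider


def load_resources():
    """Hypothetical helper; stands in for the real resource loading."""
    raise IOError("could not load resources")


class ResourceLoadingMiddleware:
    def __init__(self, crawler):
        self.crawler = crawler
        # Call spider_opened() when the spider_opened signal fires.
        crawler.signals.connect(self.spider_opened, signal=signals.spider_opened)

    @classmethod
    def from_crawler(cls, crawler):
        return cls(crawler)

    def spider_opened(self, spider):
        try:
            self.resources = load_resources()
        except Exception:
            # Attempt 1 from the question: an exception raised here (e.g.
            # CloseSpider) is caught and logged by the signal dispatcher,
            # so the crawl keeps running.
            # raise CloseSpider("resources could not be loaded")

            # Attempt 2 from the question: ask the engine to close the spider.
            self.crawler.engine.close_spider(spider, "resources_not_loaded")
```

The middleware itself would be enabled through SPIDER_MIDDLEWARES (or DOWNLOADER_MIDDLEWARES) in the project settings, as usual.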
Top GitHub Comments
@pauloromeira more likely because of the Scrapy version. Anyway, the situation where you can't stop a thing that is not fully started yet sounds quite common in programming; I don't know what the proper solutions are (see the sketch after these comments).
@wRAR spider:
logs:
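For the timing problem the first comment points at (the spider_opened handler runs before the engine has fully started), one workaround idea that is not confirmed in this thread is to defer the close call until the reactor gets control back, for example with Twisted's reactor.callLater. A minimal sketch, reusing the same hypothetical load_resources() helper:

```python
# Sketch of a deferred close (a workaround idea, not confirmed in this thread).
# Instead of closing synchronously inside the spider_opened handler, schedule
# the close on the Twisted reactor so it runs after startup has finished.
from twisted.internet import reactor
from scrapy import signals


def load_resources():
    """Hypothetical helper; stands in for the real resource loading."""
    raise IOError("could not load resources")


class DeferredCloseMiddleware:
    def __init__(self, crawler):
        self.crawler = crawler
        crawler.signals.connect(self.spider_opened, signal=signals.spider_opened)

    @classmethod
    def from_crawler(cls, crawler):
        return cls(crawler)

    def spider_opened(self, spider):
        try:
            self.resources = load_resources()
        except Exception:
            # Postpone the close until the current startup call stack unwinds.
            reactor.callLater(
                0, self.crawler.engine.close_spider, spider, "resources_not_loaded"
            )
```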