Logging in pipelines breaks Scrapy deploy success message
See original GitHub issue. Not sure if this should be reported here or on scrapyd’s issue tracker.
Using the logging facility in Scrapy breaks the spider count on deploy; it returns 0 every time.
To reproduce the error, place the snippet below in your pipelines module and try to deploy to a scrapyd target:
from scrapy import log

class MyPipeline(object):
    def __init__(self):
        log.start()
        ...
Issue Analytics
- State:
- Created 10 years ago
- Comments: 5 (2 by maintainers)
Top Results From Across the Web

Logging — Scrapy 2.7.1 documentation
Python's builtin logging defines 5 different levels to indicate the severity of a given log message. Here are the standard ones, listed in...

Scrapy Cluster Documentation
This Scrapy project uses Redis and Kafka to create a distributed on demand scraping cluster. The goal is to distribute seed URLs among...

How to Monitor Your Scrapy Spiders! | ScrapeOps
From day to day, your scrapers can break or their performance degrade for ... Scrapy Logs & Stats; ScrapeOps Extension; Spidermon Extension ...

Scrapy Documentation - Read the Docs
The next steps for you are to install Scrapy, follow through the tutorial ... Wrapper that sends a log message through the Spider's...

An Introduction | Python Scrapy Tutorial - Great Learning
parse(response) – Callback method is used to get the response and return the scraped data. log(message, level, component) – Sends the log through the “logger”.
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Just use the spider’s built-in logging:
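The code sample that accompanied this comment did not survive the page scrape. A minimal sketch of what the advice amounts to, with illustrative names: Scrapy passes the running spider into `process_item`, and every spider exposes a `logger` bound to its name, so a pipeline can log through it without ever calling `log.start()`.

```python
class MyPipeline:
    def process_item(self, item, spider):
        # Each Scrapy spider exposes a `logger` (a stdlib logging.Logger
        # named after the spider), so pipelines can log through it
        # instead of starting the legacy scrapy.log facility themselves.
        spider.logger.info("Processing item: %r", item)
        return item
```

Because nothing is started at import time, deploying such a pipeline does not interfere with scrapyd's spider listing.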
@pablohoffman I came here because I had the same question as the OP. To answer the question about why do it in a pipeline: in my case it’s because I’m storing items in a MySQL database from within the pipeline, and I need to log any exceptions raised when inserting into the database.
In my case I needed to log an ERROR. This is how I did it:
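The commenter's actual code was lost in the scrape. Below is a hedged reconstruction of the pattern described: logging at ERROR level when a MySQL insert fails inside a pipeline. The `MySQLPipeline` class name and `_insert` helper are illustrative stand-ins, not taken from the original comment.

```python
import logging

logger = logging.getLogger(__name__)

class MySQLPipeline:
    """Illustrative pipeline that stores items and logs insert failures."""

    def process_item(self, item, spider):
        try:
            self._insert(item)
        except Exception:
            # logger.error emits at ERROR level; exc_info=True attaches
            # the traceback of the failed INSERT to the log record.
            logger.error("Failed to insert item %r", item, exc_info=True)
        return item

    def _insert(self, item):
        # Stand-in for the real MySQL INSERT (e.g. via pymysql/MySQLdb).
        raise RuntimeError("database unavailable")
```

Catching the exception and returning the item keeps the pipeline from silently dropping items while still recording the database failure.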