
logging in pipelines breaks scrapy deploy success message

See original GitHub issue

Not sure if this should be reported here or on scrapyd’s issue tracker.

Using Scrapy’s logging facility in a pipeline breaks the spider count on deploy; it returns 0 every time.

To reproduce the error, place the snippet below in your pipelines module and try to deploy to a scrapyd target:

from scrapy import log

class MyPipeline(object):
    def __init__(self):
        # starting the log as a side effect of loading the pipelines
        # module is what breaks the deploy spider count
        log.start()

...
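For context: scrapyd appears to count a project’s spiders at deploy time by running Scrapy’s spider-listing command and parsing its output, so a log started while importing the pipelines module gets mixed into that output and the count comes back as 0. The scrapy.log module shown above was removed in Scrapy 1.0 in favor of Python’s standard logging module, which needs no explicit start call. A minimal sketch of the modern equivalent (the class and message are illustrative, not from the issue):

import logging

logger = logging.getLogger(__name__)

class MyPipeline(object):
    def process_item(self, item, spider):
        # Scrapy configures the logging handlers itself, so there is
        # nothing to "start" inside the pipeline
        logger.info('Processing an item from spider %s', spider.name)
        return item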

Issue Analytics

  • State: closed
  • Created: 10 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

7 reactions
stav commented, May 10, 2013

Just use the spider’s built-in logging:

class MyPipeline(object):
    def process_item(self, item, spider):
        spider.log('Well, here is an Item: %s.' % item)
        return item  # pipelines must return the item (or raise DropItem)
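For what it’s worth, Spider.log was later deprecated in Scrapy 1.0 in favor of the spider.logger attribute, a standard library logger named after the spider; the equivalent call today would look roughly like this:

class MyPipeline(object):
    def process_item(self, item, spider):
        # spider.logger is a stdlib logger named after the spider
        spider.logger.info('Well, here is an Item: %s.', item)
        return item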
1 reaction
JamesTheHacker commented, Aug 1, 2017

@pablohoffman I came here because I had the same question as the OP. To answer the question about why do it in a pipeline: in my case, it’s because I’m storing items in a MySQL database from within the pipeline, and I need to log any exceptions raised when inserting into the database.

In my case I needed to log an ERROR. This is how I did it:

import logging

# inside process_item, after catching a database exception as `e`:
spider.log('Error writing to database: {0}'.format(e), logging.ERROR)
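To put that line in context, here is a rough sketch of the kind of pipeline being described, using a DB-API driver such as MySQLdb; the connection settings, table, and column names are illustrative assumptions, not details from the issue:

import logging

import MySQLdb  # assumption: any DB-API 2.0 driver works the same way

class MySQLStorePipeline(object):  # hypothetical name, for illustration
    def open_spider(self, spider):
        # connection parameters are placeholders
        self.conn = MySQLdb.connect(host='localhost', user='scrapy',
                                    passwd='secret', db='items')

    def close_spider(self, spider):
        self.conn.close()

    def process_item(self, item, spider):
        cursor = self.conn.cursor()
        try:
            cursor.execute('INSERT INTO items (title) VALUES (%s)',
                           (item.get('title'),))
            self.conn.commit()
        except MySQLdb.Error as e:
            self.conn.rollback()
            # log at ERROR level through the spider, as in the comment above
            spider.log('Error writing to database: {0}'.format(e),
                       logging.ERROR)
        finally:
            cursor.close()
        return item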
Read more comments on GitHub >

Top Results From Across the Web

Logging — Scrapy 2.7.1 documentation
Python's builtin logging defines 5 different levels to indicate the severity of a given log message. Here are the standard ones, listed in...
Read more >
Scrapy Cluster Documentation
This Scrapy project uses Redis and Kafka to create a distributed on demand scraping cluster. The goal is to distribute seed URLs among...
Read more >
How to Monitor Your Scrapy Spiders! | ScrapeOps
From day to day, your scrapers can break or their performance degrade for ... Scrapy Logs & Stats; ScrapeOps Extension; Spidermon Extension ...
Read more >
Scrapy Documentation - Read the Docs
The next steps for you are to install Scrapy, follow through the tutorial ... Wrapper that sends a log message through the Spider's...
Read more >
An Introduction | Python Scrapy Tutorial - Great Learning
parse(response) – Callback method used to process the response and return the scraped data. log(message, level, component) – Sends the log through the “logger”.
Read more >
