Logging level not working
See original GitHub issue.

Using the code in the docs:
import logging

logging.basicConfig(
    filename='/home/jason/scrapyd/logs/scrapy.log',
    format='%(levelname)s: %(message)s',
    level=logging.ERROR
)
But in the file scrapy.log, I can still see INFO, DEBUG, etc.:

ERROR: this is an error
INFO: please
ERROR: this is an error
DEBUG: Scraped from <200 http://www.xxxx.com>
I did not specify any log level in my settings.py. Which part could be wrong? Thanks
Issue Analytics
- Created: 7 years ago
- Reactions: 4
- Comments: 12 (6 by maintainers)
Top GitHub Comments
@Jack-Kingdom, how are you running your spider? A standalone script, or scrapy crawl spidername?

With scrapy crawl, you shouldn't need to change the logging config if the LOG_LEVEL, LOG_FORMAT, LOG_FILE… settings work for you.

If you're using a standalone script, are you using CrawlerProcess or CrawlerRunner? CrawlerProcess calls configure_logging at init time. With CrawlerRunner (which is the only way – I think – to properly configure your own logging), I'm able to set the log level with this:
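The exact snippet from this comment did not survive extraction; below is a minimal sketch of the same idea, assuming a hypothetical MySpider class and module path: a standalone script using CrawlerRunner in which the level passed to logging.basicConfig() is the one that takes effect.

import logging

from twisted.internet import reactor
from scrapy.crawler import CrawlerRunner

from myproject.spiders.myspider import MySpider  # hypothetical spider import

# Configure the root logger ourselves; configure_logging() is deliberately
# not called here (see the discussion below).
logging.basicConfig(
    filename='scrapy.log',
    format='%(levelname)s: %(message)s',
    level=logging.ERROR,
)

runner = CrawlerRunner()
d = runner.crawl(MySpider)
d.addBoth(lambda _: reactor.stop())
reactor.run()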
I believe the example in the docs is wrong about calling configure_logging followed by logging.basicConfig. I had to comment out configure_logging() to make it use the log level I set in basicConfig.

I am by no means a logging expert, but I recently read up on logging and believe @redapple is correct.
What configure_logging() does is the following (where DEFAULT_LOGGING is Scrapy's default logging configuration):
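The source listing quoted here was lost in extraction; the sketch below is a paraphrase of roughly what scrapy.utils.log.configure_logging did in Scrapy versions from around the time of this issue, not the verbatim code, so details may differ in your version.

import logging
from logging.config import dictConfig

DEFAULT_LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'loggers': {
        'scrapy': {'level': 'DEBUG'},
        'twisted': {'level': 'ERROR'},
    },
}

def configure_logging(settings=None, install_root_handler=True):
    # Roughly: capture warnings into the logging system, apply DEFAULT_LOGGING
    # (which sets the 'scrapy' logger to DEBUG and 'twisted' to ERROR), and,
    # unless install_root_handler is False, install a root handler built from
    # the LOG_FILE / LOG_FORMAT / LOG_LEVEL settings.
    logging.captureWarnings(True)
    dictConfig(DEFAULT_LOGGING)
    if install_root_handler:
        ...  # install a root handler configured from the LOG_* settings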
basicConfig() automatically adds a root logger. So even if you have set configure_logging(install_root_handler=False), the root logger will be enabled again when you do the basicConfig(). basicConfig() adds a handler to the root logger, but the handler's default level is NOTSET, which means it processes all logs (more precisely, LogRecords).

All loggers (such as scrapy.core.scraper, or even just scrapy) are child loggers of root. What this means is that when Scrapy logs something, it will send it up to the root logger, since the default level of the child logger 'scrapy' is set to DEBUG. The root logger's handler then has the level NOTSET, so it will process all records and emit them in the logs. You can read more about how this works in this flow: https://docs.python.org/3/howto/logging.html#logging-flow
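The behaviour described above can be reproduced without Scrapy; here is a small self-contained illustration (the logger names are chosen to mirror Scrapy's and are not from the thread):

import logging

# basicConfig() sets the root logger to ERROR and attaches a handler whose
# own level is left at NOTSET.
logging.basicConfig(format='%(levelname)s: %(message)s', level=logging.ERROR)

# Mimic what Scrapy's DEFAULT_LOGGING does: give a child logger an explicit
# DEBUG level.
child = logging.getLogger('scrapy.core.scraper')
child.setLevel(logging.DEBUG)

# Passes the child's DEBUG check, propagates up to root, and is emitted by
# the root handler (level NOTSET) despite level=logging.ERROR above.
child.debug('Scraped from <200 http://www.example.com>')

# A logger with no explicit level inherits ERROR from root, so this record
# is filtered out at the logger itself and never reaches the handler.
logging.getLogger('someotherlib').debug('this one is not emitted')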
Using both thus does not make sense. As such, one should preferably call either configure_logging or basicConfig().

The answer is thus that the documentation for Scrapy needs to be improved here: https://doc.scrapy.org/en/latest/topics/logging.html#module-scrapy.utils.log
Note that fixing this would also close #2352 and #3146.
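For reference, the settings-based route mentioned in the first comment, running via scrapy crawl (or CrawlerProcess) and letting configure_logging build the root handler from the LOG_* settings, would look roughly like this; the path and values are placeholders:

# settings.py (placeholder values); no logging.basicConfig() call is needed,
# since Scrapy configures the root handler from these settings itself.
LOG_LEVEL = 'ERROR'
LOG_FILE = '/home/jason/scrapyd/logs/scrapy.log'
LOG_FORMAT = '%(levelname)s: %(message)s'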