scrapy LOG_LEVEL setting in Spider.custom_settings does not work
I set the LOG_LEVEL setting to INFO in the Spider class via the custom_settings attribute, but I still see DEBUG messages in the console. When I set it in the settings.py file or via the command-line option --loglevel, it works. I thought any setting could be set via the custom_settings attribute. Is that a bug? (Scrapy 1.0.3 and Python 2.7.10)
import scrapy

class TestSpider(scrapy.Spider):
    name = "Test"
    ...
    custom_settings = {
        'LOG_LEVEL': 'INFO',
    }
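For comparison, a minimal sketch of the two approaches the reporter says do work; Test is the spider name from the class above:

# settings.py (project-wide): this is honored
LOG_LEVEL = 'INFO'

# ...as is the command-line option:
#   scrapy crawl Test --loglevel=INFO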
Issue Analytics
- State:
- Created: 8 years ago
- Reactions: 2
- Comments: 13 (7 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
+1 to fix it; it looks like a bug to me.
About settings: I think maybe you set them up but don't know how to import or activate them, and that's why your logging doesn't work. So here are some solutions that might help.
You need to make sure the settings path is correct in root/scrapy.cfg. Then you can set your logging in settings.py; I suggest setting at least LOG_FILE and LOG_LEVEL. Then you can run:
scrapy crawl crawler_name
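A minimal sketch of those two files, following the comment's advice; the project name myproject and the log file path are placeholders, not from the original:

# root/scrapy.cfg
# tells Scrapy which settings module to load ("myproject" is a placeholder)
[settings]
default = myproject.settings

# myproject/settings.py
# at least these two, as suggested above (the file path is only an example)
LOG_FILE = 'scrapy.log'
LOG_LEVEL = 'INFO'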
If you want to run the spider from a script, you will need from scrapy.utils.project import get_project_settings and from scrapy.utils.log import configure_logging. Then put configure_logging(get_project_settings()) where you need to activate logging.
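A minimal sketch of that script approach, following the Scrapy docs pattern with CrawlerRunner (which, unlike CrawlerProcess, does not configure logging by itself); the import path for TestSpider is an assumption:

from twisted.internet import reactor
from scrapy.crawler import CrawlerRunner
from scrapy.utils.log import configure_logging
from scrapy.utils.project import get_project_settings

# the spider's import path is a placeholder for your project layout
from myproject.spiders.test import TestSpider

settings = get_project_settings()   # loads the module named in scrapy.cfg
configure_logging(settings)         # activate logging using LOG_FILE / LOG_LEVEL

runner = CrawlerRunner(settings)
deferred = runner.crawl(TestSpider)
deferred.addBoth(lambda _: reactor.stop())  # stop the reactor when the crawl ends
reactor.run()                               # blocks until reactor.stop()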