How to configure multiple loggers like standard logging.config.dictConfig
See original GitHub issue

Hi, I read through the README and API reference but did not find out how to get multiple logger instances. Here is how my logging loggers are configured:

In my project, I rely on getLogger("xxx") to process the log instead of getLogger() to distribute it. And I don't want some of the logs to be passed to multiple loggers, so I added propagate: no to each logger.

In loguru's API reference, I only found the filter parameter of logger.add for handling multiple loggers. But my multiple loggers are stored in the same log file: if I use filter="xxx.log", the log record will still be sent to multiple loggers. How should I handle this scenario?
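The original configuration was not quoted above; a representative dictConfig of the kind described, with hypothetical logger and handler names and propagate disabled per logger, might look like:

```python
import logging
import logging.config

# Hypothetical illustration of the setup described above: several named
# loggers, each with its own handler, and propagate disabled so records
# are not also passed up to the root logger.
LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "default": {"format": "%(asctime)s %(name)s %(levelname)s %(message)s"},
    },
    "handlers": {
        "app_handler": {"class": "logging.StreamHandler", "formatter": "default"},
        "db_handler": {"class": "logging.StreamHandler", "formatter": "default"},
    },
    "loggers": {
        "app": {"handlers": ["app_handler"], "level": "INFO", "propagate": False},
        "db": {"handlers": ["db_handler"], "level": "DEBUG", "propagate": False},
    },
}

logging.config.dictConfig(LOGGING_CONFIG)
logging.getLogger("app").info("handled only by app_handler")
```

With this configuration, a record logged through getLogger("app") reaches only app_handler, never the root logger's handlers.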
Issue Analytics
- Created 5 years ago
- Comments:7 (5 by maintainers)
Top GitHub Comments
Hey @Gideon-koutian!
I think you could use .bind() to replicate your current logging configuration. For example, instead of logger = logging.getLogger("xxx"), you could do logger = logger.bind(name="xxx"). As a result, each message logged with this bound logger will contain the name value in the extra record dict, which you can use to filter logs adequately. Depending on how you intend to handle incoming log messages, maybe you can use just one custom sink and filter them in your function directly.
I admit that this looks less convenient than built-in logging, though.
Do you think using Loguru like this would suit your needs?
Hi @D3f0.
The problem you describe is one of the reasons I did not implement any method to load a configuration from a file.
Honestly, I don't know what the best solution would be. The standard logging library partly solves it by specifying a special syntax ("access to external objects"). So, you could state that 'ext://sys.stderr' stands for sys.stderr, and then transform the TOML dict in your Python script. Depending on the degree of dynamism you are looking for, you could simply use pattern matching like {"ext://sys.stderr": sys.stderr} or implement a proper resolver as done by the standard library in cpython/logging/config.py.
But when it comes to parametrizing handlers with functions, it's even more complicated. If you wish to support this too, I guess you need to define a set of pre-written functions in your Python script that you can parametrize in the TOML file by using their identifier.
Basically, the simplest solution would be to map these special strings to the Python objects they stand for before handing the configuration to loguru.
I suppose you have probably already thought of this solution. I don't have any better idea for now, but I'm interested to know which solution you will choose. Alternatively, you could also define the logging configuration dict inside some kind of config.py file.