Transformers logging setLevel method seems not to work
See original GitHub issue

System Info
- transformers version: 4.21.3
- Platform: Linux-5.15.0-33-generic-x86_64-with-glibc2.35
- Python version: 3.10.4
- Huggingface_hub version: 0.9.1
- PyTorch version (GPU?): 1.11.0+cu113 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: Yes
- Using distributed or parallel set-up in script?: Yes
Who can help?
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, …)
- My own task or dataset (give details below)
Reproduction
I am subclassing the Trainer object because I want to inject some custom features into the _inner_training_loop method. What I noticed is that once I do that, the logger.info printouts, which are printed to the console when using the standard Trainer, are no longer printed. Even if I try to explicitly force the logger level, it does not work. To reproduce the behaviour, run the script below:
from transformers.utils import logging
logger = logging.get_logger(__name__)
logger.setLevel("INFO")
logger.info("Hello World")
Expected behavior
I would expect Hello World to be printed to the console, but it is not.
Why is this the case? How can I set it such that it also prints out INFO level logs?
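The mechanism behind this can be demonstrated with the standard library alone: `transformers.utils.logging.get_logger` is a thin wrapper around `logging.getLogger`, and a logger with no handler anywhere on its path to the root falls back to Python's `logging.lastResort` handler, which only emits WARNING and above, so `setLevel("INFO")` on its own changes nothing visible. A minimal sketch of both the failure and the fix, using stdlib `logging` only (the logger names here are made up for the demo):

```python
import io
import logging

# A fresh logger with no handler attached: setLevel alone is not enough.
# INFO records fall through to logging.lastResort, which drops anything
# below WARNING, so nothing is printed.
bare = logging.getLogger("demo.bare")
bare.setLevel(logging.INFO)
assert not bare.handlers  # no handler -> INFO records are never emitted

# Attaching a StreamHandler (which is what transformers does for its own
# root library logger) makes INFO records visible.
buf = io.StringIO()
fixed = logging.getLogger("demo.fixed")
fixed.setLevel(logging.INFO)
fixed.addHandler(logging.StreamHandler(buf))
fixed.propagate = False  # avoid duplicate output via the root logger

fixed.info("Hello World")
print(buf.getvalue().strip())  # -> Hello World
```

Note that for the library's own messages (e.g. the Trainer's logs), the supported knob is `transformers.utils.logging.set_verbosity_info()`; it acts on the `transformers` logger hierarchy, not on a logger you created under your own module's name.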
Issue Analytics

- State:
- Created: a year ago
- Comments: 5 (2 by maintainers)
Top GitHub Comments
Hey @AndreaSottana, sorry to have missed this.

The logging module for transformers acts on the code within transformers itself (this is how the logging library works), not on user code. However, the line

    logger = logging.get_logger(__name__)

creates a logger that does not depend on transformers but on the module you're currently running. It will, therefore, not be impacted by the methods which affect transformers' logging module.

This is not the cleanest workaround, but you could get what you want by specifying that this logger instance should behave as a transformers module with something like the following:

    logger = logging.get_logger("transformers.custom")

This will trick it into understanding that logger is the logger for a module that lives within transformers.custom. This should print out just fine.
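The reason the `transformers.custom` trick works is standard `logging` propagation: a logger named as a child of the library's namespace inherits the effective level of, and forwards its records to the handler attached to, the library's root logger. A small stdlib simulation of that mechanism (the logger names `translib` / `translib.custom` are stand-ins for `transformers` / `transformers.custom`):

```python
import io
import logging

# Simulate the library's root logger, which has a handler and level
# configured (transformers installs a StreamHandler on its own
# "transformers" root logger).
buf = io.StringIO()
root_lib = logging.getLogger("translib")
root_lib.setLevel(logging.INFO)
root_lib.addHandler(logging.StreamHandler(buf))
root_lib.propagate = False

# The trick: request a logger named under the library's namespace.
# It has no handler and no level of its own, but its records propagate
# up to the parent's handler, and it inherits the parent's INFO level.
child = logging.getLogger("translib.custom")
child.info("Hello World")

print(buf.getvalue().strip())  # -> Hello World
```

This is why `setLevel` on an unrelated `__main__` logger has no visible effect, while a child of the configured namespace prints immediately.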
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.