
Capturing stdout with slurm


We’ve been running our experiments with the SLURM scheduler, which captures stdout and stderr and pipes them to a file. This seems to confuse Sacred, which doesn’t capture stdout in this case.

The easiest solution by far would be to just tell Sacred where to look for stdout. Is there a way to do that?

Thanks! A
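
For context, a typical sbatch script redirects both output streams to files before the Python process even starts; a minimal sketch (the job name, output file names, and experiment script below are illustrative, not from the original report):

#!/bin/bash
#SBATCH --job-name=sacred-run
#SBATCH --output=slurm-%j.out   # SLURM pipes the job's stdout here (%j = job id)
#SBATCH --error=slurm-%j.err    # and its stderr here

python train.py                 # hypothetical Sacred experiment script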

Issue Analytics

  • State: open
  • Created: 6 years ago
  • Reactions: 2
  • Comments: 7 (1 by maintainers)

Top GitHub Comments

2 reactions
Qwlouse commented, Sep 2, 2018

Ohh, I found the problem: sys-based capturing relies on replacing sys.stdout and sys.stderr with custom wrappers. But the StreamHandler still holds a reference to the old sys.stderr and logs to that. So in fact any Sacred experiment that uses capture mode sys loses its logging. That is clearly a problem!
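
A minimal demonstration of the pitfall, using only the standard library (logging binds the stream at the moment the handler is created):

import io
import logging
import sys

logger = logging.getLogger("demo")
logger.addHandler(logging.StreamHandler())  # binds the *current* sys.stderr

captured = io.StringIO()
sys.stderr = captured             # swap in a wrapper, as capture mode 'sys' does
logger.warning("hello")           # still written to the original stderr

print(repr(captured.getvalue()))  # '' -- the handler bypassed the replacement
sys.stderr = sys.__stderr__       # restore the real stderr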

You might be able to work around it with something like this, which should give you most logs (except for the “experiment started” logs):

import sys

@ex.pre_run_hook
def set_logger_stream(_run):
    # re-point the handler at the wrapper that replaced sys.stderr
    _run.root_logger.handlers[0].stream = sys.stderr
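
If the logger carries more than one handler, a slightly more defensive variant of the same hook (a sketch, assuming every StreamHandler should follow the replaced stream) could loop over all of them:

import logging
import sys

@ex.pre_run_hook
def set_logger_streams(_run):
    # `ex` is the Experiment defined elsewhere in the script.
    # Re-point every StreamHandler at whatever currently backs sys.stderr,
    # i.e. the wrapper Sacred installs for capture mode 'sys'.
    for handler in _run.root_logger.handlers:
        if isinstance(handler, logging.StreamHandler):
            handler.stream = sys.stderr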

I have to think about how best to fix this problem in a clean way. Let me know if you have any thoughts on this, especially concerning the general integration of Sacred with logging, since I am not entirely happy with the way it is currently done.

2 reactions
Qwlouse commented, Jul 25, 2017

That is an interesting case I hadn’t thought about. So far this is not possible, no. If you are fine with only capturing stdout from within Python, you could try setting the capture mode to sys. My guess is that this wouldn’t be affected by stdout redirects:

from sacred.settings import SETTINGS
SETTINGS.CAPTURE_MODE = 'sys'

But I’ll mark this as a feature request nonetheless.
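
Putting the suggestion together, a minimal experiment that opts into sys-based capturing might look like this (a sketch; the experiment name is hypothetical, and the setting has to be applied before the run starts):

import sys

from sacred import Experiment
from sacred.settings import SETTINGS

# must be set before the run starts so Sacred picks the sys-based capturer
SETTINGS.CAPTURE_MODE = 'sys'

ex = Experiment('slurm_demo')  # hypothetical experiment name


@ex.automain
def main():
    print('captured via the replaced sys.stdout')
    print('sys.stderr is wrapped as well', file=sys.stderr)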


Top Results From Across the Web

How do I redirect stdout/stderr in my SLURM job scripts? - Ask
I submit a job in a slurm cluster and all output and error messages go into the same file. Is there a way...

SLURM display the stdout and stderr of an unfinished job
I ran my job with sbatch and a bash script that has a few slurm parameters, loads a few modules, cd and then...

Handling job output | Division of Information Technology
One simple method for handling the output of your job is redirecting the program output directly in your bash script (the .slurm file...

Submitting SLURM jobs with STDOUT & STDERR written ...
This article will show you how to refer the SLURM jobs id in the sbatch job script to redirect STDOUT and STDERR to...

Knowledge Base: Bell User Guide: Checking Job Output
SLURM catches output written to standard output and standard error - what would be printed to your screen if you ran your program...
