JupyterLab freezes when the output from a shell command exceeds a certain size
Describe the issue
When the output from a shell command exceeds a certain size, JupyterLab and classic notebooks freeze. This can be triggered by simply writing a few hundred kilobytes to stdout: !head -c 118000 </dev/urandom. Printing the same amount of data to stdout from Python does not cause any issues; in fact, Python allows much larger outputs.
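For comparison, here is a minimal Python cell that emits roughly the same volume of data; the repeated-character payload is an arbitrary stand-in for the random bytes above:

```python
import sys

# Write ~118 KB to stdout from Python; unlike the shell command above,
# this renders in the cell without freezing the UI.
sys.stdout.write("x" * 118_000)
```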
I initially thought this only affected containerized/resource-restricted deployments of Elyra, since I could not reproduce it locally on my workstation; however, I can now reproduce it locally after a flat install as well.
The problem is also intermittent: in some cases the first execution of the command in a cell runs through without issue, while subsequent executions of the same cell stall.
To Reproduce
Steps to reproduce the behavior:
- Open Kubeflow’s notebook launcher
- Request at least 2 CPUs (or more if your deployment allows it)
- Launch an instance of elyra/kf-notebook:latest
- Open a new Jupyter notebook
- Run the following in a cell (see the sketch after these steps for narrowing down the threshold size):
!head -c 118000 </dev/urandom
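To help pin down where the stall starts, a hypothetical bisection over payload sizes can be run via IPython's system-call hook; the size list below is an assumption, and each size should ideally go in a fresh cell since the stall is intermittent:

```python
# Probe increasing payload sizes to look for the freeze threshold.
# get_ipython() is available inside any IPython/Jupyter kernel session.
for size in (32_000, 64_000, 96_000, 118_000):
    print(f"--- {size} bytes ---")
    get_ipython().system(f"head -c {size} </dev/urandom")
```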
Expected behavior
The output from the shell command is displayed in the cell.
Deployment information
Describe what you’ve deployed and how:
- Elyra version: 3.9.1
- Installation source: kf-notebook image
- Deployment type: Kubeflow
- Operating system: Linux
I’ve recreated this using the plain JupyterLab deployment configuration that ships with Kubeflow notebook server 1.5. Since none of the Elyra extensions are installed, this confirms that the problem is an upstream issue.
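Until an upstream fix lands, one possible workaround, given that Python-side printing of the same volume renders fine, is to capture the command's output with subprocess instead of the ! magic. This is an untested sketch, and the hex conversion is only there to keep the random bytes printable:

```python
import subprocess

# Capture the command's output in Python and print it from the kernel,
# bypassing the shell-magic streaming path that appears to stall.
result = subprocess.run(
    ["head", "-c", "118000", "/dev/urandom"],
    capture_output=True,
)
print(result.stdout.hex())  # hex keeps the random bytes printable
```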
@daschnerm, please subscribe to the issue for updates.