
Broken Pipe Issue

See original GitHub issue

Issue Description

Here’s what I need help with:

I had been banging my head against a wall trying to get detectron2 to work, to no avail. After I stumbled onto this project, I was finally able to build and install detectron2 properly. I tried replicating this notebook locally, and it works as expected, even predicting balloons properly.

But when I move the same code into a standalone Python file and run it from either PyCharm or the command line, I hit an error I have not found mentioned anywhere, either in the original Detectron2 repo or via a simple Google search.

This gist has the code I pulled into a single file.

Here’s the error log:

Config 'F:\dd\detectron2\configs\COCO-InstanceSegmentation\mask_rcnn_R_50_FPN_3x.yaml' has no VERSION. Assuming it to be compatible with latest v2.
Config 'F:\dd\detectron2\configs\COCO-InstanceSegmentation\mask_rcnn_R_50_FPN_3x.yaml' has no VERSION. Assuming it to be compatible with latest v2.
(The log interleaves two tracebacks, one from each process; they are untangled below. First, the spawned child process, which re-runs the script and fails while bootstrapping:)

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 105, in spawn_main
    exitcode = _main(fd)
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 114, in _main
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 277, in _fixup_main_from_path
  File "E:\Anaconda\envs\dd\lib\", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "E:\Anaconda\envs\dd\lib\", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "E:\Anaconda\envs\dd\lib\", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\Gautam\PycharmProjects\detectron2_sample\sample", line 79, in <module>
    trainer = DefaultTrainer(cfg)
  File "f:\dd\conan_det\detectron2\detectron2\engine\", line 249, in __init__
    super().__init__(model, data_loader, optimizer)
  File "f:\dd\conan_det\detectron2\detectron2\engine\", line 194, in __init__
    self._data_loader_iter = iter(data_loader)
  File "E:\Anaconda\envs\dd\lib\site-packages\torch\utils\data\", line 278, in __iter__
    return _MultiProcessingDataLoaderIter(self)
  File "E:\Anaconda\envs\dd\lib\site-packages\torch\utils\data\", line 682, in __init__
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 112, in start
    self._popen = self._Popen(self)
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 322, in _Popen
    return Popen(process_obj)
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 46, in __init__
    prep_data = spawn.get_preparation_data(process_obj._name)
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 143, in get_preparation_data
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 136, in _check_not_importing_main
    is not going to be frozen to produce an executable.''')
RuntimeError:
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.

(Then the main process, whose write to the now-dead child fails while pickling the worker arguments:)

Traceback (most recent call last):
  File "sample", line 79, in <module>
    trainer = DefaultTrainer(cfg)
  File "f:\dd\conan_det\detectron2\detectron2\engine\", line 249, in __init__
    super().__init__(model, data_loader, optimizer)
  File "f:\dd\conan_det\detectron2\detectron2\engine\", line 194, in __init__
    self._data_loader_iter = iter(data_loader)
  File "E:\Anaconda\envs\dd\lib\site-packages\torch\utils\data\", line 278, in __iter__
    return _MultiProcessingDataLoaderIter(self)
  File "E:\Anaconda\envs\dd\lib\site-packages\torch\utils\data\", line 682, in __init__
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 112, in start
    self._popen = self._Popen(self)
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 322, in _Popen
    return Popen(process_obj)
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 89, in __init__
    reduction.dump(process_obj, to_child)
  File "E:\Anaconda\envs\dd\lib\multiprocessing\", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
BrokenPipeError: [Errno 32] Broken pipe

The filename, directory name, or volume label syntax is incorrect.
------------------------  ------------------------------------------------------------------
sys.platform              win32
Python                    3.7.4 (default, Aug  9 2019, 18:34:13) [MSC v.1915 64 bit (AMD64)]
Numpy                     1.16.5
Detectron2 Compiler       MSVC 190024215
Detectron2 CUDA Compiler  10.2
PyTorch                   1.3.1
PyTorch Debug Build       False
torchvision               0.4.2
CUDA available            True
GPU 0                     GeForce GTX 1060 3GB
CUDA_HOME                 C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.2
NVCC                      Not Available
Pillow                    6.2.1
cv2                       3.4.2
------------------------  ------------------------------------------------------------------
PyTorch built with:
  - MSVC 191125547
  - Intel(R) Math Kernel Library Version 2019.0.4 Product Build 20190411 for Intel(R) 64 architecture applications
  - OpenMP 200203
  - CUDA Runtime 10.1
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_50,code=compute_50
  - CuDNN 7.5.1
  - Magma 2.5.0

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

SFraissTU commented, Nov 24, 2020

Hi! I solved this problem by adding if __name__ == "__main__": at the beginning of my script and putting everything else in the body of that if. The script was re-importing and re-running itself in each new process, because Windows starts worker processes with the spawn method. Now I run into memory issues, but that's another problem.
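The fix above is the standard multiprocessing idiom for Windows. Here is a minimal, self-contained sketch of the pattern using plain multiprocessing rather than the asker's Detectron2 script (the worker function and queue are illustrative stand-ins, not part of the original code); it explicitly forces the spawn start method so it exercises the same code path on any platform:

```python
import multiprocessing as mp


def worker(queue):
    # Runs in the child process.
    queue.put("hello from the worker")


def main():
    # Force the 'spawn' start method (the only one available on Windows),
    # so the bug this guard prevents would be reproducible anywhere.
    ctx = mp.get_context("spawn")
    queue = ctx.Queue()
    proc = ctx.Process(target=worker, args=(queue,))
    proc.start()
    result = queue.get()
    proc.join()
    return result


# Without this guard, the spawned child re-imports this module as a
# script, reaches Process.start() again, and the recursion ends in the
# RuntimeError / BrokenPipeError pair shown in the log above.
if __name__ == "__main__":
    print(main())
```

Applied to the failing script, this would mean moving everything from the config setup through trainer.train() into a function called under the guard. An alternative workaround, at the cost of slower data loading, is to set cfg.DATALOADER.NUM_WORKERS = 0 so the DataLoader never starts worker processes.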

gautamchitnis commented, Dec 19, 2019
Read more comments on GitHub >

Top Results From Across the Web

What causes the Broken Pipe Error? - Stack Overflow
The error condition is detected at some point. With a small write, you are inside the MTU of the system, so the message...
Read more >
How to Fix Broken pipe in ... - Java67
These broken pipe exceptions happen when the client (browser) has closed the connection, but the server (your tag) continues to try to write...
Read more >
How I fixed Broken Pipe in Java (Wildfly ...
In simple term, Broken Pipe means that a machine is attempting to read or write data from/to a pipe, while the machine on...
Read more >
bash - How can I fix a Broken Pipe error? - Super User
The write error: Broken pipe message refers to a writing process that tries to write to a pipe with ...
Read more >
How to fix Broken Pipe Error in Linux - net2
Since the process is trying to carry out a write operation to a pipe whose other end has been closed – a broken pipe...
Read more >
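The condition these answers describe is easy to reproduce in a few lines. A minimal sketch (POSIX-only: on Windows, writing to a pipe with no reader surfaces as a different OSError):

```python
import errno
import os

# Create an anonymous pipe, then close the read end so any write to the
# other end has no reader -- the classic broken-pipe condition.
read_fd, write_fd = os.pipe()
os.close(read_fd)

try:
    os.write(write_fd, b"data")
    raised = False
except BrokenPipeError as exc:
    # CPython ignores SIGPIPE by default, so the failed write surfaces
    # as BrokenPipeError (errno 32, EPIPE) instead of killing the process.
    raised = exc.errno == errno.EPIPE
finally:
    os.close(write_fd)

print("broken pipe detected:", raised)
```

In the Detectron2 case, the reader that disappeared was the spawned DataLoader worker, which died during bootstrapping; the parent's pickling write then hit exactly this condition.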
