Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Multi-process isn't working

See original GitHub issue

Hello guys, and thank you for your awesome library! I’m currently struggling to make the test_multi_process_translation.py script work. When it comes to the multi-process part, the following error occurs:

2021-03-04 19:00:46 | INFO | easynmt.EasyNMT | Start multi-process pool on devices: cuda:0
Traceback (most recent call last):
  File "test_multi_process_translation.py", line 80, in <module>
    process_pool = model.start_multi_process_pool()
  File "/data/home/k.kirillova/anaconda/envs/fairseq/lib/python3.7/site-packages/easynmt/EasyNMT.py", line 258, in start_multi_process_pool
    p.start()
  File "/data/home/k.kirillova/anaconda/envs/fairseq/lib/python3.7/multiprocessing/process.py", line 112, in start
    self._popen = self._Popen(self)
  File "/data/home/k.kirillova/anaconda/envs/fairseq/lib/python3.7/multiprocessing/context.py", line 284, in _Popen
    return Popen(process_obj)
  File "/data/home/k.kirillova/anaconda/envs/fairseq/lib/python3.7/multiprocessing/popen_spawn_posix.py", line 32, in __init__
    super().__init__(process_obj)
  File "/data/home/k.kirillova/anaconda/envs/fairseq/lib/python3.7/multiprocessing/popen_fork.py", line 20, in __init__
    self._launch(process_obj)
  File "/data/home/k.kirillova/anaconda/envs/fairseq/lib/python3.7/multiprocessing/popen_spawn_posix.py", line 47, in _launch
    reduction.dump(process_obj, fp)
  File "/data/home/k.kirillova/anaconda/envs/fairseq/lib/python3.7/multiprocessing/reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
AttributeError: Can't pickle local object 'BaseFairseqModel.make_generation_fast_.<locals>.train'

Could you please give me any ideas on how to fix this? Thanks in advance!
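
For context on the error above: with the "spawn" start method (which is in play here, per the popen_spawn_posix frames in the traceback), everything handed to the child process is pickled, and locally defined functions cannot be pickled. The error message itself shows that fairseq's make_generation_fast_ attaches exactly such a local function (train) to the model. Below is a minimal, standalone sketch of the same class of failure; it is not EasyNMT-specific and only illustrates the mechanism.

    import multiprocessing as mp

    def make_worker():
        # A nested (local) function cannot be pickled, just like the
        # 'BaseFairseqModel.make_generation_fast_.<locals>.train' object above.
        def worker():
            print("hello from the child process")
        return worker

    if __name__ == "__main__":
        ctx = mp.get_context("spawn")           # same start method as in the traceback
        p = ctx.Process(target=make_worker())   # the target is pickled when p.start() runs
        p.start()  # AttributeError: Can't pickle local object 'make_worker.<locals>.worker'
        p.join()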

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Comments: 11 (3 by maintainers)

Top GitHub Comments

1 reaction
R4ZZ3 commented, Apr 16, 2021

Some background on why the multiprocessing doesn’t work: https://aws.amazon.com/blogs/compute/parallel-processing-in-python-with-aws-lambda/

1 reaction
nreimers commented, Apr 16, 2021

@PanicButtonPressed The transformer model already uses multiple cores out of the box. So it makes no difference whether you use one process that encodes on two cores (and can, e.g., translate 100 sentences / minute) or two processes that fight over the same compute (and might only encode 40 sentences / minute each).

But on GPU, only one GPU is used per process. If you have two GPUs, each process can use its own GPU.
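
For reference, the multi-GPU setup the comment describes looks roughly like the snippet below, with one worker process per GPU. This is a hedged sketch: only start_multi_process_pool is confirmed by the traceback above; the target_devices keyword and the translate_multi_process / stop_multi_process_pool method names are assumptions based on EasyNMT's documented usage pattern, so verify them against your installed version.

    from easynmt import EasyNMT

    if __name__ == "__main__":
        model = EasyNMT("opus-mt")
        sentences = ["Hallo Welt", "Wie geht es dir?"] * 1000

        # One worker process per GPU; each worker is bound to its own device.
        pool = model.start_multi_process_pool(target_devices=["cuda:0", "cuda:1"])  # assumed kwarg name
        translations = model.translate_multi_process(pool, sentences,               # assumed method name
                                                     source_lang="de", target_lang="en")
        model.stop_multi_process_pool(pool)                                          # assumed method name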

Read more comments on GitHub >

Top Results From Across the Web

Python multiprocessing example not working - Stack Overflow
My guess is that you are using IDLE to try to run this script. Unfortunately, this example will not run correctly in IDLE....
Read more >
3 Multiprocessing Common Errors - Super Fast Python
Common Multiprocessing Errors · Error 1: RuntimeError Starting New Processes · Error 2: print() Does Not Work In Child Processes · Error 3:...
Read more >
Why your multiprocessing Pool is stuck (it's full of sharks!)
On Linux, the default configuration of Python's multiprocessing library can lead to deadlocks and brokenness. Learn why, and how to fix it.
Read more >
Why does this Python code using multiprocessing not work?
Basically no. The multiprocessing module has multiple processes, each isolated from the others except for controlled interaction through queues and shared ...
Read more >
multiprocessing — Process-based parallelism — Python 3.11 ...
Source code: Lib/multiprocessing/ Availability: not Emscripten, not WASI. This module does not work or is not available on WebAssembly platforms ...
Read more >


