Can't use multiprocessing module in IPython

See original GitHub issue.

I am having issues with the multiprocessing module when using IPython. It works fine running in my console. Simple example:
from multiprocessing import Pool

def f(x):
    return x*x

if __name__ == '__main__':
    p = Pool(5)
    print(p.map(f, [1, 2, 3]))
Running this in IPython, each worker fails with the same error (the output below is from SpawnPoolWorker-1; the interleaved tracebacks from the other workers are identical):

Process SpawnPoolWorker-1:
Traceback (most recent call last):
  File "C:\Users\kinga\Anaconda3\lib\multiprocessing\process.py", line 258, in _bootstrap
    self.run()
  File "C:\Users\kinga\Anaconda3\lib\multiprocessing\process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\kinga\Anaconda3\lib\multiprocessing\pool.py", line 108, in worker
    task = get()
  File "C:\Users\kinga\Anaconda3\lib\multiprocessing\queues.py", line 337, in get
    return _ForkingPickler.loads(res)
AttributeError: Can't get attribute 'f' on <module '__main__' (built-in)>
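A common workaround (a sketch, not from the original issue): the "spawn" start method used on Windows re-imports `__main__` in each worker instead of copying the interactive session, so a function defined in IPython cannot be unpickled there. Moving the function into an importable module fixes this; the snippet below simulates that by writing `f` to a hypothetical module file `workers.py` and importing it:

```python
# Sketch of the "put the function in a module" workaround for spawn-based
# multiprocessing. Assumes nothing beyond the standard library; the module
# name "workers" is made up for this example.
import os
import sys
import tempfile
from multiprocessing import get_context

# Write the worker function to a real module file so child processes can
# resolve it as workers.f instead of looking it up on __main__.
module_src = "def f(x):\n    return x * x\n"
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "workers.py"), "w") as fh:
    fh.write(module_src)
sys.path.insert(0, tmpdir)

import workers  # now picklable: referenced by module name, not __main__

if __name__ == "__main__":
    with get_context("spawn").Pool(2) as p:
        print(p.map(workers.f, [1, 2, 3]))  # [1, 4, 9]
```

In a real project you would simply keep `f` in a `.py` file next to your notebook and import it; the on-the-fly file writing above is only to make the sketch self-contained.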
Issue Analytics
- State:
- Created 6 years ago
- Comments: 12 (1 by maintainers)
Top Results From Across the Web

Jupyter notebook never finishes processing using ...
I am using the book by Dusty Phillips and this code belongs to it. import multiprocessing import random from multiprocessing.pool import Pool ...

Multiprocessing in Python on Windows and Jupyter/Ipython
Basically, it consists of two steps: first, create a function, and then use multiple processors to execute the function in parallel. #import Pool...

Jupyter notebook multiprocessing
Multiprocessing in Python is a package we can use with Python to spawn processes using an API that is much like the threading...

Why Can't Jupyter Notebooks Handle Multiprocessing on ...
From what I've read on the issue, it turns out that the issue is largely because of the ipython shell that jupyter uses....

multiprocessing — Process-based parallelism — Python 3.11 ...
multiprocessing is a package that supports spawning processes using an API similar to the threading module. The multiprocessing package offers both local ...
I think that’s a limitation of multiprocessing, especially on Windows. With the "spawn" start method, worker processes re-import `__main__` rather than inheriting the parent's state, so functions defined interactively in IPython can't be found by the unpickler. This is covered in the multiprocessing docs:
https://docs.python.org/3/library/multiprocessing.html#using-a-pool-of-workers
This should be stated explicitly in the documentation or in tutorial examples.
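For what it's worth, interactively defined functions do work with the "fork" start method, because fork copies the parent interpreter's memory instead of re-importing `__main__`. This is Unix-only (fork is unavailable on Windows), so it is a platform-specific sketch rather than a general fix:

```python
# Sketch: using the "fork" start method so that f is visible to workers
# even when defined interactively. Unix only; not available on Windows.
import multiprocessing as mp

def f(x):
    return x * x

if __name__ == "__main__":
    # fork copies the parent's state, so the unpickler never needs to
    # look f up on a freshly imported __main__.
    ctx = mp.get_context("fork")
    with ctx.Pool(3) as p:
        print(p.map(f, [1, 2, 3]))  # [1, 4, 9]
```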