
Python 3.7, Windows: parallelisation does not work

See original GitHub issue

I’m currently tracking down why I can’t get n_jobs=-1 to work for something else, and found the following when trying the examples:

>>> from joblib import Parallel, delayed
>>> from math import modf
>>> Parallel(n_jobs=2)(delayed(modf)(i/2.) for i in range(10))

exception calling callback for <Future at 0x2d4dd8b97b8 state=finished raised BrokenProcessPool>
joblib.externals.loky.process_executor._RemoteTraceback:
'''
Traceback (most recent call last):
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\externals\loky\process_executor.py", line 391, in _process_worker
    call_item = call_queue.get(block=True, timeout=timeout)
  File "c:\program files\python37\lib\multiprocessing\queues.py", line 99, in get
    if not self._rlock.acquire(block, timeout):
PermissionError: [WinError 5] Access is denied
'''

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\externals\loky\_base.py", line 625, in _invoke_callbacks
    callback(self)
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 309, in __call__
    self.parallel.dispatch_next()
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 731, in dispatch_next
    if not self.dispatch_one_batch(self._original_iterator):
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 759, in dispatch_one_batch
    self._dispatch(tasks)
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 716, in _dispatch
    job = self._backend.apply_async(batch, callback=cb)
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\_parallel_backends.py", line 510, in apply_async
    future = self._workers.submit(SafeFunction(func))
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\externals\loky\reusable_executor.py", line 151, in submit
    fn, *args, **kwargs)
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\externals\loky\process_executor.py", line 1022, in submit
    raise self._flags.broken
joblib.externals.loky.process_executor.BrokenProcessPool: A task has failed to un-serialize. 
Please ensure that the arguments of the function are all picklable.
ERROR: The process with PID 11044 (child process of PID 10084) could not be terminated.
Reason: There is no running instance of the task.
ERROR: The process with PID 10084 (child process of PID 10936) could not be terminated.
Reason: There is no running instance of the task.
ERROR: The process "10020" not found.
joblib.externals.loky.process_executor._RemoteTraceback: 
'''
Traceback (most recent call last):
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\externals\loky\process_executor.py", line 391, in _process_worker
    call_item = call_queue.get(block=True, timeout=timeout)
  File "c:\program files\python37\lib\multiprocessing\queues.py", line 99, in get
    if not self._rlock.acquire(block, timeout):
PermissionError: [WinError 5] Access is denied
'''

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 934, in __call__
    self.retrieve()
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 833, in retrieve
    self._output.extend(job.get(timeout=self.timeout))
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\_parallel_backends.py", line 521, in wrap_future_result
    return future.result(timeout=timeout)
  File "c:\program files\python37\lib\concurrent\futures\_base.py", line 432, in result      
    return self.__get_result()
  File "c:\program files\python37\lib\concurrent\futures\_base.py", line 384, in __get_result
    raise self._exception
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\externals\loky\_base.py", line 625, in _invoke_callbacks
    callback(self)
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 309, in __call__
    self.parallel.dispatch_next()
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 731, in dispatch_next
    if not self.dispatch_one_batch(self._original_iterator):
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 759, in dispatch_one_batch
    self._dispatch(tasks)
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 716, in _dispatch
    job = self._backend.apply_async(batch, callback=cb)
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\_parallel_backends.py", line 510, in apply_async
    future = self._workers.submit(SafeFunction(func))
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\externals\loky\reusable_executor.py", line 151, in submit
    fn, *args, **kwargs)
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\externals\loky\process_executor.py", line 1022, in submit
    raise self._flags.broken
joblib.externals.loky.process_executor.BrokenProcessPool: A task has failed to un-serialize. 
Please ensure that the arguments of the function are all picklable.

The errors may vary:

>>> Parallel(n_jobs=2, verbose=10)(delayed(modf)(i/2.) for i in range(10)) 
[Parallel(n_jobs=2)]: Using backend LokyBackend with 2 concurrent workers.
[Parallel(n_jobs=2)]: Done   1 tasks      | elapsed:    0.4s
[Parallel(n_jobs=2)]: Done   2 tasks      | elapsed:    0.4s
[Parallel(n_jobs=2)]: Done   3 tasks      | elapsed:    0.4s
[Parallel(n_jobs=2)]: Done   4 tasks      | elapsed:    0.4s
ERROR: The process "6640" not found.
joblib.externals.loky.process_executor._RemoteTraceback: 
'''
Traceback (most recent call last):
  File "c:\program files\python37\lib\multiprocessing\queues.py", line 109, in get
    self._sem.release()
OSError: [WinError 6] The handle is invalid

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\externals\loky\process_executor.py", line 391, in _process_worker
    call_item = call_queue.get(block=True, timeout=timeout)
  File "c:\program files\python37\lib\multiprocessing\queues.py", line 111, in get
    self._rlock.release()
OSError: [WinError 6] The handle is invalid
'''

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 934, in __call__
    self.retrieve()
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\parallel.py", line 833, in retrieve
    self._output.extend(job.get(timeout=self.timeout))
  File "C:\UsersLocal\nikolaus.waxweiler\Envs\damakerngrouper-py3.7\lib\site-packages\joblib\_parallel_backends.py", line 521, in wrap_future_result
    return future.result(timeout=timeout)
  File "c:\program files\python37\lib\concurrent\futures\_base.py", line 432, in result      
    return self.__get_result()
  File "c:\program files\python37\lib\concurrent\futures\_base.py", line 384, in __get_result
    raise self._exception
joblib.externals.loky.process_executor.BrokenProcessPool: A task has failed to un-serialize. 
Please ensure that the arguments of the function are all picklable.

Python 3.7.3 x64 on Windows 10 1903, joblib 0.13.2.
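Since both failures come from loky's worker processes (pickling work items across process boundaries on Windows), a commonly suggested interim workaround — an editorial suggestion, not something proposed in this thread — is to ask joblib for its thread-based backend, which avoids spawning processes entirely:

```python
from math import modf

from joblib import Parallel, delayed

if __name__ == "__main__":
    # prefer="threads" selects joblib's threading backend, sidestepping the
    # Windows process-handle errors at the cost of running under the GIL,
    # which matters only for CPU-bound work.
    results = Parallel(n_jobs=2, prefer="threads")(
        delayed(modf)(i / 2.0) for i in range(10)
    )
    print(results)
```

The `if __name__ == "__main__":` guard is not needed for the threading backend itself, but it is required whenever a process-based backend is used on Windows, so it is a good habit to keep in any joblib script.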

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 10 (5 by maintainers)

Top GitHub Comments

1 reaction
ogrisel commented, Dec 6, 2019

This fix is included in joblib 0.14.0.
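Given that the fix shipped in joblib 0.14.0, a quick sanity check after upgrading is to compare the installed version against that threshold (a minimal sketch; it only relies on `joblib.__version__`, which the package does expose):

```python
import joblib

# The Windows BrokenProcessPool failures described above were fixed
# upstream in loky and released with joblib 0.14.0, so anything older
# should be upgraded: pip install -U joblib
major, minor = (int(p) for p in joblib.__version__.split(".")[:2])
if (major, minor) < (0, 14):
    print(f"joblib {joblib.__version__} predates the fix; please upgrade")
else:
    print(f"joblib {joblib.__version__} includes the fix")
```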

1 reaction
pierreglaser commented, Sep 9, 2019

@madig I implemented a fix in tomMoral/loky#216, which will hopefully be included in the next joblib release.

Read more comments on GitHub >

Top Results From Across the Web

How to solve multiprocessing stop working problem in Python ...
I have been using multiprocessing.map_async and .apply_async in my data processing project. It worked fine in Python 3.6 until 3.7.1, but when I …
Read more >
multiprocessing — Process-based parallelism — Python 3.11 ...
Source code: Lib/multiprocessing/ Availability: not Emscripten, not WASI. This module does not work or is not available on WebAssembly platforms ...
Read more >
Parallel Processing in Python - A Practical Guide with Examples
Parallel processing is a mode of operation where the task is executed simultaneously in multiple processors in the same computer.
Read more >
Why your multiprocessing Pool is stuck (it's full of sharks!)
Python provides a handy module that allows you to run tasks in a pool of processes, a great way to improve the parallelism...
Read more >
Parallel processing in Pandas - python - GIS Stack Exchange
Your function denom doesn't return anything, and it should return a dataframe given how you use it in the multiprocessing.
Read more >
