`get_var` fails during multiprocess execution if any task uses delayed creation
See original GitHub issue

@schettino72, hey! 😃
So, I found some strange behaviour while trying to implement some stuff.
```python
#! /usr/bin/doit -f
# -*- coding: utf-8 -*-

from doit import get_var
from doit import create_after

get_var('A', None)

def task_foo():
    return {
        'actions': ['echo foo'],
        'task_dep': ['bar'],
    }

@create_after(executed='baz')
def task_bar():
    for i in range(10):
        yield {
            'name': 'bar_{}'.format(i),
            'actions': ['echo bar_{}'.format(i)],
        }

def task_baz():
    for i in range(10):
        yield {
            'name': 'baz_{}'.format(i),
            'actions': ['echo baz_{}'.format(i)],
        }
```
This code works fine when executed in single-process mode, but it fails in multi-process mode:
```
/root> GetVar.py -n 2
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/py3.4/lib/multiprocessing/spawn.py", line 106, in spawn_main
    exitcode = _main(fd)
  File "/py3.4/lib/multiprocessing/spawn.py", line 116, in _main
    self = pickle.load(from_parent)
  File "/root/GetVar.py", line 7, in <module>
    get_var( 'A', None )
  File "/py3.4/lib/site-packages/doit/doit_cmd.py", line 35, in get_var
    return _CMDLINE_VARS.get(name, default)
AttributeError: 'NoneType' object has no attribute 'get'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/py3.4/lib/multiprocessing/spawn.py", line 106, in spawn_main
    exitcode = _main(fd)
  File "/py3.4/lib/multiprocessing/spawn.py", line 116, in _main
    self = pickle.load(from_parent)
  File "/root/GetVar.py", line 7, in <module>
    get_var( 'A', None )
  File "/py3.4/lib/site-packages/doit/doit_cmd.py", line 35, in get_var
    return _CMDLINE_VARS.get(name, default)
AttributeError: 'NoneType' object has no attribute 'get'
```
But if I remove the `create_after` decorator, everything works fine even in multi-process mode. So I find this ambiguous; I have seen no mention of such behaviour in the docs.
Can you confirm whether this is a doit error or not? I currently use the latest (0.30.0) version of doit.
Issue Analytics
- State:
- Created 7 years ago
- Comments: 15 (9 by maintainers)
Top GitHub Comments
Ah, yes, that makes sense, thanks 😃 See the PR. Not sure if this is the optimal way to test it… The OP's test file now works with multiple processes.
Yes, it is Windows-specific. Windows has no “fork”.

On Linux, when a process forks it continues from where it was, copying the memory exactly as in the original process.

On Windows, workers start as new processes from scratch. There is some magic involved, but what matters for us is that all modules are imported again in every process. I guess on Windows `multiprocessing` just tries to load all modules present in the list of modules from the original process.

The issue is that `dodo.py` is loaded dynamically by the doit application, and `get_var` requires some initialization. So only on Windows do the multiprocess workers try to load `dodo.py` without proper initialization (parsing of command-line arguments), and then it fails. Makes sense now?
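The fork-vs-spawn difference described above can be demonstrated in a few lines of plain Python (no doit involved). With the "spawn" start method, the only one available on Windows, each pool worker starts a fresh interpreter and re-imports the main module instead of inheriting the parent's memory:

```python
import multiprocessing as mp

# Any module-level statement here runs once in the parent and then again in
# every spawned child, because "spawn" starts a fresh interpreter that
# re-imports this module. Runtime state built in the parent (such as a
# registry populated by command-line parsing) does not carry over.

def worker_name(_):
    # Executed inside a child process created by the pool.
    return mp.current_process().name

if __name__ == "__main__":
    mp.set_start_method("spawn", force=True)
    with mp.Pool(2) as pool:
        # Each result comes from a separate spawned worker process.
        print(pool.map(worker_name, range(2)))
```

This is why the traceback in the report originates in `multiprocessing/spawn.py`: the child re-runs the top-level `get_var('A', None)` in `dodo.py` before doit's command-line parsing has happened in that process.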