Win32: Can't use @parallel: pickle.PicklingError: it's not found as fabric.tasks.inner
Greetings! First of all, thanks a lot for a really great tool. But I have a problem with parallel execution.
My setup:
- ActivePython 2.7.2.5 (ActiveState Software Inc.) based on Python 2.7.2 (default, Jun 24 2011, 12:21:10) [MSC v.1500 32 bit (Intel)] on win32
- Fabric 1.3.3 (installed via `pypm install fabric`)
- WinXP
I have a script which works perfectly with serial execution, but when I add the @parallel decorator to one of the tasks, for example:
@roles('app', 'db')
@parallel
def checkout_or_update(branch):
    path = podman_sources + branch
    if not exists(path):
        with cd(podman_sources):
            run(checkout_command % {"branch": branch})
    else:
        with cd(path):
            run('svn up')
then I get something like this:
$ fab deploy:trunk,staging
[192.168.102.183] Executing task 'checkout_or_update'
[192.168.102.169] Executing task 'checkout_or_update'
[192.168.102.167] Executing task 'checkout_or_update'
Traceback (most recent call last):
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\main.py", line 682, in main
    *args, **kwargs
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\tasks.py", line 248, in execute
task.run(*args, **new_kwargs)
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\tasks.py", line 105, in run
return self.wrapped(*args, **kwargs)
File "D:\Mishail\My Documents\wspace\app\fabfile.py", line 122, in deploy
execute(checkout_or_update, branch)
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\tasks.py", line 237, in execute
exitcodes = jobs.run()
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\job_queue.py", line 124, in run
_advance_the_queue()
File "C:\Documents and Settings\me\Application Data\Python\Python27\site-packages\fabric\job_queue.py", line 114, in _advance_the_queue
job.start()
File "D:\Technologies\Python27\lib\multiprocessing\process.py", line 130, in start
self._popen = Popen(self)
File "D:\Technologies\Python27\lib\multiprocessing\forking.py", line 271, in __init__
dump(process_obj, to_child, HIGHEST_PROTOCOL)
File "D:\Technologies\Python27\lib\multiprocessing\forking.py", line 193, in dump
ForkingPickler(file, protocol).dump(obj)
File "D:\Technologies\Python27\lib\pickle.py", line 224, in dump
self.save(obj)
File "D:\Technologies\Python27\lib\pickle.py", line 331, in save
self.save_reduce(obj=obj, *rv)
File "D:\Technologies\Python27\lib\pickle.py", line 419, in save_reduce
save(state)
File "D:\Technologies\Python27\lib\pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
File "D:\Technologies\Python27\lib\pickle.py", line 649, in save_dict
self._batch_setitems(obj.iteritems())
File "D:\Technologies\Python27\lib\pickle.py", line 681, in _batch_setitems
save(v)
File "D:\Technologies\Python27\lib\pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
File "D:\Technologies\Python27\lib\pickle.py", line 748, in save_global
(obj, module, name))
pickle.PicklingError: Can't pickle <function inner at 0x00F73F70>: it's not found as fabric.tasks.inner
$ Traceback (most recent call last):
File "<string>", line 1, in <module>
File "D:\Technologies\Python27\lib\multiprocessing\forking.py", line 374, in main
self = load(from_parent)
File "D:\Technologies\Python27\lib\pickle.py", line 1378, in load
return Unpickler(file).load()
File "D:\Technologies\Python27\lib\pickle.py", line 858, in load
dispatch[key](self)
File "D:\Technologies\Python27\lib\pickle.py", line 880, in load_eof
raise EOFError
EOFError
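The two tracebacks above are two sides of the same failure. On Windows there is no fork(), so multiprocessing spawns a fresh interpreter and pickles the Process object, including the task function, through a pipe to the child. Fabric's decorators wrap the task in a nested function named `inner`, and pickle serializes functions by reference (module plus name), so `inner` cannot be looked up and the parent raises PicklingError; the child then reads a truncated pipe and dies with EOFError. A minimal sketch reproducing both errors with a stand-in decorator (`parallel_like` is a hypothetical substitute for Fabric's @parallel, not Fabric code):

```python
import io
import pickle

def parallel_like(func):
    # Stand-in for a wrapping decorator such as Fabric's @parallel:
    # it returns a nested function, which pickle cannot serialize.
    def inner(*args, **kwargs):
        return func(*args, **kwargs)
    return inner

@parallel_like
def checkout_or_update(branch):
    return branch

# Parent side: pickling the wrapped task fails, because pickle stores
# functions by reference (module + qualified name) and the nested
# 'inner' is not reachable that way.
payload = b""
try:
    payload = pickle.dumps(checkout_or_update)
except Exception as exc:  # PicklingError on Python 2, AttributeError on 3
    print(type(exc).__name__)

# Child side: the spawned process tries to unpickle its Process object
# from the pipe, but the parent died before writing anything, so the
# stream is empty and load() raises EOFError.
try:
    pickle.load(io.BytesIO(payload))
except EOFError:
    print("EOFError")
```

Moving the function to module level does not help here, because it is the decorator-produced wrapper, not the user's task, that must cross the process boundary.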
Issue Analytics
- State: Closed
- Created: 12 years ago
- Comments: 23 (7 by maintainers)
Top GitHub Comments
@bitprophet , I am using Fabric 1.10 and python 2.7 on Windows 7. When I use the “parallel” decorator, I am getting the above mentioned PicklingError. I was wondering if this issue has been resolved as this ticket has been closed.
“Make sure things are now thread safe” was done, and the result is Fabric v2, available now in alpha.
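As a side note on why the error is Windows-specific (an editor's sketch, not from the thread): POSIX platforms default to fork(), which clones the parent process wholesale and never needs to pickle the task, while Windows only supports spawn, which must pickle everything sent to the child. On Python 3.4+ the active strategy can be inspected:

```python
import multiprocessing

# 'fork' on most POSIX systems (the task is inherited, not pickled);
# 'spawn' on Windows and on macOS from Python 3.8, where the task and
# its arguments must be picklable to cross the process boundary.
print(multiprocessing.get_start_method())
```

Under 'fork', Fabric 1.x's unpicklable wrapped tasks work in parallel mode; under 'spawn', they cannot be sent to the worker at all.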