
Incompatible with dask in processes scheduler

See original GitHub issue
from pyforest import *   # lazily provides common imports such as np
import dask

def inc1(a):
    return a + 1          # plain Python, no lazy import involved

def inc2(a):
    return np.add(a, 1)   # relies on pyforest's lazy np binding

print(dask.compute(map(dask.delayed(inc1), [1, 2, 3]), scheduler='processes')[0])
print(dask.compute(map(dask.delayed(inc2), [1, 2, 3]), scheduler='processes')[0])

inc1 works, but inc2 doesn’t.
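The processes scheduler has to pickle each task and ship it to a worker process, so every function (and every global it references) must survive a pickle round trip. A minimal stdlib sketch of the happy path, with no dask or pyforest involved; `inc1` here mirrors the function from the report:

```python
import pickle

def inc1(a):
    # A plain top-level function references nothing unusual, so pickle
    # can store it by qualified name and reload it in another process.
    return a + 1

clone = pickle.loads(pickle.dumps(inc1))
print(clone(1))  # 2
```

`inc2` breaks this contract: its global `np` is pyforest's lazy placeholder rather than the real numpy module, and that placeholder is what fails to pickle.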

Issue Analytics

  • State: open
  • Created 4 years ago
  • Comments:5 (1 by maintainers)

Top GitHub Comments

RafeyIqbalRahman commented, Jul 23, 2020

@lumyuto I ran your code on Google Colab and got the following error:

PicklingError: Could not pickle object as excessively deep recursion required.

This error is related to the Dask library itself.
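The "excessively deep recursion" message is characteristic of attribute-proxy objects: a proxy whose `__getattr__` delegates every lookup can re-enter itself when pickle (or anything else) probes its attributes. The following is a generic stdlib illustration of that failure mode, not pyforest's actual implementation:

```python
class BadProxy:
    """Hypothetical lazy-import proxy (illustration only, not pyforest's code)."""
    def __getattr__(self, name):
        # The delegation target `_target` was never set, so looking it up
        # re-enters __getattr__ and recurses without bound.
        return getattr(self._target, name)

try:
    BadProxy().add  # any attribute access triggers the recursion
except RecursionError:
    print("RecursionError")
```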

FlorianWetschoreck commented, Dec 4, 2019

Thank you for the bug report. We discussed this internally but had no idea how to even start working on this problem. If you can come up with a solution, that would be great.

For now, we will flag this as “help wanted” and see if someone comes up with a solution.

In the meantime, you can also just import numpy explicitly, because the explicit import will overwrite pyforest’s binding. So, you should be able to use

from pyforest import *
import numpy as np

Does this work for you?
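The suggested workaround relies on Python name binding being last-writer-wins: the explicit `import numpy as np` rebinds `np` from pyforest's placeholder to the real module object, which pickles by reference. A stdlib sketch of the same mechanism, using a dummy class for the placeholder and `math` as a stand-in for numpy (both are assumptions for illustration):

```python
class _LazyPlaceholder:
    """Stand-in for the object a star-import might bind to `np`."""

np = _LazyPlaceholder()    # what `from pyforest import *` leaves behind
print(type(np).__name__)   # _LazyPlaceholder

import math as np          # an explicit import rebinds the same name
print(type(np).__name__)   # module
```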


Top Results From Across the Web

  • Stalling Tasks? · Issue #5879 · dask/distributed - GitHub
  • Running process scheduler in Dask distributed - Stack Overflow
  • Futures - Dask documentation
  • Understanding Dask Architecture: Client, Scheduler, Workers
  • Distribute Experiments Across Machines - garage
