
Multiple different errors while creating the Dask LocalCluster


What happened

The local cluster failed to be created. I got different errors depending on whether or not I passed an extra worker argument.

Case 1: without worker parameter

When I run the following code:

from dask.distributed import LocalCluster
cluster = LocalCluster()

Four kinds of error messages are shown, and the local cluster fails to be created.

Error type A

tornado.application - ERROR - Exception in callback <bound method Nanny.memory_monitor of <Nanny: None, threads: 4>>
Traceback (most recent call last):
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/tornado/ioloop.py", line 907, in _run
    return self.callback()
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/nanny.py", line 414, in memory_monitor
    process = self.process.process
AttributeError: 'NoneType' object has no attribute 'process'

Error type B

tornado.application - ERROR - Exception in callback functools.partial(<bound method IOLoop._discard_future_result of <zmq.eventloop.ioloop.ZMQIOLoop object at 0x7fb340c0df90>>, <Task finished coro=<Nanny._on_exit() done, defined at /home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/nanny.py:440> exception=TypeError('addresses should be strings or tuples, got None')>)
Traceback (most recent call last):
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/tornado/ioloop.py", line 743, in _run_callback
    ret = callback()
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/tornado/ioloop.py", line 767, in _discard_future_result
    future.result()
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/nanny.py", line 443, in _on_exit
    await self.scheduler.unregister(address=self.worker_address)
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/core.py", line 861, in send_recv_from_rpc
    result = await send_recv(comm=comm, op=key, **kwargs)
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/core.py", line 660, in send_recv
    raise exc.with_traceback(tb)
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/core.py", line 513, in handle_comm
    result = await result
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/scheduler.py", line 2208, in remove_worker
    address = self.coerce_address(address)
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/scheduler.py", line 4946, in coerce_address
    raise TypeError("addresses should be strings or tuples, got %r" % (addr,))
TypeError: addresses should be strings or tuples, got None

Error type C

distributed.utils - ERROR - addresses should be strings or tuples, got None
Traceback (most recent call last):
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/utils.py", line 656, in log_errors
    yield
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/scheduler.py", line 2208, in remove_worker
    address = self.coerce_address(address)
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/scheduler.py", line 4946, in coerce_address
    raise TypeError("addresses should be strings or tuples, got %r" % (addr,))
TypeError: addresses should be strings or tuples, got None

Error type D

distributed.core - ERROR - addresses should be strings or tuples, got None
Traceback (most recent call last):
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/core.py", line 513, in handle_comm
    result = await result
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/scheduler.py", line 2208, in remove_worker
    address = self.coerce_address(address)
  File "/home/idblab/anaconda3/envs/jupyter/lib/python3.7/site-packages/distributed/scheduler.py", line 4946, in coerce_address
    raise TypeError("addresses should be strings or tuples, got %r" % (addr,))
TypeError: addresses should be strings or tuples, got None

Full logs

http://paste.debian.net/1158552/

Case 2: with worker parameter

When I pass the extra worker parameter local_directory to LocalCluster(), as in the following code:

from dask.distributed import LocalCluster
cluster = LocalCluster(local_directory="/tmp/dask-worker-space")

Now only Error type A above is shown, and the local cluster still fails to be created.

Full logs

http://paste.debian.net/1158551/
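Error type A originates in the Nanny, the supervisor that launches each Dask worker in a separate OS process. One common (though not confirmed here) cause of worker subprocesses failing to start is creating process pools at module top level without a main guard. As a general standard-library illustration (not Dask-specific, and purely an analogue of what the Nanny does):

```python
# Sketch: process-based workers must be startable without re-running the
# module's top-level startup code, which is why a main guard is required
# on spawn-based platforms. `square` and `run_pool` are illustrative names.
import multiprocessing as mp


def square(x):
    return x * x


def run_pool():
    # Launch two worker processes, analogous to Nanny-managed workers.
    with mp.Pool(processes=2) as pool:
        return pool.map(square, [1, 2, 3])


if __name__ == "__main__":
    results = run_pool()  # results == [1, 4, 9]
```

On Linux (the reporter's platform) workers are forked by default, so the guard is less often the culprit there, but it is still the recommended structure for any script that starts worker processes.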

What you expected to happen

  • This is my first time using Dask; it has never worked for me, so I don't know what the expected behaviour is.

Minimal Complete Verifiable Example

Case 1: without worker parameter

from dask.distributed import LocalCluster
cluster = LocalCluster()

Case 2: with worker parameter

from dask.distributed import LocalCluster
cluster = LocalCluster(local_directory="/tmp/dask-worker-space")
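Since both failure modes surface while the Nanny is launching worker processes, a commonly suggested workaround (a sketch, not confirmed as the fix for this issue) is to run the cluster with threaded, in-process workers, which bypasses the Nanny entirely:

```python
# Workaround sketch: processes=False runs workers as threads inside the
# main process, so no Nanny subprocesses are spawned at all.
from dask.distributed import Client, LocalCluster

cluster = LocalCluster(processes=False, dashboard_address=None)
client = Client(cluster)
result = client.submit(lambda x: x + 1, 1).result()
client.close()
cluster.close()
```

This does not explain the root cause, but it helps isolate whether the problem lies in subprocess startup rather than in the scheduler itself.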

Anything else we need to know?

  • Potentially related issue: #3955 (same as Error type A)

Environment

  • Dask version: 2.20.0
    (jupyter) idblab@debian-20200402:~$ conda list
    # packages in environment at /home/idblab/anaconda3/envs/jupyter:
    #
    # Name                    Version                   Build  Channel
    dask                      2.20.0                     py_0  
    dask-core                 2.20.0                     py_0
    distributed               2.20.0                   py37_0
    
  • Python version: Python 3.7.7
  • Operating System: Linux debian-20200402 5.7.0-2-amd64 #1 SMP Debian 5.7.10-1 (2020-07-26) x86_64 GNU/Linux
  • Install method (conda, pip, source): conda install dask distributed -c conda-forge

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 20 (8 by maintainers)

Top GitHub Comments

1 reaction
mrocklin commented, Sep 15, 2020

I’m glad to hear it. I’m going to close this for now in hopes that the issue is resolved. We’ll reopen if this still persists for someone on latest release.

1 reaction
hisplan commented, Sep 2, 2020

Okay. Will try and let you know. Thanks.

