
can't pickle local object when calling to_hdf5 with dask.distributed


I’m playing around with the to_hdf5 command, following the steps shown here: http://dask.pydata.org/en/latest/array-creation.html

When I try to save the dask array to an HDF5 file, I get the following error:

distributed.protocol.pickle - INFO - Failed to serialize (<function insert_to_ooc.<locals>.store at 0x7f72d2d1cc80>, (<function apply at 0x7f72f41b6840>, <function partial_by_order at 0x7f72d4787b70>, [(<function arange at 0x7f72e4044bf8>, 0, 3, 1, 3, dtype('int64'))], {'function': <built-in function pow>, 'other': [(1, 2)]}), (slice(0, 3, None),), <unlocked _thread.lock object at 0x7f72d2c25e40>, None)
Traceback (most recent call last):
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/pickle.py", line 41, in dumps
    result = pickle.dumps(x, protocol=pickle.HIGHEST_PROTOCOL)
AttributeError: Can't pickle local object 'insert_to_ooc.<locals>.store'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/pickle.py", line 54, in dumps
    return cloudpickle.dumps(x, protocol=pickle.HIGHEST_PROTOCOL)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 706, in dumps
    cp.dump(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 146, in dump
    return Pickler.dump(self, obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 409, in dump
    self.save(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 751, in save_tuple
    save(element)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 270, in save_function
    self.save_function_tuple(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 312, in save_function_tuple
    save((code, closure, base_globals))
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 736, in save_tuple
    save(element)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 781, in save_list
    self._batch_appends(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 808, in _batch_appends
    save(tmp[0])
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 521, in save
    self.save_reduce(obj=obj, *rv)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 604, in save_reduce
    save(state)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 821, in save_dict
    self._batch_setitems(obj.items())
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 847, in _batch_setitems
    save(v)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 496, in save
    rv = reduce(self.proto)
TypeError: can't pickle h5py.h5d.DatasetID objects
distributed.protocol.core - CRITICAL - Failed to Serialize
Traceback (most recent call last):
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/pickle.py", line 41, in dumps
    result = pickle.dumps(x, protocol=pickle.HIGHEST_PROTOCOL)
AttributeError: Can't pickle local object 'insert_to_ooc.<locals>.store'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/core.py", line 47, in dumps
    for key, value in data.items()
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/core.py", line 48, in <dictcomp>
    if type(value) is Serialize}
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/serialize.py", line 130, in serialize
    header, frames = {}, [pickle.dumps(x)]
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/pickle.py", line 54, in dumps
    return cloudpickle.dumps(x, protocol=pickle.HIGHEST_PROTOCOL)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 706, in dumps
    cp.dump(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 146, in dump
    return Pickler.dump(self, obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 409, in dump
    self.save(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 751, in save_tuple
    save(element)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 270, in save_function
    self.save_function_tuple(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 312, in save_function_tuple
    save((code, closure, base_globals))
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 736, in save_tuple
    save(element)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 781, in save_list
    self._batch_appends(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 808, in _batch_appends
    save(tmp[0])
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 521, in save
    self.save_reduce(obj=obj, *rv)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 604, in save_reduce
    save(state)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 821, in save_dict
    self._batch_setitems(obj.items())
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 847, in _batch_setitems
    save(v)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 496, in save
    rv = reduce(self.proto)
TypeError: can't pickle h5py.h5d.DatasetID objects
distributed.comm.utils - INFO - Unserializable Message: [{'op': 'update-graph', 'tasks': {"('arange-pow-store-pow-962eaabdc778ca6372a23fde28e7916c', 0)": <Serialize: (<function insert_to_ooc.<locals>.store at 0x7f72d2d1cc80>, (<function apply at 0x7f72f41b6840>, <function partial_by_order at 0x7f72d4787b70>, [(<function arange at 0x7f72e4044bf8>, 0, 3, 1, 3, dtype('int64'))], {'function': <built-in function pow>, 'other': [(1, 2)]}), (slice(0, 3, None),), <unlocked _thread.lock object at 0x7f72d2c25e40>, None)>, "('store-pow-962eaabdc778ca6372a23fde28e7916c', 0)": <Serialize: ('arange-pow-store-pow-962eaabdc778ca6372a23fde28e7916c', 0)>, "('arange-pow-store-pow-962eaabdc778ca6372a23fde28e7916c', 1)": <Serialize: (<function insert_to_ooc.<locals>.store at 0x7f72d2d1cc80>, (<function apply at 0x7f72f41b6840>, <function partial_by_order at 0x7f72d4787b70>, [(<function arange at 0x7f72e4044bf8>, 3, 6, 1, 3, dtype('int64'))], {'function': <built-in function pow>, 'other': [(1, 2)]}), (slice(3, 6, None),), <unlocked _thread.lock object at 0x7f72d2c25e40>, None)>, "('store-pow-962eaabdc778ca6372a23fde28e7916c', 1)": <Serialize: ('arange-pow-store-pow-962eaabdc778ca6372a23fde28e7916c', 1)>}, 'dependencies': {"('arange-pow-store-pow-962eaabdc778ca6372a23fde28e7916c', 0)": [], "('store-pow-962eaabdc778ca6372a23fde28e7916c', 0)": ["('arange-pow-store-pow-962eaabdc778ca6372a23fde28e7916c', 0)"], "('arange-pow-store-pow-962eaabdc778ca6372a23fde28e7916c', 1)": [], "('store-pow-962eaabdc778ca6372a23fde28e7916c', 1)": ["('arange-pow-store-pow-962eaabdc778ca6372a23fde28e7916c', 1)"]}, 'keys': ["('store-pow-962eaabdc778ca6372a23fde28e7916c', 1)", "('store-pow-962eaabdc778ca6372a23fde28e7916c', 0)"], 'restrictions': {}, 'loose_restrictions': None, 'priority': {"('store-pow-962eaabdc778ca6372a23fde28e7916c', 0)": 0, "('arange-pow-store-pow-962eaabdc778ca6372a23fde28e7916c', 0)": 1, "('store-pow-962eaabdc778ca6372a23fde28e7916c', 1)": 2, "('arange-pow-store-pow-962eaabdc778ca6372a23fde28e7916c', 1)": 3}, 'resources': None}]
distributed.comm.utils - ERROR - can't pickle h5py.h5d.DatasetID objects
Traceback (most recent call last):
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/pickle.py", line 41, in dumps
    result = pickle.dumps(x, protocol=pickle.HIGHEST_PROTOCOL)
AttributeError: Can't pickle local object 'insert_to_ooc.<locals>.store'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/comm/utils.py", line 16, in to_frames
    return list(protocol.dumps(msg))
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/core.py", line 47, in dumps
    for key, value in data.items()
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/core.py", line 48, in <dictcomp>
    if type(value) is Serialize}
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/serialize.py", line 130, in serialize
    header, frames = {}, [pickle.dumps(x)]
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/pickle.py", line 54, in dumps
    return cloudpickle.dumps(x, protocol=pickle.HIGHEST_PROTOCOL)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 706, in dumps
    cp.dump(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 146, in dump
    return Pickler.dump(self, obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 409, in dump
    self.save(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 751, in save_tuple
    save(element)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 270, in save_function
    self.save_function_tuple(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 312, in save_function_tuple
    save((code, closure, base_globals))
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 736, in save_tuple
    save(element)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 781, in save_list
    self._batch_appends(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 808, in _batch_appends
    save(tmp[0])
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 521, in save
    self.save_reduce(obj=obj, *rv)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 604, in save_reduce
    save(state)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 821, in save_dict
    self._batch_setitems(obj.items())
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 847, in _batch_setitems
    save(v)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 496, in save
    rv = reduce(self.proto)
TypeError: can't pickle h5py.h5d.DatasetID objects
distributed.batched - ERROR - Error in batched write
Traceback (most recent call last):
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/pickle.py", line 41, in dumps
    result = pickle.dumps(x, protocol=pickle.HIGHEST_PROTOCOL)
AttributeError: Can't pickle local object 'insert_to_ooc.<locals>.store'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/batched.py", line 85, in _background_send
    nbytes = yield self.comm.write(payload)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/tornado/gen.py", line 1015, in run
    value = future.result()
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/tornado/concurrent.py", line 237, in result
    raise_exc_info(self._exc_info)
  File "<string>", line 3, in raise_exc_info
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/tornado/gen.py", line 270, in wrapper
    result = func(*args, **kwargs)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/types.py", line 248, in wrapped
    coro = func(*args, **kwargs)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/comm/tcp.py", line 171, in write
    frames = [ensure_bytes(f) for f in to_frames(msg)]
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/comm/utils.py", line 16, in to_frames
    return list(protocol.dumps(msg))
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/core.py", line 47, in dumps
    for key, value in data.items()
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/core.py", line 48, in <dictcomp>
    if type(value) is Serialize}
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/serialize.py", line 130, in serialize
    header, frames = {}, [pickle.dumps(x)]
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/distributed/protocol/pickle.py", line 54, in dumps
    return cloudpickle.dumps(x, protocol=pickle.HIGHEST_PROTOCOL)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 706, in dumps
    cp.dump(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 146, in dump
    return Pickler.dump(self, obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 409, in dump
    self.save(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 751, in save_tuple
    save(element)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 270, in save_function
    self.save_function_tuple(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 312, in save_function_tuple
    save((code, closure, base_globals))
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 736, in save_tuple
    save(element)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 781, in save_list
    self._batch_appends(obj)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 808, in _batch_appends
    save(tmp[0])
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 521, in save
    self.save_reduce(obj=obj, *rv)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/site-packages/cloudpickle/cloudpickle.py", line 604, in save_reduce
    save(state)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 476, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 821, in save_dict
    self._batch_setitems(obj.items())
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 847, in _batch_setitems
    save(v)
  File "/home/tangy/anaconda3/envs/ipykernel_py3/lib/python3.6/pickle.py", line 496, in save
    rv = reduce(self.proto)
TypeError: can't pickle h5py.h5d.DatasetID objects

Here is the code:

import dask.array as da
import numpy as np
import distributed

client = distributed.Client()

x = da.arange(6, chunks=3)
y = x ** 2
np.array(y)   # converting to a NumPy array works
y.compute()   # computing the result works too
da.to_hdf5('myfile.hdf5', '/y', y)   # this call fails with the error above
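
A common workaround, sketched here as an assumption on my part rather than anything confirmed in this thread, is to keep the store step on the local threaded scheduler: open h5py handles cannot be pickled, and the threaded scheduler never serializes tasks, so nothing has to cross a process boundary.

import dask
import dask.array as da

x = da.arange(6, chunks=3)
y = x ** 2

# Run only the HDF5 write on the threaded scheduler so the h5py file
# handle never has to be pickled and shipped to a distributed worker.
# (dask.config.set is the current API; very old releases used
# dask.set_options instead.)
with dask.config.set(scheduler="threads"):
    da.to_hdf5("myfile.hdf5", "/y", y)

The trade-off is that the computation feeding the write also runs locally inside the block, not on the cluster.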

I am running in a conda environment with Python 3.6; the scheduler, workers, and client all run within this environment.

Issue Analytics

  • State: open
  • Created: 7 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

1 reaction
rafgonsi commented, Oct 27, 2021

Unfortunately, the issue still exists.

When I run the code

import dask.array as da

darr = da.random.randint(0, 10, size=(100, 500), chunks=(10, 20), dtype="int16")
darr.to_hdf5("tmp.h5", "/x")

it works fine when the distributed scheduler is disabled, but crashes when I enable it. Here is the code that enables the scheduler:

from dask.distributed import Client
c = Client(set_as_default=True)
Here is the error raised by the to_hdf5() method:

distributed.protocol.core - CRITICAL - Failed to Serialize
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 76, in dumps
    frames[0] = msgpack.dumps(msg, default=_encode_default, use_bin_type=True)
  File "/opt/conda/lib/python3.8/site-packages/msgpack/__init__.py", line 35, in packb
    return Packer(**kwargs).pack(o)
  File "msgpack/_packer.pyx", line 292, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 298, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 295, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 285, in msgpack._cmsgpack.Packer._pack
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 57, in _encode_default
    sub_header, sub_frames = serialize_and_split(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 425, in serialize_and_split
    header, frames = serialize(x, serializers, on_error, context)
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 252, in serialize
    return serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 298, in serialize
    headers_frames = [
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 299, in <listcomp>
    serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 349, in serialize
    raise TypeError(msg, str(x)[:10000])
TypeError: ('Could not serialize object of type Dataset.', '<HDF5 dataset "x": shape (100, 500), type "<i2">')
distributed.comm.utils - ERROR - ('Could not serialize object of type Dataset.', '<HDF5 dataset "x": shape (100, 500), type "<i2">')
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/distributed/comm/utils.py", line 33, in _to_frames
    return list(protocol.dumps(msg, **kwargs))
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 76, in dumps
    frames[0] = msgpack.dumps(msg, default=_encode_default, use_bin_type=True)
  File "/opt/conda/lib/python3.8/site-packages/msgpack/__init__.py", line 35, in packb
    return Packer(**kwargs).pack(o)
  File "msgpack/_packer.pyx", line 292, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 298, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 295, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 285, in msgpack._cmsgpack.Packer._pack
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 57, in _encode_default
    sub_header, sub_frames = serialize_and_split(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 425, in serialize_and_split
    header, frames = serialize(x, serializers, on_error, context)
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 252, in serialize
    return serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 298, in serialize
    headers_frames = [
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 299, in <listcomp>
    serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 349, in serialize
    raise TypeError(msg, str(x)[:10000])
TypeError: ('Could not serialize object of type Dataset.', '<HDF5 dataset "x": shape (100, 500), type "<i2">')
distributed.batched - ERROR - Error in batched write
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/distributed/batched.py", line 93, in _background_send
    nbytes = yield self.comm.write(
  File "/opt/conda/lib/python3.8/site-packages/tornado/gen.py", line 762, in run
    value = future.result()
  File "/opt/conda/lib/python3.8/site-packages/distributed/comm/tcp.py", line 243, in write
    frames = await to_frames(
  File "/opt/conda/lib/python3.8/site-packages/distributed/comm/utils.py", line 50, in to_frames
    return _to_frames()
  File "/opt/conda/lib/python3.8/site-packages/distributed/comm/utils.py", line 33, in _to_frames
    return list(protocol.dumps(msg, **kwargs))
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 76, in dumps
    frames[0] = msgpack.dumps(msg, default=_encode_default, use_bin_type=True)
  File "/opt/conda/lib/python3.8/site-packages/msgpack/__init__.py", line 35, in packb
    return Packer(**kwargs).pack(o)
  File "msgpack/_packer.pyx", line 292, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 298, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 295, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 285, in msgpack._cmsgpack.Packer._pack
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 57, in _encode_default
    sub_header, sub_frames = serialize_and_split(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 425, in serialize_and_split
    header, frames = serialize(x, serializers, on_error, context)
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 252, in serialize
    return serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 298, in serialize
    headers_frames = [
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 299, in <listcomp>
    serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 349, in serialize
    raise TypeError(msg, str(x)[:10000])
TypeError: ('Could not serialize object of type Dataset.', '<HDF5 dataset "x": shape (100, 500), type "<i2">')
---------------------------------------------------------------------------
CancelledError                            Traceback (most recent call last)
/tmp/ipykernel_58/1886317945.py in <module>
      2 
      3 darr = da.random.randint(0, 10, size=(100, 500), chunks=(10, 20), dtype="int16")
----> 4 darr.to_hdf5("tmp.h5", "/x")

/opt/conda/lib/python3.8/site-packages/dask/array/core.py in to_hdf5(self, filename, datapath, **kwargs)
   1594         h5py.File.create_dataset
   1595         """
-> 1596         return to_hdf5(filename, datapath, self, **kwargs)
   1597 
   1598     def to_dask_dataframe(self, columns=None, index=None, meta=None):

/opt/conda/lib/python3.8/site-packages/dask/array/core.py in to_hdf5(filename, *args, **kwargs)
   4929             for dp, x in data.items()
   4930         ]
-> 4931         store(list(data.values()), dsets)
   4932 
   4933 

/opt/conda/lib/python3.8/site-packages/dask/array/core.py in store(sources, targets, lock, regions, compute, return_stored, **kwargs)
   1041     else:
   1042         if compute:
-> 1043             compute_as_if_collection(Array, store_dsk, store_keys, **kwargs)
   1044             return None
   1045         else:

/opt/conda/lib/python3.8/site-packages/dask/base.py in compute_as_if_collection(cls, dsk, keys, scheduler, get, **kwargs)
    313     schedule = get_scheduler(scheduler=scheduler, cls=cls, get=get)
    314     dsk2 = optimization_function(cls)(dsk, keys, **kwargs)
--> 315     return schedule(dsk2, keys, **kwargs)
    316 
    317 

/opt/conda/lib/python3.8/site-packages/distributed/client.py in get(self, dsk, keys, workers, allow_other_workers, resources, sync, asynchronous, direct, retries, priority, fifo_timeout, actors, **kwargs)
   2687                     should_rejoin = False
   2688             try:
-> 2689                 results = self.gather(packed, asynchronous=asynchronous, direct=direct)
   2690             finally:
   2691                 for f in futures.values():

/opt/conda/lib/python3.8/site-packages/distributed/client.py in gather(self, futures, errors, direct, asynchronous)
   1964             else:
   1965                 local_worker = None
-> 1966             return self.sync(
   1967                 self._gather,
   1968                 futures,

/opt/conda/lib/python3.8/site-packages/distributed/client.py in sync(self, func, asynchronous, callback_timeout, *args, **kwargs)
    858             return future
    859         else:
--> 860             return sync(
    861                 self.loop, func, *args, callback_timeout=callback_timeout, **kwargs
    862             )

/opt/conda/lib/python3.8/site-packages/distributed/utils.py in sync(loop, func, callback_timeout, *args, **kwargs)
    324     if error[0]:
    325         typ, exc, tb = error[0]
--> 326         raise exc.with_traceback(tb)
    327     else:
    328         return result[0]

/opt/conda/lib/python3.8/site-packages/distributed/utils.py in f()
    307             if callback_timeout is not None:
    308                 future = asyncio.wait_for(future, callback_timeout)
--> 309             result[0] = yield future
    310         except Exception:
    311             error[0] = sys.exc_info()

/opt/conda/lib/python3.8/site-packages/tornado/gen.py in run(self)
    760 
    761                     try:
--> 762                         value = future.result()
    763                     except Exception:
    764                         exc_info = sys.exc_info()

/opt/conda/lib/python3.8/site-packages/distributed/client.py in _gather(self, futures, errors, direct, local_worker)
   1830                         else:
   1831                             raise exception.with_traceback(traceback)
-> 1832                         raise exc
   1833                     if errors == "skip":
   1834                         bad_keys.add(key)

CancelledError: ('store-9017f62a-3721-11ec-8752-0242ac110002', 7, 2)

The error is still there when I use da.store() instead of to_hdf5(), as @mrocklin suggested:

import dask.array as da
import h5py

f = h5py.File('myfile.hdf5', mode='a')
x = da.random.randint(0, 10, size=(100, 500), chunks=(10, 20), dtype="int16")
dset = f.create_dataset('/data', shape=x.shape,
                        chunks=(10, 20),
                        dtype=x.dtype)

x.store(dset)
Here is the error message:

distributed.protocol.core - CRITICAL - Failed to Serialize
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 76, in dumps
    frames[0] = msgpack.dumps(msg, default=_encode_default, use_bin_type=True)
  File "/opt/conda/lib/python3.8/site-packages/msgpack/__init__.py", line 35, in packb
    return Packer(**kwargs).pack(o)
  File "msgpack/_packer.pyx", line 292, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 298, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 295, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 285, in msgpack._cmsgpack.Packer._pack
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 57, in _encode_default
    sub_header, sub_frames = serialize_and_split(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 425, in serialize_and_split
    header, frames = serialize(x, serializers, on_error, context)
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 252, in serialize
    return serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 298, in serialize
    headers_frames = [
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 299, in <listcomp>
    serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 349, in serialize
    raise TypeError(msg, str(x)[:10000])
TypeError: ('Could not serialize object of type Dataset.', '<HDF5 dataset "data": shape (100, 500), type "<i2">')
distributed.comm.utils - ERROR - ('Could not serialize object of type Dataset.', '<HDF5 dataset "data": shape (100, 500), type "<i2">')
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/distributed/comm/utils.py", line 33, in _to_frames
    return list(protocol.dumps(msg, **kwargs))
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 76, in dumps
    frames[0] = msgpack.dumps(msg, default=_encode_default, use_bin_type=True)
  File "/opt/conda/lib/python3.8/site-packages/msgpack/__init__.py", line 35, in packb
    return Packer(**kwargs).pack(o)
  File "msgpack/_packer.pyx", line 292, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 298, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 295, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 285, in msgpack._cmsgpack.Packer._pack
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 57, in _encode_default
    sub_header, sub_frames = serialize_and_split(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 425, in serialize_and_split
    header, frames = serialize(x, serializers, on_error, context)
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 252, in serialize
    return serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 298, in serialize
    headers_frames = [
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 299, in <listcomp>
    serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 349, in serialize
    raise TypeError(msg, str(x)[:10000])
TypeError: ('Could not serialize object of type Dataset.', '<HDF5 dataset "data": shape (100, 500), type "<i2">')
distributed.batched - ERROR - Error in batched write
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/distributed/batched.py", line 93, in _background_send
    nbytes = yield self.comm.write(
  File "/opt/conda/lib/python3.8/site-packages/tornado/gen.py", line 762, in run
    value = future.result()
  File "/opt/conda/lib/python3.8/site-packages/distributed/comm/tcp.py", line 243, in write
    frames = await to_frames(
  File "/opt/conda/lib/python3.8/site-packages/distributed/comm/utils.py", line 50, in to_frames
    return _to_frames()
  File "/opt/conda/lib/python3.8/site-packages/distributed/comm/utils.py", line 33, in _to_frames
    return list(protocol.dumps(msg, **kwargs))
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 76, in dumps
    frames[0] = msgpack.dumps(msg, default=_encode_default, use_bin_type=True)
  File "/opt/conda/lib/python3.8/site-packages/msgpack/__init__.py", line 35, in packb
    return Packer(**kwargs).pack(o)
  File "msgpack/_packer.pyx", line 292, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 298, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 295, in msgpack._cmsgpack.Packer.pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 264, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 231, in msgpack._cmsgpack.Packer._pack
  File "msgpack/_packer.pyx", line 285, in msgpack._cmsgpack.Packer._pack
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/core.py", line 57, in _encode_default
    sub_header, sub_frames = serialize_and_split(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 425, in serialize_and_split
    header, frames = serialize(x, serializers, on_error, context)
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 252, in serialize
    return serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 298, in serialize
    headers_frames = [
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 299, in <listcomp>
    serialize(
  File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 349, in serialize
    raise TypeError(msg, str(x)[:10000])
TypeError: ('Could not serialize object of type Dataset.', '<HDF5 dataset "data": shape (100, 500), type "<i2">')
---------------------------------------------------------------------------
CancelledError                            Traceback (most recent call last)
/tmp/ipykernel_58/2321990944.py in <module>
      6                                  dtype=x.dtype) 
      7 
----> 8 x.store(dset)

/opt/conda/lib/python3.8/site-packages/dask/array/core.py in store(self, target, **kwargs)
   1552     @wraps(store)
   1553     def store(self, target, **kwargs):
-> 1554         r = store([self], [target], **kwargs)
   1555 
   1556         if kwargs.get("return_stored", False):

/opt/conda/lib/python3.8/site-packages/dask/array/core.py in store(sources, targets, lock, regions, compute, return_stored, **kwargs)
   1041     else:
   1042         if compute:
-> 1043             compute_as_if_collection(Array, store_dsk, store_keys, **kwargs)
   1044             return None
   1045         else:

/opt/conda/lib/python3.8/site-packages/dask/base.py in compute_as_if_collection(cls, dsk, keys, scheduler, get, **kwargs)
    313     schedule = get_scheduler(scheduler=scheduler, cls=cls, get=get)
    314     dsk2 = optimization_function(cls)(dsk, keys, **kwargs)
--> 315     return schedule(dsk2, keys, **kwargs)
    316 
    317 

/opt/conda/lib/python3.8/site-packages/distributed/client.py in get(self, dsk, keys, workers, allow_other_workers, resources, sync, asynchronous, direct, retries, priority, fifo_timeout, actors, **kwargs)
   2687                     should_rejoin = False
   2688             try:
-> 2689                 results = self.gather(packed, asynchronous=asynchronous, direct=direct)
   2690             finally:
   2691                 for f in futures.values():

/opt/conda/lib/python3.8/site-packages/distributed/client.py in gather(self, futures, errors, direct, asynchronous)
   1964             else:
   1965                 local_worker = None
-> 1966             return self.sync(
   1967                 self._gather,
   1968                 futures,

/opt/conda/lib/python3.8/site-packages/distributed/client.py in sync(self, func, asynchronous, callback_timeout, *args, **kwargs)
    858             return future
    859         else:
--> 860             return sync(
    861                 self.loop, func, *args, callback_timeout=callback_timeout, **kwargs
    862             )

/opt/conda/lib/python3.8/site-packages/distributed/utils.py in sync(loop, func, callback_timeout, *args, **kwargs)
    324     if error[0]:
    325         typ, exc, tb = error[0]
--> 326         raise exc.with_traceback(tb)
    327     else:
    328         return result[0]

/opt/conda/lib/python3.8/site-packages/distributed/utils.py in f()
    307             if callback_timeout is not None:
    308                 future = asyncio.wait_for(future, callback_timeout)
--> 309             result[0] = yield future
    310         except Exception:
    311             error[0] = sys.exc_info()

/opt/conda/lib/python3.8/site-packages/tornado/gen.py in run(self)
    760 
    761                     try:
--> 762                         value = future.result()
    763                     except Exception:
    764                         exc_info = sys.exc_info()

/opt/conda/lib/python3.8/site-packages/distributed/client.py in _gather(self, futures, errors, direct, local_worker)
   1830                         else:
   1831                             raise exception.with_traceback(traceback)
-> 1832                         raise exc
   1833                     if errors == "skip":
   1834                         bad_keys.add(key)

CancelledError: ('store-9cbed3de-3722-11ec-8752-0242ac110002', 8, 19)

I’m running the code in the daskdev/dask-notebook Docker container. The environment:

Python                    3.8.12

dask                      2021.9.1
dask-core                 2021.9.1
h5py                      3.4.0

Is there a workaround for this problem?
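
One possible workaround, a minimal sketch under the assumption that the computed result fits in client memory (not something confirmed by the maintainers in this thread), is to compute on the cluster but perform the HDF5 write entirely on the client, so no h5py object is ever serialized:

import dask.array as da
import h5py

darr = da.random.randint(0, 10, size=(100, 500), chunks=(10, 20), dtype="int16")

# compute() runs on the cluster and returns a plain NumPy array to the
# client; only array data crosses the wire, never an h5py handle.
result = darr.compute()

with h5py.File("tmp.h5", "w") as f:
    f.create_dataset("/x", data=result)

This sidesteps the serialization problem at the cost of out-of-core writing, so it only helps when the result is small enough to hold in memory.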

0 reactions
burdickjp commented, Dec 14, 2017

I think I’m running into the same problem as well.


Top Results From Across the Web

Dask distributed LocalCluster fails with "TypeError: can't pickle ...
Dask distributed LocalCluster fails with "TypeError: can't pickle _thread._local objects" when using dask.array.store to hdf5 file.

H5py objects cannot be pickled or slow processing - Dask Array
I have an hdf file which i cannot fit into memory which means it should be read in chunks, so I do this...

Reading and writing files - Xarray
Pickling is important because it doesn't require any external libraries and lets you use xarray objects with Python modules like multiprocessing ...

Reading and Writing Dask DataFrames and Arrays to HDF5
This blog post explains how to write Dask DataFrames to HDF5 files with to_hdf and how to write Dask Arrays to HDF5 files...

Working notes by Matthew Rocklin - SciPy
Typically we use libraries like pickle to serialize Python objects. For dask.frame we really care about doing this quickly so we're going to...
