
Issue reading zarr files with Dask distributed


I’m not at all sure that this is kerchunk’s problem to be honest, but would welcome a pointer.

I can kerchunk UK Met Office netCDF files and read them into xarray as zarr, but if I try to read them with Dask workers, the workers all keep reading until they run out of memory and crash.

import json

import dask
from dask.distributed import Client
client = Client()

import fsspec
import matplotlib.pyplot as plt
import xarray

def open_fsspec_zarr(json_path):
    # Load the kerchunk reference JSON and expose it as a zarr store.
    with open(json_path) as f:
        mapper = fsspec.get_mapper(
                "reference://",
                fo=json.load(f),
                remote_protocol="s3",
                remote_options={"anon": False})
        return xarray.open_dataset(mapper, engine='zarr', consolidated=False)
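
As an aside that matters later: the on-disk chunking is recorded in the reference file itself, so it can be inspected without opening the dataset. A minimal sketch, assuming a version-1 kerchunk JSON (top-level "refs" mapping) and taking air_pressure as the variable of interest:

with open('/data/metoffice/000490262cdd067721a34112963bcaa2b44860ab.json') as f:
    refs = json.load(f)

# The zarr array metadata is stored inline as a JSON string.
zarray = json.loads(refs["refs"]["air_pressure/.zarray"])
print("shape: ", zarray["shape"])
print("chunks:", zarray["chunks"])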

If we don’t chunk the dataset, it works:

dataset = open_fsspec_zarr('/data/metoffice/000490262cdd067721a34112963bcaa2b44860ab.json')

%%time
slice = dataset.isel(height=5, realization=1)
slice

CPU times: user 659 µs, sys: 0 ns, total: 659 µs
Wall time: 636 µs

<xarray.Dataset>
Dimensions:                  (latitude: 960, longitude: 1280, bnds: 2)
Coordinates:
    forecast_period          timedelta64[ns] 1 days 18:00:00
    forecast_reference_time  datetime64[ns] 2021-11-07T06:00:00
    height                   float32 75.0
  * latitude                 (latitude) float32 -89.91 -89.72 ... 89.72 89.91
  * longitude                (longitude) float32 -179.9 -179.6 ... 179.6 179.9
    realization              float64 18.0
    time                     datetime64[ns] 2021-11-09
Dimensions without coordinates: bnds
Data variables:
    air_pressure             (latitude, longitude) float32 ...
    latitude_bnds            (latitude, bnds) float32 -90.0 -89.81 ... 90.0
    latitude_longitude       float64 nan
    longitude_bnds           (longitude, bnds) float32 -180.0 -179.7 ... 180.0
Attributes:
    Conventions:                  CF-1.7
    history:                      2021-11-07T10:27:38Z: StaGE Decoupler
    institution:                  Met Office
    least_significant_digit:      1
    mosg__forecast_run_duration:  PT198H
    mosg__grid_domain:            global
    mosg__grid_type:              standard
    mosg__grid_version:           1.6.0
    mosg__model_configuration:    gl_ens
    source:                       Met Office Unified Model
    title:                        MOGREPS-G Model Forecast on Global 20 km St...
    um_version:                   11.5

%time
fetched_slice = slice.to_array()[0,...,0]
plt.figure(figsize=(6, 6))
plt.imshow(fetched_slice, origin='lower')    

CPU times: user 3 µs, sys: 0 ns, total: 3 µs
Wall time: 5.72 µs

<matplotlib.image.AxesImage at 0x7f4608324550>

If we chunk it, the work runs on the Dask workers, but they run out of memory and crash the whole box:

chunked_dataset = dataset.chunk('auto')

%%time
slice = chunked_dataset.isel(height=5, realization=1)
slice

%time
fetched_slice = slice.to_array()[0,...,0]
plt.figure(figsize=(6, 6))
plt.imshow(fetched_slice, origin='lower')    

My assumption is that the reference filesystem is not being properly communicated to the workers through the task graph?
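
One variant that would help rule the serialization question in or out: pass the path of the reference JSON as fo, so each worker can rebuild the mapper itself rather than receiving a loaded dict through the graph, and ask xarray for dask arrays aligned to the store’s own chunking with chunks={} instead of rechunking afterwards. A sketch, not a confirmed fix (open_fsspec_zarr_lazy is a made-up name):

def open_fsspec_zarr_lazy(json_path):
    # Passing the path string lets fsspec load the references on each
    # worker instead of embedding a large dict in the task graph.
    mapper = fsspec.get_mapper(
            "reference://",
            fo=json_path,
            remote_protocol="s3",
            remote_options={"anon": False})
    # chunks={} builds dask arrays whose chunks match the underlying
    # zarr chunks, so no task reads more than one stored chunk.
    return xarray.open_dataset(
            mapper, engine='zarr', consolidated=False, chunks={})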

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 25 (18 by maintainers)

Top GitHub Comments

1 reaction
martindurant commented, Apr 19, 2022

Not arbitrary chunks: for the original [ 18, 33, 960, 1280 ], you could have the chunks below (see the sketch after the list):

  • [ 9, 33, 960, 1280 ]
  • [ 6, 33, 960, 1280 ]
  • [ 3, 33, 960, 1280 ]
  • [ 1, 33, 960, 1280 ]
  • [ 1, 11, 960, 1280 ]
  • [ 1, 3, 960, 1280 ]
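
Why these shapes and not arbitrary ones: assuming the variable is stored in C order as one contiguous block, a sub-chunk maps to a single byte range only if it splits one axis into equal divisors, keeps every later axis whole, and has every earlier axis already reduced to size 1. A hypothetical helper (not from the issue) that enumerates such shapes:

def contiguous_chunk_shapes(shape):
    # For a C-order array stored as one contiguous block, a chunk is a
    # single byte range only when every axis before the split axis has
    # chunk size 1 and every axis after it is taken whole.
    shapes = []
    for axis, size in enumerate(shape):
        for d in range(1, size):
            if size % d == 0:
                shapes.append([1] * axis + [d] + list(shape[axis + 1:]))
    return shapes

# Reproduces the shapes listed above (plus a few more valid ones):
for s in contiguous_chunk_shapes((18, 33, 960, 1280)):
    if s[2:] == [960, 1280]:  # show only splits along the first two axes
        print(s)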
0 reactions
martindurant commented, Apr 21, 2022

Well done! What is especially interesting here is that you can do this “rechunking” after originally creating the references, without any need to rescan the original file or add additional arguments.

Agreed that we wouldn’t normally want people to attempt this kind of thing by hand; code ought to be better at being systematically correct 😃
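
For anyone curious what that “rechunking” looks like mechanically: in a kerchunk reference file each chunk is a [url, offset, size] triple, so a single large uncompressed chunk can be split into equal byte sub-ranges, with the inline zarr metadata updated to match. A rough sketch under those assumptions (version-1 reference layout, a 4-D variable stored uncompressed as one chunk, split along the first axis; split_first_axis is a made-up helper and the actual code discussed in the thread may differ):

import json

def split_first_axis(refs, var, n):
    # Hypothetical helper: split the single chunk of `var` into n equal
    # byte ranges along the first axis. Only valid for uncompressed
    # data, since a compressed block cannot be sliced by byte range.
    zarray = json.loads(refs["refs"][f"{var}/.zarray"])
    assert zarray["compressor"] is None
    assert zarray["chunks"][0] % n == 0
    url, offset, size = refs["refs"][f"{var}/0.0.0.0"]
    zarray["chunks"][0] //= n
    refs["refs"][f"{var}/.zarray"] = json.dumps(zarray)
    for i in range(n):
        refs["refs"][f"{var}/{i}.0.0.0"] = [
            url, offset + i * (size // n), size // n]
    return refs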
