What are the correct metadata attributes when adding a zarr array?
I have an image layer with pre-chunked data loaded in x,y,z order at 4,4,40 nm voxel resolution. I want to add a zarr volume that is stored in z,y,x orientation at 40,8,8 nm voxel resolution. I get the following source info after adding:
[screenshot of the layer's source info]
where the zarr array has the following .zattrs:
{
  "_ARRAY_DIMENSIONS": ["z", "y", "x"],
  "offset": [0, 0, 0],
  "resolution": [40, 8, 8],
  "scale": 255
}
and the following .zarray info:
{
  "chunks": [15, 304, 304],
  "compressor": {
    "id": "gzip",
    "level": 5
  },
  "dtype": "|u1",
  "fill_value": 0,
  "filters": null,
  "order": "C",
  "shape": [7063, 67072, 124416],
  "zarr_format": 2
}
What is the correct way to do this? Perhaps I can add the resolution and bounds information directly to the metadata files.
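As a starting point, here is a minimal sketch (Python, using the zarr-python package) of writing such attributes directly into the array's .zattrs. The path is illustrative, and the attribute names simply mirror the .zattrs shown above; whether a given viewer actually interprets "resolution" and "offset" is exactly the open question here.

import zarr

# Hedged sketch: store voxel resolution and offset in the array's .zattrs.
# The path is illustrative; the keys mirror the .zattrs shown above, and
# whether a particular viewer reads them is a separate question.
arr = zarr.open("path/to/volume.zarr", mode="r+")
arr.attrs["_ARRAY_DIMENSIONS"] = ["z", "y", "x"]  # stored axis order
arr.attrs["resolution"] = [40, 8, 8]              # nm per voxel, z/y/x
arr.attrs["offset"] = [0, 0, 0]                   # origin offset (voxels)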
Top GitHub Comments
Yes, we are making a constant effort to un-bake the magic dimension names from the spec 😃
Thanks @d-v-b for pointing out the discussion around ome/ngff. There seems to be a lot going on, and it is very nice that a lot of the tools already support, or intend to support, ngff. I think it would make sense to contribute some of the discussion/proposals here toward a proposal for axes labeling in the ngff specification, and then add support for OME/ngff to Neuroglancer, instead of another special-purpose zarr metadata spec that is not supported by a wider community.
A few points for discussion may be:
- the "unit" proposal from @jbms, or the @d-v-b proposal with separate "scale" and "units" fields stored as arrays, plus the null "translate" field you have above. There is also the problem of arrays with non-spatial dimensions: how to select the relevant axes to apply the affine to. I think there was some discussion about it in the ngff issue, but I haven't seen a conclusive solution yet.
- how a "translate" field (in world units) would impact indexing operations, e.g. in tensorstore. I haven't checked in detail, but is the idea to support indexing operations in world space/units, vs. the index space of the array data itself?
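To make the ngff suggestion above concrete, here is a hedged sketch of writing OME-NGFF-style multiscales metadata (axes plus scale/translation coordinate transformations) to a zarr group with zarr-python. It assumes the later NGFF 0.4 layout; the store path, the dataset path "0", and the values are illustrative, not something this issue settled on.

import zarr

# Hedged sketch: OME-NGFF 0.4-style multiscales metadata for the volume above.
# The store path and the dataset path "0" are illustrative assumptions.
group = zarr.open_group("path/to/volume.zarr", mode="a")
group.attrs["multiscales"] = [
    {
        "version": "0.4",
        "axes": [
            {"name": "z", "type": "space", "unit": "nanometer"},
            {"name": "y", "type": "space", "unit": "nanometer"},
            {"name": "x", "type": "space", "unit": "nanometer"},
        ],
        "datasets": [
            {
                "path": "0",  # the array holding the full-resolution data
                "coordinateTransformations": [
                    {"type": "scale", "scale": [40.0, 8.0, 8.0]},              # nm per voxel
                    {"type": "translation", "translation": [0.0, 0.0, 0.0]},   # world-space offset
                ],
            }
        ],
    }
]

With metadata in this form, the axis names, units, scale, and translation live in one community-supported convention rather than in ad hoc "resolution"/"offset" attributes.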