Trouble setting value for vlen type on compound data
- Operating System: Arch Linux
- Where Python was acquired: Arch Linux
$ python -c 'import h5py; print(h5py.version.info)'
Summary of the h5py configuration
---------------------------------
h5py 3.2.1
HDF5 1.12.0
Python 3.9.5 (default, May 24 2021, 12:50:35)
[GCC 11.1.0]
sys.platform linux
sys.maxsize 9223372036854775807
numpy 1.20.3
cython (built with) 0.29.22
numpy (built against) 1.20.1
HDF5 (built against) 1.12.0
Reproducible example:
import h5py
import numpy as np

f = h5py.File('test.h5', 'w')
table = f.create_dataset(
    'packets',
    shape=(1,),
    chunks=True,
    maxshape=(None,),
    dtype=[
        ('timestamp', np.float64),
        ('data', h5py.vlen_dtype(np.uint8)),
    ],
)
table[0, 'timestamp'] = 1.5
table[0, 'data'] = np.frombuffer(b'test', dtype=np.uint8)
f.close()
$ python example.py
Traceback (most recent call last):
File "/home/anubis/git/usbviewer/example.py", line 19, in <module>
table[0, 'data'] = np.frombuffer(b'test', dtype=np.uint8)
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "/usr/lib/python3.9/site-packages/h5py/_hl/dataset.py", line 848, in __setitem__
val = val.view(numpy.dtype([(names[0], dtype)]))
File "/usr/lib/python3.9/site-packages/numpy/core/_internal.py", line 459, in _view_is_safe
raise TypeError("Cannot change data-type for object array.")
TypeError: Cannot change data-type for object array.
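The TypeError at the bottom of the traceback comes from NumPy itself, not from HDF5: vlen data is held in an object-dtype array in memory, and NumPy refuses to reinterpret an object array via .view(), which is exactly what Dataset.__setitem__ attempts when assigning to a single compound field. A minimal pure-NumPy sketch of that underlying restriction (the field name 'data' here just mirrors the example above):

```python
import numpy as np

# vlen data is represented as an object-dtype array in memory
a = np.empty((1,), dtype=object)
a[0] = np.frombuffer(b'test', dtype=np.uint8)

# Re-viewing an object array with a different dtype is what the
# field-assignment path in Dataset.__setitem__ does, and NumPy
# forbids it for object arrays:
try:
    a.view(np.dtype([('data', object)]))
except TypeError as e:
    print(e)  # same error as in the traceback (exact wording varies by NumPy version)
```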
I do not understand what I am doing wrong; the array is the correct type. Am I using the API incorrectly and not setting the value? Admittedly, I am a bit confused by this API and just trying to figure out how to actually write the data 🙃. (Also, is there any way to write the whole row as a tuple, e.g. table[0] = (1.5, array)?)
Issue Analytics
- State:
- Created 2 years ago
- Reactions: 1
- Comments:8 (7 by maintainers)
Well, glancing at Dataset.__setitem__, I can see there are a few special cases around vlen data and accessing fields of compound types. I'm guessing they don't interact correctly:

https://github.com/h5py/h5py/blob/a6c659037aea2d2cadaca893116c8ef0c6449fe7/h5py/_hl/dataset.py#L783-L803
https://github.com/h5py/h5py/blob/a6c659037aea2d2cadaca893116c8ef0c6449fe7/h5py/_hl/dataset.py#L808-L813
https://github.com/h5py/h5py/blob/a6c659037aea2d2cadaca893116c8ef0c6449fe7/h5py/_hl/dataset.py#L851-L877
I’ve bumped into a similar issue when trying to insert a compound object made up of vlen strings. I posted it on stackoverflow.com and received an answer that points in a different direction (which worked for me):
In other words, perhaps try changing this:
into this: