Custom HDF5 plugin can't find libhdf5.so when using h5py
In preparation for a future satellite mission, the data provider has implemented a compression method based on CharLS as an HDF5 plugin, FCIDECOMP, the last filter on this page: https://support.hdfgroup.org/services/filters.html
Building and using this filter works as expected with h5dump:
```
❯ h5dump -d effective_radiance sample.nc
HDF5 "sample.nc" {
DATASET "effective_radiance" {
   DATATYPE  H5T_STD_I16LE
   DATASPACE  SIMPLE { ( 60, 30 ) / ( 60, 30 ) }
   DATA {
   (0,0): 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18,
   (0,19): 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29,
   ...
```
(here is the test file for completeness: sample.zip, 1.3kB)
However, trying to achieve the same thing from h5py leads to an error:
```python
>>> import h5py
>>> h5f = h5py.File('sample.nc', mode='r')
>>> h5f['effective_radiance'][:]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/home/a001673/miniconda3/lib/python3.7/site-packages/h5py/_hl/dataset.py", line 651, in __getitem__
    return self._fast_reader.read(args)
  File "h5py/_reader.pyx", line 243, in h5py._reader.Reader.read
OSError: Can't read data (can't dlopen:/home/a001673/miniconda3/envs/fcidecomp3/hdf5/lib/plugin/libH5Zjpegls.so: undefined symbol: H5Pmodify_filter)
```
The `HDF5_PLUGIN_PATH` and `LD_LIBRARY_PATH` are set up correctly:
```
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CONDA_PREFIX/lib
export HDF5_PLUGIN_PATH=:$CONDA_PREFIX/hdf5/lib/plugin
```
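As a quick sanity check, it can help to confirm from inside the interpreter that these variables actually reached the Python process and that the plugin directory contains the expected shared object. A minimal sketch (the helper name is mine, not part of h5py):

```python
import os


def list_plugins(plugin_dir):
    """Return the sorted names of shared objects in an HDF5 plugin directory."""
    if not os.path.isdir(plugin_dir):
        return []
    return sorted(f for f in os.listdir(plugin_dir) if f.endswith(".so"))


if __name__ == "__main__":
    # HDF5_PLUGIN_PATH may contain several colon-separated entries;
    # empty entries (such as a leading ":") are skipped here.
    for entry in os.environ.get("HDF5_PLUGIN_PATH", "").split(":"):
        if entry:
            print(entry, list_plugins(entry))
```

If the variable is missing here, it was set after the interpreter (or the Jupyter kernel) started, which HDF5 will not pick up.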
again, as h5dump has no issue reading the file. (I built the FCIDECOMP filter myself as a conda package.)
Running `nm -D` on `libH5Zjpegls.so` does of course list `H5Pmodify_filter` as undefined, but the symbol is present in `libhdf5.so`, which itself is located on the defined `LD_LIBRARY_PATH`.
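An "undefined symbol" at dlopen time often means the plugin's `.so` does not declare libhdf5 as a dependency, so it only loads when the HDF5 symbols are already globally visible in the process (as they are for h5dump, but not necessarily when Python loads libhdf5 privately). A hedged way to test that hypothesis from Python with `ctypes`, preloading libhdf5 with `RTLD_GLOBAL`; the paths below are placeholders, and the helper is mine:

```python
import ctypes


def try_load(plugin_path, preload=None):
    """Attempt to dlopen an HDF5 filter plugin. Optionally preload another
    library (e.g. libhdf5) with RTLD_GLOBAL first, so that its symbols
    become visible to libraries loaded afterwards.
    Returns None on success, or the dlopen error message on failure."""
    try:
        if preload is not None:
            # RTLD_GLOBAL exposes the preloaded library's symbols
            # (e.g. H5Pmodify_filter) to subsequently loaded objects.
            ctypes.CDLL(preload, mode=ctypes.RTLD_GLOBAL)
        ctypes.CDLL(plugin_path)
        return None
    except OSError as exc:
        return str(exc)


if __name__ == "__main__":
    # Placeholder path -- substitute the real location of the plugin.
    plugin = "/path/to/plugin/libH5Zjpegls.so"
    print(try_load(plugin))                        # likely the undefined-symbol error
    print(try_load(plugin, preload="libhdf5.so"))  # may succeed if linkage is the issue
```

If the second call succeeds where the first fails, the usual fix is to relink the plugin against libhdf5 (so that `ldd libH5Zjpegls.so` lists it as NEEDED) rather than relying on global symbols.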
So, I don’t know how to go further in the debugging here; I would need some assistance 😃

- Have custom filters been tested and reported to work with h5py?
- Is there some `LD_LIBRARY_PATH` magic happening in the h5py code, and if so, where can I find it?
- Any other ideas of things I could try?
To assist reproducing bugs, please include the following:
- Operating System (e.g. Windows 10, MacOS 10.11, Ubuntu 16.04.2 LTS, CentOS 7) RHEL8
- Python version (e.g. 2.7, 3.5) 3.7.4
- Where Python was acquired (e.g. system Python on MacOS or Linux, Anaconda on Windows) Miniconda
- h5py version (e.g. 2.6) Latest from conda-forge (2.10.0), then github’s master branch, fetched one hour ago
- HDF5 version (e.g. 1.8.17) 1.10.6_nompi, from conda-forge
```
Summary of the h5py configuration
---------------------------------
h5py          2.10.0
HDF5          1.10.6
Python        3.7.4 (default, Aug 13 2019, 20:35:49)
              [GCC 7.3.0]
sys.platform  linux
sys.maxsize   9223372036854775807
numpy         1.18.1
```
I can only try point 3…
Are you sure the filter was compiled against the same HDF5 library that h5py was built against?
If so: I once had a similar problem compiling an extension module that was traced back to having used different compilers. I had to `conda install gcc` prior to building the module to make sure there were no undefined symbols at run time. Make sure you build your filter in your conda environment with the conda gcc installed, to eliminate that possibility.
@mraspaud
A plugin compiled the way you did should work if the whole chain (HDF5 library, h5py, CharLS, FCIDECOMP plugin) is built using the same compiler. The only difficulty I had was that the first version of the plugin sources (the one pointed at by the HDF Group) did not have the proper filter ID; version 1.0.2 has the proper filter ID.
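One way to check the filter-ID point from Python is to ask HDF5 whether it considers the filter available, via h5py's low-level `h5z` API. A small sketch, assuming the FCIDECOMP filter uses ID 32018 (the ID listed for it on the HDF Group's registered-filters page; verify against your plugin sources):

```python
def filter_available(filter_id):
    """Return True/False if h5py can report on the filter,
    or None if h5py is not importable in this environment."""
    try:
        import h5py
    except ImportError:
        return None
    # Asks the HDF5 library whether a filter with this ID is registered;
    # dynamically loaded plugins may only register on first actual use.
    return bool(h5py.h5z.filter_avail(filter_id))


FCIDECOMP_ID = 32018  # assumed registered ID for the FCIDECOMP/CharLS filter

if __name__ == "__main__":
    print(filter_available(FCIDECOMP_ID))
```

If this returns False while the plugin file is clearly on `HDF5_PLUGIN_PATH`, a mismatched filter ID inside the plugin sources is a plausible culprit.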