
Custom HDF5 plugin can't find libhdf5.so when using h5py

See original GitHub issue

In preparation for a future satellite mission, the data provider has implemented a compression method based on CharLS as an HDF5 plugin, FCIDECOMP, the last filter listed on this page: https://support.hdfgroup.org/services/filters.html

Building and using this filter works as expected with h5dump:

❯ h5dump -d effective_radiance sample.nc
HDF5 "sample.nc" {
DATASET "effective_radiance" {
   DATATYPE  H5T_STD_I16LE
   DATASPACE  SIMPLE { ( 60, 30 ) / ( 60, 30 ) }
   DATA {
   (0,0): 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18,
   (0,19): 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29,
...

(here is the test file for completeness: sample.zip, 1.3kB)

However, trying to achieve the same thing from h5py leads to an error:

>>> import h5py
>>> h5f = h5py.File('sample.nc', mode='r')
>>> h5f['effective_radiance'][:]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/home/a001673/miniconda3/lib/python3.7/site-packages/h5py/_hl/dataset.py", line 651, in __getitem__
    return self._fast_reader.read(args)
  File "h5py/_reader.pyx", line 243, in h5py._reader.Reader.read
OSError: Can't read data (can't dlopen:/home/a001673/miniconda3/envs/fcidecomp3/hdf5/lib/plugin/libH5Zjpegls.so: undefined symbol: H5Pmodify_filter)

The HDF5_PLUGIN_PATH and LD_LIBRARY_PATH are set up correctly:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CONDA_PREFIX/lib
export HDF5_PLUGIN_PATH=:$CONDA_PREFIX/hdf5/lib/plugin

and again, these must be correct, as h5dump has no issue reading the file. (I built the FCIDECOMP filter myself, as a conda package.)
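
As a quick sanity check (a sketch, not from the original report; the path assumptions simply mirror the exports above), one can verify from inside Python that the plugin directory is visible to the process and actually contains the plugin:

import os

# Sanity check (assumes the exports above are active in this shell session):
# is HDF5_PLUGIN_PATH visible to the Python process, and does each entry
# contain the plugin shared object?
plugin_path = os.environ.get("HDF5_PLUGIN_PATH", "")
for entry in filter(None, plugin_path.split(":")):
    candidate = os.path.join(entry, "libH5Zjpegls.so")
    print(candidate, "->", "found" if os.path.exists(candidate) else "missing")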

Running nm -D on libH5Zjpegls.so of course lists H5Pmodify_filter as undefined, but the symbol is present in libhdf5.so, which itself is located on the LD_LIBRARY_PATH defined above.
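
One way to narrow this down (a sketch, not from the original report; the library paths are assumptions based on the conda layout above) is to reproduce the dlopen by hand with ctypes: load libhdf5.so with globally visible symbols first, then try the plugin. If the plugin loads once libhdf5’s symbols are global, the undefined-symbol error points at symbol visibility and linking rather than at a missing library:

import ctypes
import os

conda_prefix = os.environ["CONDA_PREFIX"]

# Load libhdf5 with RTLD_GLOBAL so symbols such as H5Pmodify_filter become
# visible to libraries dlopen'ed afterwards (library path is an assumption).
libhdf5 = ctypes.CDLL(
    os.path.join(conda_prefix, "lib", "libhdf5.so"),
    mode=ctypes.RTLD_GLOBAL,
)

# Now try to load the filter plugin by hand; a failure here mirrors the error
# h5py reports, while a success suggests a symbol-visibility/linking issue.
plugin = ctypes.CDLL(
    os.path.join(conda_prefix, "hdf5", "lib", "plugin", "libH5Zjpegls.so")
)
print("plugin loaded:", plugin._name)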

So I don’t know how to go further with the debugging here; I would need some assistance 😃

  1. Have custom filters been tested and reported to work with h5py?
  2. Is there some LD_LIBRARY_PATH magic happening in the h5py code, and if so, where can I find it?
  3. Any other ideas of things I could try?

To assist reproducing bugs, please include the following:

  • Operating System (e.g. Windows 10, MacOS 10.11, Ubuntu 16.04.2 LTS, CentOS 7): RHEL 8
  • Python version (e.g. 2.7, 3.5): 3.7.4
  • Where Python was acquired (e.g. system Python on MacOS or Linux, Anaconda on Windows): Miniconda
  • h5py version (e.g. 2.6): latest from conda-forge (2.10.0), then GitHub’s master branch, fetched one hour ago
  • HDF5 version (e.g. 1.8.17): 1.10.6_nompi, from conda-forge
Summary of the h5py configuration
---------------------------------

h5py    2.10.0
HDF5    1.10.6
Python  3.7.4 (default, Aug 13 2019, 20:35:49) 
[GCC 7.3.0]
sys.platform    linux
sys.maxsize     9223372036854775807
numpy   1.18.1

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 4
  • Comments: 21 (21 by maintainers)

Top GitHub Comments

1 reaction
vasole commented, Apr 21, 2020

I can only try point 3…

Are you sure the filter was compiled using the same HDF5 library that h5py was built against?

If so: I once had a similar problem compiling an extension module, which was traced back to having used different compilers. I had to “conda install gcc” before building the module to make sure there were no undefined symbols at run time. Make sure you build your filter in your conda environment, with the conda gcc installed, to eliminate that possibility.
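
For what it’s worth, a quick way to compare the HDF5 version h5py was built against with the one loaded at runtime (a small sketch using h5py’s version attributes, not something from the original thread; the filter plugin should be built against a compatible HDF5 as well):

import h5py

# HDF5 version h5py was compiled against vs. the library loaded at runtime.
print("h5py built against HDF5:", h5py.version.hdf5_version)
print("HDF5 loaded at runtime :", ".".join(str(v) for v in h5py.h5.get_libversion()))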

0 reactions
vasole commented, May 6, 2020

@mraspaud

A plugin compiled the way you did should work if the whole chain (HDF5 library, h5py, CharLS, FCIDECOMP plugin) is built using the same compiler. The only difficulty I had was that the first version of the plugin sources (the one pointed at by the HDF Group) did not have the proper filter ID; version 1.0.2 does.
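
To check the filter-ID side of this, the filters recorded on the dataset can be listed through h5py’s low-level API (a sketch; the file and dataset names are taken from the report above) and compared against the ID the installed plugin registers:

import h5py

# List the filters stored in the dataset creation property list, so the filter
# ID recorded in the file can be compared with the ID the plugin registers.
# Reading this metadata does not require the decompression filter itself.
with h5py.File("sample.nc", mode="r") as h5f:
    plist = h5f["effective_radiance"].id.get_create_plist()
    for i in range(plist.get_nfilters()):
        filter_code, flags, values, name = plist.get_filter(i)
        print("filter id:", filter_code, "name:", name.decode(errors="replace"))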

