
RuntimeError: Unable to get link info (addr overflow, addr = 107864, size = 328, eoa = 2048) w/ v0.5

See original GitHub issue

Hi @kishwarshafin,

I am trying to run variant calling with v0.5. Data is ONT Guppy 5.0.7 Sup. Alignments are generated with minimap2.

run_pepper_margin_deepvariant call_variant -b aln.bam -f asm.fasta -o v1 -t 16 --ont -p s1 -g

I get the following error:


[08-31-2021 08:30:44] INFO: VARIANT CALLING MODULE SELECTED
[08-31-2021 08:30:44] INFO: [1/6] RUNNING THE FOLLOWING COMMAND
-------
mkdir -p v1;
mkdir -p v1/logs;
mkdir -p v1/intermediate_files;
-------
[08-31-2021 08:30:44] INFO: [2/6] RUNNING THE FOLLOWING COMMAND
-------
time pepper_variant call_variant -b aln.bam -f asm.fasta -t 16 -m /opt/pepper_models/PEPPER_VARIANT_R941_ONT_V5.pkl -o v1/pepper/ -s Sample -w 16 -g -bs 2048  2>&1 | tee v1/logs/1_pepper.log
-------
[08-31-2021 08:30:45] INFO: CALL VARIANT MODULE SELECTED
[08-31-2021 08:30:45] INFO: RUN-ID: 08312021_083045
[08-31-2021 08:30:45] INFO: IMAGE OUTPUT: v1/pepper/images_08312021_083045/
[08-31-2021 08:30:45] INFO: STEP 1/3 GENERATING IMAGES:
[08-31-2021 08:30:47] INFO: COMMON CONTIGS FOUND: ['contig_2', 'contig_5', 'contig_7', 'contig_8', 'contig_9', 'contig_11', 'contig_14', 'contig_16', 'contig_17', 'contig_19', 'contig_20', 'contig_22', 'contig_25', 'contig_28', 'contig_29', 'contig_30', 'contig_31', 'contig_34', 'contig_35', 'contig_36', 'contig_37', 'contig_38', 'contig_132', 'contig_190', 'contig_207', 'contig_232', 'contig_236', 'contig_247', 'contig_248', 'contig_250', 'contig_251', 'contig_255', 'contig
_257', 'contig_285', 'contig_296', 'contig_298', 'contig_300', 'contig_301', 'contig_303', 'contig_313', 'contig_315', 'contig_347', 'contig_348', 'contig_358', 'contig_359', 'contig_366', 'contig_373', 'contig_381', 'contig_401', 'contig_408', 'contig_440', 'contig_441', 'contig_458', 'contig_459', 'contig_464', 'contig_466', 'contig_478', 'contig_482', 'contig_489', 'contig_492', 'contig_493', 'contig_497', 'contig_506', 'contig_534', 'contig_535', 'contig_558', 'contig_56
1', 'contig_591', 'contig_598', 'contig_602', 'contig_608', 'contig_619', 'contig_620', 'contig_641', 'contig_645', 'contig_666', 'contig_693', 'contig_694', 'contig_700', 'contig_702', 'contig_714', 'contig_734', 'contig_738', 'contig_741', 'contig_744', 'contig_747', 'contig_764', 'contig_783', 'contig_786', 'contig_795', 'contig_796', 'contig_814', 'contig_822', 'contig_835', 'contig_860', 'contig_867', 'contig_873', 'contig_875', 'contig_882', 'contig_922', 'contig_924',
 'contig_955', 'contig_956', 'contig_961', 'contig_962', 'contig_964', 'contig_969', 'contig_982', 'contig_997', 'contig_998', 'contig_999', 'contig_1003', 'contig_1007', 'contig_1013', 'contig_1016', 'contig_1017', 'contig_1026', 'contig_1030', 'contig_1051', 'contig_1056', 'contig_1058', 'contig_1061', 'contig_1065', 'contig_1081', 'contig_1083', 'contig_1111', 'contig_1116', 'contig_1133', 'contig_1149', 'contig_1178', 'contig_1220', 'contig_1223', 'contig_1262', 'contig_
1268', 'contig_1276', 'contig_1282', 'contig_1283', 'contig_1284', 'contig_1305', 'contig_1308', 'contig_1323', 'contig_1326', 'contig_1327', 'contig_1328', 'contig_1337', 'contig_1344', 'contig_1352', 'contig_1353', 'contig_1379', 'contig_1449', 'contig_1453', 'contig_1465', 'contig_1502', 'contig_1503', 'contig_1506', 'contig_1539', 'contig_1545', 'contig_1564', 'contig_1570', 'contig_1571', 'contig_1596', 'contig_1605', 'contig_1614', 'contig_1631', 'contig_1634', 'contig
_1651', 'contig_1656', 'contig_1658', 'contig_1662', 'contig_1664', 'contig_1680', 'contig_1681', 'contig_1683', 'contig_1685', 'scaffold_15', 'scaffold_18', 'scaffold_21', 'scaffold_26', 'scaffold_27', 'scaffold_650', 'scaffold_782', 'scaffold_907', 'scaffold_983', 'scaffold_1038']
[08-31-2021 08:30:47] INFO: TOTAL CONTIGS: 184 TOTAL INTERVALS: 5962 TOTAL BASES: 584860946
[08-31-2021 08:30:47] INFO: STARTING PROCESS: 0 FOR 373 INTERVALS
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
ERROR: A process in the process pool was terminated abruptly while the future was running or pending.
[08-31-2021 08:31:11] INFO: FINISHED IMAGE GENERATION
[08-31-2021 08:31:11] INFO: TOTAL ELAPSED TIME FOR GENERATING IMAGES: 0 Min 24 Sec
[08-31-2021 08:31:11] INFO: STEP 2/3 RUNNING INFERENCE
[08-31-2021 08:31:11] INFO: OUTPUT: v1/pepper/predictions_08312021_083045/
[08-31-2021 08:31:11] INFO: TOTAL GPU AVAILABLE: 6
[08-31-2021 08:31:11] INFO: AVAILABLE GPU DEVICES: [0, 1, 2, 3, 4, 5]
[08-31-2021 08:31:11] INFO: TOTAL CALLERS: 16
[08-31-2021 08:31:11] INFO: TOTAL THREADS PER CALLER: 1
Traceback (most recent call last):
  File "/usr/local/bin/pepper_variant", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.8/dist-packages/pepper_variant/pepper_variant.py", line 55, in main
    call_variant(FLAGS.bam,
  File "/usr/local/lib/python3.8/dist-packages/pepper_variant/modules/python/CallVariant.py", line 112, in call_variant
    run_inference(image_output_directory,
  File "/usr/local/lib/python3.8/dist-packages/pepper_variant/modules/python/RunInference.py", line 146, in run_inference
    distributed_gpu(image_dir,
  File "/usr/local/lib/python3.8/dist-packages/pepper_variant/modules/python/RunInference.py", line 84, in distributed_gpu
    predict_distributed_gpu(image_dir, file_chunks, output_dir, model_path, batch_size, total_callers, threads_per_caller, device_ids, num_workers)
  File "/usr/local/lib/python3.8/dist-packages/pepper_variant/modules/python/models/predict_distributed_gpu.py", line 113, in predict_distributed_gpu
    predict(filepath, output_filepath, model_path, batch_size, num_workers, threads_per_caller)
  File "/usr/local/lib/python3.8/dist-packages/pepper_variant/modules/python/models/predict_distributed_gpu.py", line 33, in predict
    input_data = SequenceDataset(input_filepath)
  File "/usr/local/lib/python3.8/dist-packages/pepper_variant/modules/python/models/dataloader_predict.py", line 45, in __init__
    if 'summaries' in hdf5_file:
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/usr/local/lib/python3.8/dist-packages/h5py/_hl/group.py", line 415, in __contains__
    return self._e(name) in self.id
  File "h5py/h5g.pyx", line 461, in h5py.h5g.GroupID.__contains__
  File "h5py/h5g.pyx", line 462, in h5py.h5g.GroupID.__contains__
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5g.pyx", line 531, in h5py.h5g._path_valid
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5l.pyx", line 212, in h5py.h5l.LinkProxy.exists
RuntimeError: Unable to get link info (addr overflow, addr = 107864, size = 328, eoa = 2048)

real    0m27.343s
user    2m50.876s
sys     0m15.055s
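The traceback bottoms out in h5py's link lookup while the dataloader checks `if 'summaries' in hdf5_file:`. The "addr overflow ... eoa" wording usually means an internal address in the HDF5 file points past its end of allocation, i.e. the file was truncated, which fits the sixteen "process pool was terminated abruptly" errors during image generation. As a diagnostic, one could probe each generated image file for this kind of corruption before inference. This is a minimal sketch assuming h5py is installed; `check_image_file` and the demo paths are hypothetical and only illustrate the failure mode (in a real run you would point it at the files under v1/pepper/images_*/):

```python
import os
import tempfile

import h5py


def check_image_file(path):
    """Return True if the HDF5 file opens cleanly and contains the
    'summaries' group the dataloader expects; False if it is truncated
    or corrupt (the 'addr overflow' errors surface here)."""
    try:
        with h5py.File(path, "r") as f:
            return "summaries" in f
    except (OSError, RuntimeError):
        return False


# Demonstration with a deliberately truncated file, mimicking what a
# killed worker process would leave behind.
tmpdir = tempfile.mkdtemp()
good = os.path.join(tmpdir, "good.h5")
with h5py.File(good, "w") as f:
    f.create_group("summaries")

bad = os.path.join(tmpdir, "bad.h5")
with open(good, "rb") as src, open(bad, "wb") as dst:
    # Cut the file in half so its stored addresses point past EOF.
    dst.write(src.read(os.path.getsize(good) // 2))

print(check_image_file(good))
print(check_image_file(bad))
```

Any file for which this returns False would have to be regenerated before inference can succeed.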

I also tried resetting -w and -bs to the v0.4 values and running on CPUs only:

pepper_variant call_variant -b aln.bam -f asm.fasta -t 16 -m /opt/pepper_models/PEPPER_VARIANT_R941_ONT_V5.pkl -o v1/pepper/ -s Sample -w 8 -bs 512

Any idea where to start looking for the problem?

Thanks a lot!

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 12 (7 by maintainers)

Top GitHub Comments

1 reaction
kishwarshafin commented, Sep 7, 2021

Yes, we are going to recalibrate a little bit and figure out the best way to design the polisher so that it is broadly usable. As the Shasta assembler improves, we are trying our best to keep up.

1 reaction
kishwarshafin commented, Sep 2, 2021

@fbemm ,

Thank you for opening this issue. I’m not sure how the version we released didn’t fail for this case; however, I have integrated your data as one of our internal tests. Please do a docker pull:

docker pull kishwars/pepper_deepvariant:r0.5

And confirm that PEPPER’s version is updated to 0.5.2:

time docker run -it -v /data:/data \
kishwars/pepper_deepvariant:r0.5 \
pepper_variant --version

PEPPER VERSION:  0.5.2

This should fix your issue. I’ll keep this issue open until next week to follow up with you regarding any further issues you face.


