
Calibration failure occurred with no scaling factors detected

See original GitHub issue

Hey,

First of all, thanks a lot for your great work. This repo has already been a great help to me.

With your INT8 quantization update, however, I ran into a problem: as soon as I enable --quantization, I get the following error:

[01/14/2022-11:18:37] [TRT] [W] Calibrator is not being used. Users must provide dynamic range for all tensors that are not Int32 or Bool.
[01/14/2022-11:18:37] [TRT] [E] 4: [standardEngineBuilder.cpp::initCalibrationParams::1402] Error Code 4: Internal Error (Calibration failure occurred with no scaling factors detected. This could be due to no int8 calibrator or insufficient custom scales for network layers. Please see int8 sample to setup calibration correctly.)
[01/14/2022-11:18:37] [TRT] [E] 2: [builder.cpp::buildSerializedNetwork::609] Error Code 2: Internal Error (Assertion enginePtr != nullptr failed. )

Traceback (most recent call last):
  File "/data/repos/transformer-deploy/src/transformer_deploy/convert.py", line 326, in <module>
    entrypoint()
  File "/data/repos/transformer-deploy/src/transformer_deploy/convert.py", line 322, in entrypoint
    main(commands=args)
  File "/data/repos/transformer-deploy/src/transformer_deploy/convert.py", line 216, in main
    engine: ICudaEngine = build_engine(
  File "/data/repos/transformer-deploy/src/transformer_deploy/backends/trt_utils.py", line 181, in build_engine
    engine: ICudaEngine = runtime.deserialize_cuda_engine(trt_engine)
TypeError: deserialize_cuda_engine(): incompatible function arguments. The following argument types are supported:
    1. (self: tensorrt.tensorrt.Runtime, serialized_engine: buffer) -> tensorrt.tensorrt.ICudaEngine

Invoked with: <tensorrt.tensorrt.Runtime object at 0x7feb14128e30>, None

The traceback itself just reflects that trt_engine ends up being None. I don't get any other warnings or errors, so I'm a bit at a loss. I've tried distilroberta-base as well as bert-base-uncased and get the same error each time. Did you run into this problem at some point, or do you see what the issue might be?

Thanks a lot in advance!
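For what it's worth, the final TypeError is only a symptom: build_serialized_network() returns None when the engine build fails, and that None is then handed to deserialize_cuda_engine(). A minimal sketch of the plain TensorRT 8 Python flow with an explicit guard (placeholder setup, not the repo's actual trt_utils code) looks like this:

import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
config = builder.create_builder_config()
# ... populate `network` (e.g. via an ONNX parser) and configure `config` here ...

serialized_engine = builder.build_serialized_network(network, config)
if serialized_engine is None:
    # Failing fast here surfaces the real TensorRT build error instead of the TypeError above.
    raise RuntimeError("TensorRT engine build failed; see the TRT errors logged above")

runtime = trt.Runtime(logger)
engine = runtime.deserialize_cuda_engine(serialized_engine)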

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 6 (2 by maintainers)

Top GitHub Comments

1 reaction
v1nc3nt27 commented on Feb 1, 2022

@pommedeterresautee Oh, I totally missed that part about calibration. Thanks for pointing that out!
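For readers landing here from the error message: the "part about calibration" corresponds to what TensorRT itself expects once BuilderFlag.INT8 is set, namely either an INT8 calibrator or explicit per-tensor dynamic ranges. Below is a rough, illustrative calibrator sketch; it is not transformer-deploy's actual quantization path, and the class and variable names are made up for the example.

import numpy as np
import pycuda.autoinit  # noqa: F401  (creates a CUDA context for the device copies below)
import pycuda.driver as cuda
import tensorrt as trt

class EntropyCalibrator(trt.IInt8EntropyCalibrator2):
    """Feeds calibration batches to the builder so it can compute INT8 scaling factors."""

    def __init__(self, batches, cache_file="calibration.cache"):
        super().__init__()
        self.batches = iter(batches)  # iterable of {input_name: np.ndarray}
        self.cache_file = cache_file
        self.device_buffers = {}

    def get_batch_size(self):
        return 1

    def get_batch(self, names):
        try:
            batch = next(self.batches)
        except StopIteration:
            return None  # no more calibration data
        pointers = []
        for name in names:
            array = np.ascontiguousarray(batch[name])
            if name not in self.device_buffers:
                self.device_buffers[name] = cuda.mem_alloc(array.nbytes)
            cuda.memcpy_htod(self.device_buffers[name], array)
            pointers.append(int(self.device_buffers[name]))
        return pointers

    def read_calibration_cache(self):
        try:
            with open(self.cache_file, "rb") as f:
                return f.read()
        except FileNotFoundError:
            return None

    def write_calibration_cache(self, cache):
        with open(self.cache_file, "wb") as f:
            f.write(cache)

# Wiring it into the builder config (hypothetical calibration data):
# config.set_flag(trt.BuilderFlag.INT8)
# config.int8_calibrator = EntropyCalibrator(my_calibration_batches)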

1 reaction
DeDeckerThomas commented on Jan 30, 2022

Oh okay, I didn’t know that. Interesting 🤔 Thank you for your answer! I will give it a try and see what the results are 😃

Read more comments on GitHub >

Top Results From Across the Web

  • Calibration failure occurred with no scaling factors detected.
    "Calibration failure occurred with no scaling factors detected. This could be due to no int8 calibrator or insufficient custom scales for network …"
  • Int8 calibration - TensorRT - NVIDIA Developer Forums
    "Calibration failure occurred with no scaling factors detected. This could be due to no int8 calibrator or insufficient custom scales for …"
  • Performing Inference In INT8 Using Custom Calibration
    "This sample, sampleINT8, performs INT8 calibration and inference. … the builder will calibrate the network to find appropriate quantization factors …"
  • TensorRT/INT8 Accuracy - eLinux.org
    "TensorRT introduces INT8 calibration to solve this problem, that run calibration dataset in FP32 mode to chart the histogram of FP32 and choose …"
  • [0901.0489] Scaling factors for ab initio vibrational frequencies
    "In this framework, we demonstrate that standard calibration summary statistics, as optimal scaling factor and root mean square, can be safely …"
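The first warning in the log above ("Users must provide dynamic range for all tensors that are not Int32 or Bool") points at the alternative to running a calibrator: assigning explicit dynamic ranges to every tensor. A hedged illustration follows; the range value is a placeholder, and real deployments derive per-tensor ranges from calibration data rather than a constant.

import tensorrt as trt

def set_uniform_dynamic_ranges(network: trt.INetworkDefinition, amax: float = 2.5) -> None:
    # Illustrative only: assigns the same symmetric [-amax, amax] range to every layer output.
    for i in range(network.num_layers):
        layer = network.get_layer(i)
        for j in range(layer.num_outputs):
            tensor = layer.get_output(j)
            if tensor.dtype not in (trt.int32, trt.bool):
                tensor.set_dynamic_range(-amax, amax)

# Used together with the INT8 builder flag, e.g.:
# config.set_flag(trt.BuilderFlag.INT8)
# set_uniform_dynamic_ranges(network)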
