ONNX inference with dropout
I have a model based on self-attention, and I'd like to keep dropout enabled during ONNX model inference.
But I found that when I export with training=torch.onnx.TrainingMode.TRAINING and run inference with onnxruntime, the model outputs are identical on every run with the same input.
Here is the code I used to export the model:
```python
torch.onnx.export(model,                    # model being run
                  onnx_input,               # model input (or a tuple for multiple inputs)
                  onnx_path,                # where to save the model (can be a file or file-like object)
                  training=torch.onnx.TrainingMode.TRAINING,  # export in training mode
                  verbose=True,
                  export_params=True,       # store the trained parameter weights inside the model file
                  opset_version=12,         # the ONNX version to export the model to
                  do_constant_folding=False,  # whether to execute constant folding for optimization
                  input_names=input_names,    # the model's input names
                  output_names=output_names,  # the model's output names
                  dynamic_axes={              # variable-length axes
                      'speakers': {0: 'batch_size'},
                      'phonemes': {0: 'batch_size', 1: 'seq_len'},
                      'src_lens': {0: 'batch_size'},
                      'output': {0: 'batch_size', 1: 'mel_len'},
                      'postnet_output': {0: 'batch_size', 1: 'mel_len'},
                      'p_predictions': {0: 'batch_size', 1: 'seq_len'},
                      'e_predictions': {0: 'batch_size', 1: 'seq_len'},
                      'log_d_predictions': {0: 'batch_size', 1: 'seq_len'},
                      'd_rounded': {0: 'batch_size', 1: 'seq_len'},
                      'src_masks': {0: 'batch_size', 1: 'seq_len'},
                      'mel_masks': {0: 'batch_size', 1: 'mel_len'},
                      'output_src_lens': {0: 'batch_size'},
                      'mel_lens': {0: 'batch_size'}
                  })
print("done")
```
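For reference, this is the behavior being sought from the exported model: in PyTorch, a module in train() mode draws a fresh dropout mask on every forward pass, so two calls with the same input almost surely differ. A minimal self-contained sketch (the Toy module below is illustrative, not the self-attention model from this issue):

```python
import torch

class Toy(torch.nn.Module):
    """Illustrative module: a single dropout layer."""
    def __init__(self):
        super().__init__()
        self.drop = torch.nn.Dropout(p=0.5)

    def forward(self, x):
        return self.drop(x)

m = Toy().train()          # train() keeps dropout active
x = torch.ones(4, 8)
a, b = m(x), m(x)          # two forward passes with the same input
print(torch.equal(a, b))   # almost surely False: each call draws a new mask
```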
Moreover, the output of the exported ONNX model is close to the PyTorch inference output in model.eval() mode; the outputs are roughly the same, with only small differences.
This is weird, since active dropout should perturb the output.
When I set training=torch.onnx.TrainingMode.EVAL, the exported ONNX model does not contain dropout layers. When I set training to TRAINING, the exported model does contain dropout layers, but it generates the same output on every inference run.
It seems the dropout mask is fixed when I export the ONNX model in TRAINING mode, or the dropout seed is fixed. How can I make the dropout seed non-fixed in the exported ONNX model?
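The "fixed seed" hypothesis can be illustrated in plain NumPy (an analogy, not the actual ONNX Runtime internals): if the RNG is re-seeded identically before each run, the dropout mask is the same every time; with a free-running RNG, it changes per call.

```python
import numpy as np

def dropout(x, p, rng):
    """Inverted dropout: zero elements with probability p, rescale the rest."""
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

x = np.ones(64)

# Fixed seed per run -> identical masks, identical outputs (what the issue observes).
a = dropout(x, 0.5, np.random.default_rng(0))
b = dropout(x, 0.5, np.random.default_rng(0))
print(np.array_equal(a, b))   # True

# Free-running RNG -> a fresh mask on each call (the desired behavior).
rng = np.random.default_rng(0)
c = dropout(x, 0.5, rng)
d = dropout(x, 0.5, rng)
print(np.array_equal(c, d))   # almost surely False
```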
Issue Analytics
- Created 2 years ago
- Comments: 10 (4 by maintainers)
I found a better solution 😃
Thank you very much. Yes, this configuration solved my issue.
But I needed to set it to DISABLE:
With this setup, ONNX Runtime generates different output on each run. Great.
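The comment above does not spell out which setting "DISABLE" refers to. One plausible reading (an assumption, not confirmed by this thread) is ONNX Runtime's graph optimization level: by default ORT may optimize Dropout nodes away, and setting GraphOptimizationLevel.ORT_DISABLE_ALL turns those graph rewrites off. A hedged sketch, where "model.onnx" and the feeds dict are placeholders for the export path and inputs from this issue:

```python
import onnxruntime as ort

# Assumption: "DISABLE" refers to disabling ORT graph optimizations,
# which otherwise may eliminate Dropout nodes from the graph.
sess_options = ort.SessionOptions()
sess_options.graph_optimization_level = ort.GraphOptimizationLevel.ORT_DISABLE_ALL

# "model.onnx" is a placeholder for the onnx_path used in torch.onnx.export above.
sess = ort.InferenceSession("model.onnx", sess_options)
# out1 = sess.run(None, feeds)
# out2 = sess.run(None, feeds)
# With Dropout nodes kept active, out1 and out2 should now differ per run.
```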