Trace warnings when trying to jit.trace a model
See original GitHub issue

First of all, hats off for your effort in building and maintaining this. Keep up the good work.
My issue is that when I try to jit.trace a model that uses this layer, I get a warning similar to this one:
dsntnn.py:47: TracerWarning: Converting a tensor to a Python integer might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  return torch.linspace(first, last, length, device=device)
This also happens when exporting a model that uses dsntnn to ONNX: exporting with a command like the one below produces the same trace warning, making it impossible to load the exported model.
torch.onnx.export(model, x, "deployment/ckpts/{0}.onnx".format(model_name), export_params=False, operator_export_type=torch.onnx.OperatorExportTypes.ONNX_ATEN_FALLBACK)
How to reproduce:

model = CoordRegressionNetwork(n_locations=2)  # as defined in the dsntnn usage example
x = torch.randn(5, 3, 200, 200, requires_grad=True)  # the deprecated Variable wrapper is not needed
traced_script_module = torch.jit.trace(model, x)
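The root cause can be reproduced in isolation. The module below is a hypothetical minimal example (not dsntnn's actual code): converting a tensor to a Python int inside a traced forward() emits the same class of TracerWarning, and the value gets baked into the graph as a constant, exactly as the warning text says.

```python
import warnings
import torch

# Hypothetical minimal module: int() on a tensor cannot be recorded in the
# trace graph, so the resulting value is frozen as a constant.
class ConstantFolded(torch.nn.Module):
    def forward(self, x):
        n = int(x.max())  # tensor -> Python int: emits a TracerWarning
        return torch.linspace(0.0, 1.0, n + 2)

model = ConstantFolded()
example = torch.full((4,), 3.0)
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    traced = torch.jit.trace(model, example)

warned = any(issubclass(w.category, torch.jit.TracerWarning) for w in caught)
# n was folded to 3 at trace time, so the traced module ignores the new
# maximum and still returns 5 elements instead of 12:
frozen_len = traced(torch.full((4,), 10.0)).numel()
```

This is why the traced model "might not generalize to other inputs": any quantity that passed through a Python int is no longer data-dependent in the exported graph.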
Issue Analytics
- State: closed
- Created: 5 years ago
- Comments: 11 (3 by maintainers)
Top GitHub Comments
No worries. I fixed this by loading the weights into a network definition whose forward function returns only the unnormalized heatmaps, so no dsntnn functions are involved in the tracing process.
Closing the issue. Thanks!
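The workaround above can be sketched as follows. This is an illustrative stand-in, not the actual CoordRegressionNetwork: a tiny backbone whose forward() stops at the raw heatmaps traces cleanly, and the coordinate post-processing (dsntnn.flat_softmax / dsntnn.dsnt in the real model, approximated here by a plain softmax normalization) runs outside the trace in ordinary Python.

```python
import torch

# Stand-in for the heatmap-only network definition: forward() returns raw
# (unnormalized) heatmaps and involves no dsntnn functions, so there are no
# tensor -> Python int conversions for the tracer to warn about.
class HeatmapOnly(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = torch.nn.Conv2d(3, 2, kernel_size=1)  # toy backbone

    def forward(self, x):
        return self.backbone(x)  # unnormalized heatmaps

model = HeatmapOnly()
x = torch.randn(5, 3, 200, 200)
traced = torch.jit.trace(model, x)  # traces without TracerWarnings

# Post-processing outside the trace; a softmax over each heatmap stands in
# for the dsntnn normalization + coordinate regression step.
heatmaps = traced(x)
norm = heatmaps.flatten(2).softmax(-1).view_as(heatmaps)
```

The traced (or ONNX-exported) artifact then contains only the backbone, and the dsntnn step can stay in the Python deployment code.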
As of https://github.com/anibali/dsntnn/commit/93acc46e224f9170f2bd719f7baf8531dca177c4, tracing appears to work correctly and is covered by tests.