Add more options to create_supervised_trainer
See original GitHub issue

🚀 Feature
The idea is to add more options to the `create_supervised_trainer` helper:
- a native AMP option, similar to https://github.com/lidq92/LinearityIQA/blob/master/modified_ignite_engine.py
- we may use either nvidia/apex if installed, or torch native AMP if PyTorch >= 1.6.0; otherwise, raise an error
- gradient clipping as proposed in #419 (see the sketch after this list)
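As an illustration of the first and third points, here is a minimal sketch of what such an update step could look like, assuming torch native AMP (PyTorch >= 1.6.0). The helper name `create_amp_trainer` and the `max_grad_norm` argument are hypothetical, and the apex/version fallback logic is omitted:

```python
import torch
from torch.cuda.amp import GradScaler, autocast
from ignite.engine import Engine


def create_amp_trainer(model, optimizer, criterion, device, max_grad_norm=None):
    # Hypothetical helper: an update step using torch.cuda.amp
    # with optional gradient clipping.
    scaler = GradScaler()

    def update_fn(engine, batch):
        model.train()
        optimizer.zero_grad()
        x, y = batch
        x, y = x.to(device), y.to(device)
        with autocast():  # run the forward pass in mixed precision
            y_pred = model(x)
            loss = criterion(y_pred, y)
        scaler.scale(loss).backward()  # scale the loss to avoid fp16 underflow
        if max_grad_norm is not None:
            # unscale first so clipping operates on true gradient values
            scaler.unscale_(optimizer)
            torch.nn.utils.clip_grad_norm_(model.parameters(), max_grad_norm)
        scaler.step(optimizer)  # skips the step if grads contain inf/NaN
        scaler.update()
        return loss.item()

    return Engine(update_fn)
```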
For Hacktoberfest/PyDataGlobal contributors: feel free to ask questions if any details are unclear, and say that you would like to tackle the issue. Please take a look at the CONTRIBUTING guide.
Issue Analytics
- Created: 3 years ago
- Reactions: 1
- Comments: 8
Top GitHub Comments
Cross-posted from #1589
We discussed this PR and the related issue with the team, and we think we should explore a somewhat different approach. The helper method `create_supervised_trainer` is roughly made of two things: the `update` function definition and the `Engine` setup. It would probably be more helpful to provide public methods like:
- `supervised_training_step`
- `supervised_training_step_tpu`
- `supervised_training_step_apex`
- `supervised_training_step_amp`

and inside `create_supervised_trainer` we could set up the trainer according to the provided options, without lots of if/else. Maybe we can skip grad norm, for instance. Basically, the idea is something like this:
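A rough sketch of this idea, with simplified signatures that may differ from the actual Ignite API (the apex and TPU variants are omitted for brevity):

```python
from ignite.engine import Engine


def supervised_training_step(model, optimizer, criterion, device):
    # plain fp32 update step, returned as a closure for Engine
    def update(engine, batch):
        model.train()
        optimizer.zero_grad()
        x, y = batch
        x, y = x.to(device), y.to(device)
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
        return loss.item()

    return update


def supervised_training_step_amp(model, optimizer, criterion, device):
    # mixed-precision update step using torch.cuda.amp (PyTorch >= 1.6.0)
    from torch.cuda.amp import GradScaler, autocast

    scaler = GradScaler()

    def update(engine, batch):
        model.train()
        optimizer.zero_grad()
        x, y = batch
        x, y = x.to(device), y.to(device)
        with autocast():
            loss = criterion(model(x), y)
        scaler.scale(loss).backward()
        scaler.step(optimizer)
        scaler.update()
        return loss.item()

    return update


def create_supervised_trainer(model, optimizer, criterion, device, amp=False):
    # choose the step function once up front; the Engine setup itself
    # stays free of AMP/apex/TPU branching
    step_factory = supervised_training_step_amp if amp else supervised_training_step
    return Engine(step_factory(model, optimizer, criterion, device))
```

With this split, each backend-specific step function is public and testable on its own, and `create_supervised_trainer` reduces to selecting one of them.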
Sorry, I was busy for the past few days. I'll try to finish it ASAP.