
Add more options to create_supervised_trainer


🚀 Feature

The idea is to add more options to the create_supervised_trainer helper:

For Hacktoberfest/PyDataGlobal contributors: feel free to ask questions if you need more details, and say that you would like to tackle the issue. Please take a look at the CONTRIBUTING guide.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 1
  • Comments: 8

Top GitHub Comments

1 reaction
vfdev-5 commented, Feb 12, 2021

Cross-posted from #1589


We discussed this PR and the related issue with the team, and we think we should explore a somewhat different approach. The helper method create_supervised_trainer is roughly made of two things: the update function definition and the Engine setup.

It would probably be more helpful to provide public methods like:

  • supervised_training_step
  • supervised_training_step_tpu
  • supervised_training_step_apex
  • supervised_training_step_amp

and inside create_supervised_trainer we could set up the trainer according to the provided options without lots of if/else branching. Maybe we can skip some options, for instance grad norm.

Basically, the idea is something like this:

from ignite.engine import Engine

def get_training_step_1(a):
    # Factory returning a plain update function
    def training_step(e, b):
        print(a, e, b)
    return training_step

def get_training_step_2(a):
    # Factory returning an AMP-flavored update function
    def training_step(e, b):
        print(a, e, b, "with amp")
    return training_step

def create_supervised_trainer(a, opt):
    if opt == 1:
        training_step = get_training_step_1(a)
    elif opt == 2:
        training_step = get_training_step_2(a)
    else:
        raise ValueError(f"Unknown option: {opt}")

    e = Engine(training_step)
    return e
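Mapping the sketch above onto the proposed public helper names, the dispatch inside create_supervised_trainer could be a lookup table instead of an if/else chain. The following is a minimal, self-contained sketch: the `Engine` stub, the `mode` parameter, and the factory signatures are illustrative assumptions, not the actual ignite API.

```python
# Hypothetical sketch of the factory-based dispatch discussed above.
# `Engine` is stubbed so the example runs standalone; in ignite this
# would be `from ignite.engine import Engine`.

class Engine:
    def __init__(self, process_fn):
        self.process_fn = process_fn

def supervised_training_step(model, optimizer):
    """Plain update-function factory (illustrative)."""
    def update(engine, batch):
        return f"step: {model} {batch}"
    return update

def supervised_training_step_amp(model, optimizer):
    """AMP update-function factory (illustrative)."""
    def update(engine, batch):
        return f"amp step: {model} {batch}"
    return update

# Dispatch table replaces the if/else chain inside the helper.
_STEP_FACTORIES = {
    None: supervised_training_step,
    "amp": supervised_training_step_amp,
}

def create_supervised_trainer(model, optimizer, mode=None):
    try:
        factory = _STEP_FACTORIES[mode]
    except KeyError:
        raise ValueError(f"Unknown mode: {mode!r}")
    return Engine(factory(model, optimizer))

trainer = create_supervised_trainer("model", "opt", mode="amp")
print(trainer.process_fn(None, "batch"))  # amp step: model batch
```

One advantage of this shape is that each `supervised_training_step_*` factory remains a public, independently testable function, while create_supervised_trainer stays a thin wrapper that only picks a factory and builds the Engine.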
1 reaction
apthagowda97 commented, Oct 20, 2020

Sorry, I was busy for the past few days. I'll try to finish it ASAP.


