
Question about Optuna on own pipeline with segmentation model


Hi @juanmc2005, sorry for bothering you! As the title says, I want to use Optuna to tune parameters, and the segmentation model used in the pipeline is my own, trained following training_a_model.ipynb. I have already saved it in ".ckpt" format. Following the "Custom models" part, I tried to define my own segmentation model like this:

class MySegmentationModel(SegmentationModel):
    def __init__(self):
        super().__init__()
        self.my_pretrained_model = torch.load("./epoch=0-step=69.ckpt")  # <- put my own ckpt here

    def __call__(
        self,
        waveform: torch.Tensor,
        weights: Optional[torch.Tensor] = None,
    ) -> torch.Tensor:
        return self.my_pretrained_model(waveform, weights)
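The snippet above is a wrapper that delegates inference to a loaded checkpoint. The same delegation pattern can be sketched in plain Python, with illustrative stand-ins for torch/diart (`SegmentationModel`, `load_checkpoint`, and the toy "model" below are assumptions for the sketch, not diart's real API):

```python
class SegmentationModel:
    """Stand-in for the library base class the wrapper subclasses."""
    def __call__(self, waveform, weights=None):
        raise NotImplementedError


def load_checkpoint(path):
    """Stand-in for torch.load: returns a callable 'model'."""
    def model(waveform, weights=None):
        # Toy inference: apply per-sample weights if given, else identity.
        if weights is None:
            weights = [1.0] * len(waveform)
        return [x * w for x, w in zip(waveform, weights)]
    return model


class MySegmentationModel(SegmentationModel):
    def __init__(self, path):
        self.my_pretrained_model = load_checkpoint(path)

    def __call__(self, waveform, weights=None):
        # Delegate to the wrapped model; note the explicit `self` in the
        # signature, which the original snippet was missing.
        return self.my_pretrained_model(waveform, weights)


model = MySegmentationModel("./epoch=0-step=69.ckpt")
print(model([1.0, 2.0, 3.0]))                 # [1.0, 2.0, 3.0]
print(model([1.0, 2.0], weights=[0.5, 2.0]))  # [0.5, 4.0]
```

The key detail is that `__call__` must take `self` as its first parameter; without it, the first positional argument (the waveform) would be bound to `self` and the call would fail.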

Then I redefine the config:

    config = PipelineConfig(segmentation=MySegmentationModel())
    optimizer = Optimizer(benchmark, config, hparams, p)

It raises the error shown in the attached image.

Then I traced back, and it seems related to detecting the duration, sample rate, or something similar. Could you tell me how to fix these problems, or is it that I can't use my own segmentation model with Optuna?

I want to do this because I guess the best parameters will differ from model to model, so I want to give it a try.

Thanks for your awesome work and help! Looking forward to your response!

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

Shoawen0213 commented, Sep 7, 2022

Hi @juanmc2005, I finally understand what you mean! Thanks a lot!

juanmc2005 commented, Sep 3, 2022

Hi @Shoawen0213,

You shouldn't need to modify anything for this; you can just pass your own PipelineConfig object to OnlineSpeakerDiarization. If you are running diart.tune rather than using the Python API (which I think is your case), and you're also running version 0.5.1, then you should be able to do:

diart.tune ... --segmentation /path/to/checkpoint.ckpt

Concerning the documentation, I'm writing it as a GitHub wiki. There's a "wiki" tab on the repo's main page. I still need to add a link from the README so it's harder to miss 😃
