
Adding new hyperparameters should be supported

See original GitHub issue


c/o @omoindrot, who says:

3. Adding new hyperparameters is not fully supported. If I add a new parameter new and re-create the event file for the HParams tab, the previous runs that don’t contain new will all disappear if I choose to display new. Ideally I would be able to add values to my old summaries, but I couldn’t find a way to do that. For instance, if I add a batch_size parameter, I would need to add the default value 32 to all previous runs. In general, it would be great to be able to manipulate old summaries easily.
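The filtering behavior described above can be illustrated with a small sketch. This is hypothetical code, not TensorBoard's actual implementation; the run names and hyperparameter values are made up to mimic the reported symptom:

```python
# Hypothetical sketch: why selecting a newly added hyperparameter
# hides older runs that were logged before it existed.
runs = {
    "run_1": {"lr": 0.01},                     # logged before batch_size existed
    "run_2": {"lr": 0.001, "batch_size": 32},  # logged after batch_size was added
}

def visible_runs(runs, selected_hparams):
    """Keep only runs that define every selected hyperparameter,
    mimicking the current HParams-tab behavior the issue describes."""
    return [name for name, hparams in runs.items()
            if all(key in hparams for key in selected_hparams)]

print(visible_runs(runs, ["lr"]))                # both runs shown
print(visible_runs(runs, ["lr", "batch_size"]))  # run_1 disappears
```

Choosing to display batch_size silently drops every run recorded before that hyperparameter was introduced, which is the behavior @omoindrot is asking to fix.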

One approach would be to modify the HParamInfo proto to include an optional default_value attribute, which would be set whenever a new hyperparameter is created and all previous runs had a fixed value for the parameter (which should be the common case). Users certainly shouldn’t have to modify the old event files for data from previous runs to still be useful.
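A minimal sketch of how such a default could be applied when reading runs, assuming a hypothetical schema mapping each hyperparameter to its default (the real change would live in the HParamInfo proto and the plugin backend, not in user code):

```python
# Hypothetical sketch of the proposed default_value behavior.
from typing import Any, Dict, Optional

# Assumed schema: hparam name -> default value for runs that predate it
# (None means "no default declared").
schema_defaults: Dict[str, Optional[Any]] = {"lr": None, "batch_size": 32}

def backfill(run_hparams: Dict[str, Any],
             defaults: Dict[str, Optional[Any]]) -> Dict[str, Any]:
    """Fill in missing hyperparameters from declared defaults so that
    old runs remain comparable with new ones, without rewriting the
    old event files."""
    filled = dict(run_hparams)
    for name, default in defaults.items():
        if name not in filled and default is not None:
            filled[name] = default
    return filled

old_run = {"lr": 0.01}  # recorded before batch_size existed
print(backfill(old_run, schema_defaults))  # {'lr': 0.01, 'batch_size': 32}
```

Because the backfill happens at read time, the original event files stay untouched, which matches the point that users shouldn't have to modify old data.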

Ideally, the visualizations should be able to show partial data for runs where one or more hyperparameters are missing: the table view could show an empty cell for a missing value, the parallel coordinates view could omit the line segments through an axis with missing data, and the scatter plot matrix could simply omit the affected points.
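The table-view idea can be sketched in a few lines. This is a hypothetical illustration of rendering empty cells for missing values rather than dropping the run, not plugin code:

```python
# Hypothetical sketch of the "partial data" table view: a missing
# hyperparameter renders as an empty cell instead of hiding the run.
runs = {
    "run_1": {"lr": 0.01},                     # no batch_size recorded
    "run_2": {"lr": 0.001, "batch_size": 64},
}
columns = ["lr", "batch_size"]

def table_rows(runs, columns):
    rows = []
    for name, hparams in runs.items():
        # An empty string marks a missing value; the row itself survives.
        rows.append([name] + [str(hparams.get(col, "")) for col in columns])
    return rows

for row in table_rows(runs, columns):
    print("\t".join(row))
```

The same "render a gap, keep the run" principle generalizes to the parallel coordinates and scatter plot views described above.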

Issue Analytics

  • State: open
  • Created: 5 years ago
  • Reactions: 7
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
PedroUria commented, Feb 11, 2020

Hi @wchargin @shashvatshahi1998. Has there been any progress on this topic? It would be a really useful feature to have. If not, could you point me to how to modify the event files manually?

0 reactions
MaLiN2223 commented, Apr 11, 2021

Are there any updates on this topic? It would be great if we could get some information on a workaround.


Top Results From Across the Web

Parameters, Hyperparameters, Machine Learning
Hyperparameters are parameters whose values control the learning process and determine the values of model parameters that a learning algorithm ends up learning ...

How to optimize hyperparameter tuning for machine learning ...
Consider hyperparameters as building blocks of AI models. You can tweak the parameters or features that go into a model or what that ...

Best Practices for Hyperparameter Tuning - Amazon SageMaker
Initially, SageMaker assumes linear scaling for hyperparameters. If hyperparameters are log-scaled, choosing the correct scale will make your search more ...

Tune Hyperparameters for Classification Machine Learning ...
The more hyperparameters of an algorithm that you need to tune, the slower the tuning process. Therefore, it is desirable to select a ...

Overview of hyperparameter tuning | Vertex AI - Google Cloud
The ConditionalParameterSpec object lets you add hyperparameters to a trial when the value of its parent hyperparameter matches a condition that you specify ...
