
Which acquisition function should I use to iteratively improve model accuracy rather than optimize a target value?


It took me a while to track down the issue and comment related to this, so I wanted to surface this in a new issue and immediately close it for better searchability.

I think there are two approaches. One option is to reformulate the problem so that you minimize an error metric instead of optimizing a target value; this requires a test set or some form of cross-validation that you trust. The other option, which seems preferable (especially when adding new data or starting from scratch), follows @Balandat’s comment in https://github.com/facebook/Ax/issues/460#issuecomment-758428881: use the botorch.acquisition.active_learning.qNegIntegratedPosteriorVariance (qNIPV) acquisition function:

Yeah qNIPV is agnostic to the direction, the goal is to minimize a global measure of uncertainty of the model, so there is no better or worse w.r.t. the function values.

If I understand it correctly, this implies no exploitation, but rather pure exploration.

Issue Analytics

  • State: closed
  • Created: a year ago
  • Reactions: 1
  • Comments: 8 (4 by maintainers)

Top GitHub Comments

3 reactions
iandoxsee commented, Oct 19, 2022

Hi @eytan, this looks like a really exciting approach that directly gets to what we’re after. Thanks for your input!

2 reactions
eytan commented, Oct 12, 2022

@iandoxsee, you may be interested in https://botorch.org/tutorials/constraint_active_search, which aims to find all designs that exceed some pre-specified threshold across multiple outcomes (constraints).


Top Results From Across the Web

Bayesian Optimization: A step by step approach
In short, the acquisition function uses an “Exploration vs Exploitation” strategy to decide the optimal parameter search in an iterative manner. Inside ...

how to force models to make more exploration #460 - GitHub
Which acquisition function should I use to iteratively improve model accuracy rather than optimize a target value? #930.

How to Implement Bayesian Optimization from Scratch in Python
The acquisition function is responsible for scoring or estimating the likelihood that a given candidate sample (input) is worth evaluating with ...

botorch.acquisition - Bayesian Optimization in PyTorch
Computes classic Expected Improvement over the current best observed value, using the analytic formula for a Normal posterior distribution. Unlike the MC-based ...

Exploring Bayesian Optimization - Distill.pub
To effectively use these algorithms, we need to pick good hyperparameter values. In this article, we talk about Bayesian Optimization, a suite ...
