IncrementalSearchCV runs an adaptive algorithm by default
There are two dimensions in which model selection searches can vary. Does the search…

1. use `partial_fit` or `fit`?
2. use previous evaluations to choose which models to evaluate?

(adapted from https://github.com/dask/dask-ml/pull/370#discussion_r221819839, which is a response to https://github.com/dask/dask-ml/pull/370#discussion_r221573023)

I think (1) should be named “incremental” and (2) should be named “adaptive”.
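To make the distinction concrete, here is a minimal, self-contained sketch in plain scikit-learn/NumPy (not dask-ml's actual implementation). The outer loop is the “incremental” axis: every candidate is trained one block of data at a time with `partial_fit`. The pruning step is the “adaptive” axis: the scores observed so far decide which candidates keep training. All names, sizes, and thresholds here are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=2000, random_state=0)
blocks = np.array_split(np.arange(len(X)), 10)   # simulate streaming blocks of data

param_list = [{"alpha": a} for a in (1e-4, 1e-3, 1e-2, 1e-1)]
models = [SGDClassifier(**p, random_state=0) for p in param_list]
scores = [0.0] * len(models)

for step, idx in enumerate(blocks):
    # "Incremental": each surviving candidate only ever sees one block at a time.
    for m, model in enumerate(models):
        if model is None:                         # dropped by the adaptive step
            continue
        model.partial_fit(X[idx], y[idx], classes=np.unique(y))
        scores[m] = model.score(X, y)             # scored on all data for simplicity

    # "Adaptive": after a few blocks, stop training the worse half of the candidates.
    if step == 3:
        cutoff = np.median([s for s, m in zip(scores, models) if m is not None])
        models = [m if s >= cutoff else None for s, m in zip(scores, models)]

survivors = [i for i, m in enumerate(models) if m is not None]
best = max(survivors, key=lambda i: scores[i])
print("best params:", param_list[best], "score:", round(scores[best], 3))
```

An exhaustive incremental search would simply skip the pruning step; an adaptive search is defined by it.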
Currently, `IncrementalSearchCV` implements

- a complex adaptive algorithm
- stopping on plateau
- an incremental search that’s used with `_incremental.fit`.

I’d like to clean up the naming of `IncrementalSearchCV`.
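For reference, a minimal `IncrementalSearchCV` usage sketch along the lines of the dask-ml documentation. The dataset sizes and parameter ranges are illustrative only; the adaptive behaviour discussed in this issue is controlled by arguments such as `decay_rate` and `patience`, whose defaults have shifted across releases.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier
from dask_ml.datasets import make_classification
from dask_ml.model_selection import IncrementalSearchCV

# Dask arrays are split into blocks; each block is passed to partial_fit.
X, y = make_classification(n_samples=5000, chunks=1000, random_state=0)

model = SGDClassifier(tol=1e-3, penalty="elasticnet", random_state=0)
params = {"alpha": np.logspace(-4, 0, 10), "l1_ratio": np.linspace(0.05, 0.95, 10)}

search = IncrementalSearchCV(model, params, n_initial_parameters=10, random_state=0)
search.fit(X, y, classes=[0, 1])

print(search.best_params_, search.best_score_)
```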
Issue Analytics
- Created 5 years ago
- Comments: 13 (13 by maintainers)
Top GitHub Comments
I don’t think this should block the release (which I’m going to do today). We can easily change the name (and provide a shim for backwards compat) if we want to in the future.
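For illustration, this is roughly the kind of backwards-compatibility shim that comment refers to. `AdaptiveSearchCV` is a hypothetical name used only for this sketch, not an actual dask-ml class.

```python
import warnings


class AdaptiveSearchCV:
    """Hypothetical new name for the search class."""

    def __init__(self, estimator, parameters, **kwargs):
        self.estimator = estimator
        self.parameters = parameters
        self.kwargs = kwargs


class IncrementalSearchCV(AdaptiveSearchCV):
    """Deprecated alias kept so that existing user code keeps working."""

    def __init__(self, *args, **kwargs):
        warnings.warn(
            "IncrementalSearchCV has been renamed to AdaptiveSearchCV; "
            "this alias will be removed in a future release.",
            FutureWarning,
        )
        super().__init__(*args, **kwargs)
```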
I think the hyper-parameter search documentation can be restructured to make it clear what Dask-ML offers for hyper-parameter optimization. I’ve implemented this restructure in #221.
I’ve changed the default `decay_rate` to 0 in d13384fc63d894a2ba080c291e69d3c6a740e955 as part of #221. This will help `IncrementalSearchCV` serve hyper-parameter searches that are memory constrained but not compute constrained. Functionally, this is the same as `GridSearchCV(Incremental(model), params)`, but there are apparently some issues with that: https://github.com/dask/dask-ml/issues/421#issuecomment-435212089.
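For comparison, a rough sketch of the `GridSearchCV(Incremental(model), params)` pattern mentioned above. The `estimator__` prefix assumes the standard scikit-learn nested-parameter convention for the `Incremental` wrapper, and the dataset and grid are illustrative only.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor
from dask_ml.datasets import make_regression
from dask_ml.model_selection import GridSearchCV
from dask_ml.wrappers import Incremental

X, y = make_regression(n_samples=10_000, chunks=1_000, random_state=0)

# Incremental feeds each dask block to SGDRegressor.partial_fit in turn,
# so no single candidate needs the whole dataset in memory at once.
wrapped = Incremental(SGDRegressor(tol=1e-3, random_state=0))
params = {"estimator__alpha": np.logspace(-4, 0, 5)}

# Exhaustive (non-adaptive) search: every candidate is trained on all of the data.
search = GridSearchCV(wrapped, params, cv=3)
search.fit(X, y)
print(search.best_params_)
```

With `decay_rate` set to 0, `IncrementalSearchCV` keeps training all candidates rather than pruning them, which is what makes it comparable to this exhaustive pattern.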