[Feature Request] Arbitrary base learners for GradientBoostingRegressor/Classifier
Describe the workflow you want to enable / Describe your proposed solution
Add a base_estimator argument to GradientBoostingRegressor/Classifier and HistGradientBoostingRegressor/Classifier, allowing them to perform gradient boosting on any sklearn base estimator that supports sample weights, while retaining GradientBoostingRegressor features such as early stopping and custom objective functions.
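To make the request concrete, here is a minimal sketch of what such an API could enable. The `SimpleGradientBoosting` class below is hypothetical (not part of sklearn, and not the proposed implementation): it boosts clones of an arbitrary regressor under squared error, where the negative gradient is simply the residual.

```python
# Hypothetical sketch: stagewise gradient boosting with an arbitrary
# sklearn base learner (here Ridge), under squared-error loss.
import numpy as np
from sklearn.base import clone
from sklearn.linear_model import Ridge


class SimpleGradientBoosting:
    """Fit clones of `base_estimator` to residuals, stagewise."""

    def __init__(self, base_estimator, n_estimators=50, learning_rate=0.1):
        self.base_estimator = base_estimator
        self.n_estimators = n_estimators
        self.learning_rate = learning_rate

    def fit(self, X, y):
        y = np.asarray(y, dtype=float)
        self.init_ = y.mean()          # constant initial prediction
        resid = y - self.init_         # negative gradient of 1/2 (y - F)^2
        self.estimators_ = []
        for _ in range(self.n_estimators):
            est = clone(self.base_estimator).fit(X, resid)
            resid -= self.learning_rate * est.predict(X)
            self.estimators_.append(est)
        return self

    def predict(self, X):
        pred = np.full(X.shape[0], self.init_)
        for est in self.estimators_:
            pred += self.learning_rate * est.predict(X)
        return pred


# Toy check: boosted Ridge should beat the constant-mean baseline in-sample.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] + X[:, 1] + rng.normal(scale=0.1, size=200)
model = SimpleGradientBoosting(Ridge(alpha=1.0)).fit(X, y)
mse_model = float(np.mean((y - model.predict(X)) ** 2))
mse_baseline = float(np.mean((y - y.mean()) ** 2))
```

A real implementation in sklearn would additionally need line-search or per-loss leaf updates, validation-based early stopping, and pluggable loss objects; the loop above only illustrates the core stagewise-fitting idea.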
Describe alternatives you’ve considered, if relevant
sklearn.ensemble.AdaBoostClassifier supports specifying an arbitrary base_estimator. It’d be pretty cool to have that for GradientBoostingRegressor/Classifier and HistGradientBoostingRegressor/Classifier.
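For comparison, AdaBoostClassifier already accepts an arbitrary base learner today. One wrinkle: the keyword was renamed from base_estimator to estimator in scikit-learn 1.2, so the sketch below picks whichever name the installed version supports.

```python
# Using AdaBoostClassifier with a non-tree base learner (LogisticRegression).
# The keyword was renamed base_estimator -> estimator in scikit-learn 1.2;
# detect which one this installation accepts.
import inspect
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression

params = inspect.signature(AdaBoostClassifier.__init__).parameters
kw = "estimator" if "estimator" in params else "base_estimator"

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
clf = AdaBoostClassifier(**{kw: LogisticRegression(max_iter=1000),
                            "n_estimators": 25, "random_state": 0})
clf.fit(X, y)
train_acc = clf.score(X, y)
```

Any base learner used this way must support sample weights in fit, which is the same constraint the feature request above places on gradient-boosting base estimators.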
Additional context
Issue Analytics
- Created 3 years ago
- Comments: 19 (15 by maintainers)
Top GitHub Comments
Sorry I hadn’t seen your comment, Zach. If there are compelling use cases, I think it would be reasonable to support, except that we have made it difficult to name this meta estimator when we have already used GradientBoostingClassifier.
Via the Base parameter. See documentation here. Note their implementation is quite slow.