
[Feature Request] Arbitrary base learners for GradientBoostingRegressor/Classifier

See original GitHub issue

Describe the workflow you want to enable / Describe your proposed solution

Add a base_estimator argument to GradientBoostingRegressor/Classifier and HistGradientBoostingRegressor/Classifier, allowing them to perform gradient boosting on any scikit-learn base estimator that supports sample weights, while retaining existing GradientBoostingRegressor features such as early stopping and custom objective functions.
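To illustrate what the requested feature would do, here is a minimal sketch of least-squares gradient boosting with a pluggable base estimator. This is not scikit-learn's implementation or API, just an illustration: for squared error the negative gradient is the residual, so each stage fits a clone of the base estimator to the residuals of the current ensemble.

```python
import numpy as np
from sklearn.base import clone
from sklearn.linear_model import Ridge


class SimpleGradientBoostingRegressor:
    """Illustrative least-squares gradient boosting with an arbitrary
    base estimator (hypothetical sketch, not the scikit-learn class)."""

    def __init__(self, base_estimator, n_estimators=100, learning_rate=0.1):
        self.base_estimator = base_estimator
        self.n_estimators = n_estimators
        self.learning_rate = learning_rate

    def fit(self, X, y):
        self.init_ = np.mean(y)                 # constant initial prediction
        self.estimators_ = []
        pred = np.full(len(y), self.init_)
        for _ in range(self.n_estimators):
            residual = y - pred                 # negative gradient of squared error
            est = clone(self.base_estimator).fit(X, residual)
            self.estimators_.append(est)
            pred += self.learning_rate * est.predict(X)
        return self

    def predict(self, X):
        pred = np.full(X.shape[0], self.init_)
        for est in self.estimators_:
            pred += self.learning_rate * est.predict(X)
        return pred


# Any regressor works as the base learner, e.g. Ridge instead of a tree:
rng = np.random.RandomState(0)
X = rng.rand(200, 3)
y = 2 * X[:, 0] + np.sin(6 * X[:, 1])
model = SimpleGradientBoostingRegressor(Ridge(alpha=1.0),
                                        n_estimators=50,
                                        learning_rate=0.2).fit(X, y)
```

A full implementation would additionally need generic objective functions (via their gradients), early stopping on a validation set, and a line-search or per-stage weight, which is part of why this is non-trivial to bolt onto the existing classes.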

Describe alternatives you’ve considered, if relevant

sklearn.ensemble.AdaBoostClassifier supports specifying an arbitrary base_estimator. It would be great to have the same for GradientBoostingRegressor/Classifier and HistGradientBoostingRegressor/Classifier.
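For comparison, the AdaBoost interface mentioned above already accepts an arbitrary base learner. A small example (note the parameter was renamed from base_estimator to estimator in scikit-learn 1.2, so this sketch tries both names):

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression

X = np.random.RandomState(0).rand(100, 2)
y = (X[:, 0] + X[:, 1] > 1).astype(int)

# base_estimator was renamed to estimator in scikit-learn 1.2;
# try the new name first and fall back to the old one.
try:
    clf = AdaBoostClassifier(estimator=LogisticRegression(), n_estimators=25)
except TypeError:
    clf = AdaBoostClassifier(base_estimator=LogisticRegression(), n_estimators=25)

clf.fit(X, y)
```

The only requirement on the base learner here is that its fit method accepts sample_weight, which is the same constraint the feature request proposes for gradient boosting.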

Additional context

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Comments: 19 (15 by maintainers)

Top GitHub Comments

1 reaction
jnothman commented, Jun 24, 2020

Sorry I hadn’t seen your comment, Zach. If there are compelling use cases, I think it would be reasonable to support, except that we have made it difficult to name this meta estimator when we have already used GradientBoostingClassifier.

0 reactions
kmedved commented, Jul 22, 2021

Via the Base parameter. See documentation here.

Note their implementation is quite slow.

