Flaky e2e test for the XGBoost model with the 'gblinear' booster
See original GitHub issue

Context from @StrikerRUS:
Now Go is failing (refer to https://github.com/BayesWitnesses/m2cgen/pull/200#issuecomment-624063683):
=================================== FAILURES ===================================
_ test_e2e[xgboost_XGBClassifier - go_lang - train_model_classification_binary2] _
estimator = XGBClassifier(base_score=0.6, booster='gblinear', colsample_bylevel=None,
colsample_bynode=None, colsamp...ambda=0, scale_pos_weight=1, subsample=None,
tree_method=None, validate_parameters=False, verbosity=None)
executor_cls = <class 'tests.e2e.executors.go.GoExecutor'>
...
expected=[0.04761511 0.9523849 ], actual=[0.047615, 0.952385]
expected=[0.06296992 0.9370301 ], actual=[0.06297, 0.93703]
expected=[0.12447995 0.87552005], actual=[0.124479, 0.875521]
expected=[0.0757848 0.9242152], actual=[0.075784, 0.924216]
expected=[0.8092151 0.19078489], actual=[0.809212, 0.190788]
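All of the mismatches sit in the sixth decimal place, which is the level of rounding error single-precision floats introduce. Below is a minimal sketch (assuming numpy is available; the numbers are copied from the last failing row above, and this is not the project's actual assertion) showing that an exact comparison rejects these values while a float32-sized tolerance accepts them:

import numpy as np

# Reference probabilities printed by the Python/XGBoost side (float32 precision)
expected = np.array([0.8092151, 0.19078489])
# Probabilities returned by the generated Go code, rounded to six decimals
actual = np.array([0.809212, 0.190788])

print(np.abs(expected - actual).max())   # ~3e-6, i.e. the sixth decimal place

# An exact comparison (or the default rtol=1e-7) flags this as a failure,
# while a tolerance sized for single precision accepts it.
np.testing.assert_allclose(actual, expected, atol=1e-5)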
BTW, while trying to check my guess from https://github.com/BayesWitnesses/m2cgen/pull/200#issuecomment-624067250, I found that the coefficients in gblinear are also float32: https://github.com/dmlc/xgboost/blob/67d267f9da3b15a6e5a8393afae9be921a4e224b/src/gbm/gblinear_model.h#L110
and from #188 (comment) we know that bst_float is actually float: https://github.com/dmlc/xgboost/blob/8d06878bf9b778db68ae98f68d99a3557c7ea885/include/xgboost/base.h#L110-L111
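Since both the gblinear coefficients and bst_float are single precision, the reference prediction is effectively computed in float32, while code generated for another language may carry out the same arithmetic in float64. The sketch below (assuming numpy; the weights and input are random placeholders, not taken from the failing model) shows that this alone typically produces differences of roughly the size seen above:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Hypothetical gblinear-style model: one weight per feature plus a bias,
# stored in float32 as in xgboost's gblinear_model.h (bst_float == float).
coef = rng.normal(size=100).astype(np.float32)
bias = np.float32(0.1)
x = rng.normal(size=100).astype(np.float32)

# Margin accumulated in float32 vs. the same dot product carried out in float64.
p32 = sigmoid(coef @ x + bias)
p64 = sigmoid(coef.astype(np.float64) @ x.astype(np.float64) + float(bias))

# The two probabilities typically differ somewhere around the 1e-7 to 1e-6 level,
# the same order of magnitude as the expected/actual mismatches above.
print(float(p32), p64, abs(float(p32) - p64))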
Issue Analytics
- Created: 3 years ago
- Comments: 6 (6 by maintainers)
Top Results From Across the Web

- Predictions from XGBClassifier with gblinear booster doesn ...
  Hi there! I'm trying to reproduce prediction results from simple dumped JSON model, but my calculations doesn't match results produced by ...
- What exactly is the gblinear booster in XGBoost?
  I would like to know which exact model is used as base learner, and how the algorithm is different from the normal tree...
- XGBOOST: Differences between gbtree and gblinear
  The booster parameter sets the type of learner. Usually this is either a tree or a linear function. In the case of trees,...
- XGBoost - Understanding gblinear - CHALLENGE
  After I train a linear regression model and an xgboost model with 1 round and parameters { booster="gblinear", objective="reg:linear", eta=1, ...
- Understanding a bit xgboost's Generalized Linear Model ...
  Laurae: This post is about xgboost's gblinear and its parameters. ... model and an xgboost model with 1 round and parameters {`booster="gblinear"`, ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I think we can wait for a while for any reply in https://github.com/dmlc/xgboost/issues/5634. Let's say two or three days, and only then take any action.

Have no time to read this right now, but it looks related to our issue: https://github.com/numpy/numpy/pull/9941.
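That numpy pull request changed float printing: the repr of a float32 is the shortest decimal string that round-trips to the same float32, so the "expected" values above show only seven or eight digits even though the underlying bits, widened to float64, carry more. A small illustration (assuming numpy >= 1.14, where this repr behaviour applies; the value is taken from the failing test output above):

import numpy as np

# One of the "expected" probabilities from the failing test, as a float32.
p = np.float32(0.9523849)

# numpy prints the shortest decimal that uniquely identifies this float32
# (the behaviour introduced in numpy/numpy#9941).
print(repr(p))    # 0.9523849

# The same bits widened to float64 expose additional digits (~0.95238489...),
# which is where sixth-decimal disagreements with float64 output can come from.
print(float(p))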