
Changing the num_features parameter in explain_instance gives very different results

See original GitHub issue

I have recently started using LIME and I’m using it to explain the outputs of a LGBMClassifier.

I have an issue with the num_features parameter. For example if I run it with 5, the output is something like this:

exp = explainer.explain_instance(X_train.values[idx], predict_model_lgbm.predict_proba, num_features=5)
exp.as_list()

[('entitlement_number <= 1.00', -0.41398972465855205),
 ('Annual=0', 0.33586503353921526),
 ('Daily=0', -0.23173429118453642),
 ('hist_total_tenurea <= 0.00', 0.210909651734952),
 ('numrenewals > 1.00', 0.12207210974714447)]

But if I run it with num_features = 10, I get something like this:

[('SAILING <= 0.00', 0.0),
 ('LIVE ONLY <= 1.00', 0.0),
 ('MODERN PENTATHLON <= 0.00', 0.0),
 ('JET SKIING <= 0.00', 0.0),
 ('VOLLEYBALL <= 0.00', 0.0),
 ('HANDBALL <= 0.00', 0.0),
 ('SHOW JUMPING <= 0.00', 0.0),
 ('One Time=0', 0.0),
 ('ARTISTIC GYMNASTICS <= 0.00', 0.0),
 ('XTREME SPORTS <= 0.00', 0.0)]

I would have thought that increasing num_features from 5 to 10 would keep the original 5 features and add another 5 that are less important. Am I using this parameter correctly?
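A plausible cause (an assumption based on lime's default feature_selection='auto' behaviour, worth verifying against your installed version) is that 'auto' uses forward selection for small num_features and switches to a different method, 'highest_weights', above roughly 6 features, so the two runs are not ranking features the same way; passing feature_selection explicitly when constructing LimeTabularExplainer pins the method down. The numpy-only sketch below shows the nesting the asker expects when features are simply ranked by coefficient magnitude of a local weighted surrogate: the top-5 set is always a subset of the top-10 set.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))            # perturbed samples around an instance
w = rng.uniform(size=200)                 # proximity (kernel) weights
y = X @ rng.normal(size=12) + rng.normal(scale=0.1, size=200)

# Weighted least squares: the kind of local linear surrogate LIME fits.
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)

# Ranking by |coefficient| is monotone in k: top-5 is a prefix of top-10.
order = np.argsort(-np.abs(coef))
top5, top10 = set(order[:5]), set(order[:10])
print(top5 <= top10)  # True
```

Forward selection, by contrast, re-fits the surrogate at each step, so even it can occasionally reorder features; but switching selection methods between runs makes non-nested (and zero-weight) results much more likely.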

On another note, the features shown by the explainer are very different from the ones I get when plotting the model's feature importances (using plot_importance from LightGBM). Is that something I should expect?
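On the second question, a mismatch is generally expected: LIME explains a single prediction with a local surrogate, while plot_importance reports global split/gain statistics over the whole model. The hypothetical numpy sketch below builds a piecewise model in which a feature that matters globally carries almost no weight in a local linear fit around one particular instance:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(X):
    # Piecewise model: uses feature 0 in one region, feature 2 in the other,
    # so global and local importance can disagree.
    return np.where(X[:, 1] > 0, X[:, 0], X[:, 2])

instance = np.array([0.5, 1.0, 0.5])                   # firmly in the x1 > 0 region
Xp = instance + rng.normal(scale=0.2, size=(500, 3))   # local perturbations
coef, *_ = np.linalg.lstsq(
    np.c_[Xp, np.ones(len(Xp))], model(Xp), rcond=None)

print(np.abs(coef[:3]).round(2))  # feature 0 dominates locally
```

Globally, features 0 and 2 each drive about half of the predictions, yet the local fit near this instance assigns essentially all weight to feature 0.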

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

6 reactions
pancodia commented, Apr 19, 2019

Any update on this issue?

0 reactions
sandymoens commented, Jul 28, 2020

Hi Marco, first off, thanks for the implementation 👍. Secondly, I am using lime==0.2.0.1 and am hitting a similar problem to Fernando's. For num_features < 7 the explanations are generally incremental and intuitive. Using num_features between 7 and 20, I get zeros on the explanations, and going even higher I get either zeros or unintuitive feature explanations, i.e., explanations flipping from one class to the other, with values larger than the first 6 "good" explanations. The dataset is not public, so unfortunately I cannot share a copy for reproduction purposes.

Read more comments on GitHub >

Top Results From Across the Web

Classification Model's parameters produce different results
I'm working on SVC model for classification and I faced different accuracy result in each time I changed the values of the parameters...
Read more >
lime package — lime 0.1 documentation
Generates images and predictions in the neighborhood of this image. Parameters: image – 3d numpy array, the image; fudged_image – 3d numpy array,...
Read more >
HashingTF — PySpark 3.3.1 documentation - Apache Spark
Since a simple modulo is used to transform the hash function to a column index, it is advisable to use a power of...
Read more >
(PDF) Explaining Interpretable Machine Learning: Theory ...
PDF | This working paper aims at providing a structured and accessible introduction to the topic of interpretable machine learning.
Read more >
Understanding LightGBM Parameters (and How to Tune Them)
How to tune lightGBM parameters in python? Gradient boosting methods. With LightGBM, you can run different types of Gradient boosting methods.
Read more >
