Run with pre-computed SHAP values/interactions
Hello,
Getting familiar with your great tool, I was wondering how, or whether, I could skip computing SHAP values. I have already computed and saved SHAP values and interaction values on a GPU cluster, and was wondering if there is a way to feed them to an Explainer
and bypass their calculation when I run the explainer dashboard on my laptop.
Thanks a lot
Issue Analytics
- Created 2 years ago
- Comments: 6 (2 by maintainers)
Hey, forgot to loop back on this, but I actually added methods
explainer.set_shap_values()
and
explainer.set_shap_interaction_values()
that allow you to import pre-calculated SHAP values in the last release. Could you test whether these work for you?
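The workflow described above can be sketched as follows. This is a minimal example, assuming the `set_shap_values()` / `set_shap_interaction_values()` methods from the reply above accept NumPy arrays shaped like the explainer's dataset; the random arrays stand in for real SHAP output, and the `ClassifierExplainer` calls are commented out because they need a trained model:

```python
import tempfile
from pathlib import Path

import numpy as np

# On the GPU cluster: compute SHAP (and interaction) values once, then
# persist them. Random stand-ins for real SHAP output are used here.
shap_values = np.random.rand(100, 10)            # (n_samples, n_features)
shap_interactions = np.random.rand(100, 10, 10)  # (n_samples, n_features, n_features)

with tempfile.TemporaryDirectory() as tmp:
    np.save(Path(tmp) / "shap_values.npy", shap_values)
    np.save(Path(tmp) / "shap_interactions.npy", shap_interactions)

    # On the laptop: load the saved arrays instead of recomputing them.
    loaded_values = np.load(Path(tmp) / "shap_values.npy")
    loaded_interactions = np.load(Path(tmp) / "shap_interactions.npy")

# Hand the arrays to the dashboard explainer (model / X_test / y_test are
# hypothetical; method names are per the maintainer's reply above):
# explainer = ClassifierExplainer(model, X_test, y_test)
# explainer.set_shap_values(loaded_values)
# explainer.set_shap_interaction_values(loaded_interactions)
```

The `.npy` round-trip preserves dtype and shape exactly, so the dashboard sees the same arrays that were computed on the cluster.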
Hey @oegedijk ,
After doing a few tests, the SHAP values entered manually don't make much sense. I calculate the SHAP values following this example and then feed these SHAP values to the explainer. Attached are two pictures: the first shows SHAP values computed by the explainer, the second shows pre-computed SHAP values entered with your suggestion.
Can you make any sense of that? Am I messing something up?
Alternatively, is there any way to activate SHAP's GPUTree explainer?
Thanks a lot
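On the GPUTree question, a hedged sketch: the shap package ships a `GPUTree` explainer, but only in CUDA-enabled builds, and explainerdashboard does not invoke it directly. One approach would be to compute values with it on the cluster and feed them in via `set_shap_values()` as above. A defensive availability check (the model/data names in the comments are hypothetical):

```python
import importlib.util

# Check whether a shap build with the GPUTree explainer is installed,
# without failing on machines where shap is absent.
if importlib.util.find_spec("shap") is not None:
    import shap
    has_gputree = hasattr(shap.explainers, "GPUTree")
else:
    has_gputree = False

print("GPUTree available:", has_gputree)

# If available (assumes a trained tree model and data from the cluster):
# explainer = shap.explainers.GPUTree(model, X_train)
# shap_values = explainer.shap_values(X_test)
```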