
Why not scale values when attribution values are smaller than 1e-5?

See original GitHub issue

When displaying the attribution, you normalise and scale the values.

However, why do you skip normalising if the scaling factor (which is the max value after removing outliers) is below 1e-5?

import warnings
from numpy import ndarray

def _normalize_scale(attr: ndarray, scale_factor: float):
    if abs(scale_factor) < 1e-5:
        warnings.warn(
            "Attempting to normalize by value approximately 0, skipping normalization. "
            "This likely means that attribution values are all close to 0."
        )
    ...
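
To see the effect concretely, here is a small sketch (not the Captum source; the division-and-clip step and the 98th-percentile trimming are assumptions used only for illustration): when the attributions are on the order of 1e-6, the trimmed max falls below the 1e-5 threshold, the warning fires, and the values pass through unscaled.

import warnings
import numpy as np

THRESHOLD = 1e-5  # same cutoff as the snippet above

def normalize_demo(attr: np.ndarray, scale_factor: float) -> np.ndarray:
    # Hypothetical stand-in for _normalize_scale, for illustration only.
    if abs(scale_factor) < THRESHOLD:
        warnings.warn("Scale factor ~ 0, skipping normalization.")
        return attr                            # values stay tiny -> near-blank saliency map
    return np.clip(attr / scale_factor, -1, 1)

attr = np.random.randn(224, 224) * 1e-6        # robust-model-sized attributions
scale = float(np.percentile(np.abs(attr), 98)) # outlier-trimmed max (assumed percentile)
print(scale < THRESHOLD)                       # True -> normalization is skipped
print(normalize_demo(attr, scale).max())       # still ~1e-6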

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 8 (4 by maintainers)

Top GitHub Comments

2 reactions
andreimargeloiu commented, Jun 8, 2020

Use-case: My use-case is interpreting robust models, which are trained using adversarial training [1], i.e. on adversarial inputs.

On robust models, the gradients with respect to the input are very small (see the picture below, where the x-axis represents the attributions before rescaling). Notice that the range is around 1e-3. Using SmoothGrad, the gradients are around 1e-5 to 1e-6, which creates issues with Captum.

[image: attribution values before rescaling; range around 1e-3]
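
For context, this is roughly how those SmoothGrad magnitudes come about, sketched in plain PyTorch (robust_model and image are placeholders; Captum's NoiseTunnel performs the equivalent gradient averaging):

import torch

def smoothgrad(model, x, n_samples=25, sigma=0.1):
    # Average input gradients over noisy copies of x (SmoothGrad).
    grads = []
    for _ in range(n_samples):
        noisy = (x + sigma * torch.randn_like(x)).requires_grad_(True)
        score = model(noisy).max()              # top logit as the scalar target
        grad, = torch.autograd.grad(score, noisy)
        grads.append(grad)
    return torch.stack(grads).mean(dim=0)

# attr = smoothgrad(robust_model, image)        # placeholder objects
# print(attr.abs().max())                       # ~1e-5 to 1e-6 on robust models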

Issue with the current warning: For people investigating interpretability on robust models, it's essential to be able to plot the attributions, despite potential errors associated with floating-point arithmetic.

In Jupyter this warning wasn't printed, so it took me hours of digging into Captum to understand why the saliency map was essentially white (because the values weren't scaled).
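
One standard-library workaround that at least surfaces the problem: force Python warnings to always display before running the visualization (this is generic Python behaviour, not a Captum feature):

import warnings
warnings.simplefilter("always")   # show every warning, even repeated ones,
                                  # so the "skipping normalization" message appears in Jupyter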

Potential solution: It would be good to allow power users to bypass this warning (for example through a parameter), or simply to disable the check.
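
Until something like that parameter exists, a rough workaround is to rescale the attributions yourself so their trimmed max sits well above the 1e-5 cutoff before handing them to the visualizer (the 98th percentile and the commented visualize_image_attr call below are illustrative assumptions, not Captum defaults):

import numpy as np

def rescale_for_viz(attr: np.ndarray, target: float = 1.0) -> np.ndarray:
    # Multiply tiny attributions so their outlier-trimmed max reaches `target`,
    # preserving their relative structure.
    scale = np.percentile(np.abs(attr), 98)
    return attr if scale == 0 else attr * (target / scale)

# attr_hwc = attr.squeeze().cpu().numpy().transpose(1, 2, 0)           # CHW tensor -> HWC array
# viz.visualize_image_attr(rescale_for_viz(attr_hwc), original_image)  # illustrative call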

[1] https://arxiv.org/pdf/1706.06083.pdf

1 reaction
andreimargeloiu commented, Aug 24, 2020

Thank you for the heads up! @vivekmig, please go ahead as you initially proposed and plan this change for a future release 😃

Read more comments on GitHub >
