
Comment on IG tutorial on text

See original GitHub issue

According to a recommendation from the authors of Integrated Gradients, we should compute attributions from the softmax output:

"For multi-class classification models, the prediction head is typically a softmax operator on a 'logits' tensor. The attribution must be computed from this softmax output and not the 'logits' tensor." — See [1], "Identifying the output tensor".

However, in the IMDB TorchText Interpret tutorial, we use the logit value, i.e. we pass model itself as the forward function.

Should we update the tutorial to be consistent with this recommendation? What do you think?
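To make the recommendation concrete, here is a minimal, self-contained sketch (not Captum, and not the tutorial's code) of integrated gradients computed on the softmax probability of a toy linear head; the weight matrix `W`, input `x`, and target class are made-up illustration values. The completeness axiom (attributions sum to f(x) − f(baseline)) holds for the probability output:

```python
import math

def softmax(z):
    # numerically stable softmax over a list of logits
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def prob_grad(W, x, target):
    """Gradient of softmax(W @ x)[target] w.r.t. x (chain rule through the logits)."""
    z = [sum(W[k][i] * x[i] for i in range(len(x))) for k in range(len(W))]
    p = softmax(z)
    # dp_target/dz_k = p_target * (1[k == target] - p_k)
    dz = [p[target] * ((1.0 if k == target else 0.0) - p[k]) for k in range(len(W))]
    return [sum(dz[k] * W[k][i] for k in range(len(W))) for i in range(len(x))]

def integrated_gradients(W, x, target, steps=1000):
    """Riemann-sum IG along the straight path from a zero baseline to x."""
    n = len(x)
    total = [0.0] * n
    for s in range(1, steps + 1):
        xi = [s / steps * v for v in x]          # point on the path
        g = prob_grad(W, xi, target)
        total = [t + gi for t, gi in zip(total, g)]
    return [x[i] * total[i] / steps for i in range(n)]

W = [[1.0, -1.0], [-1.0, 1.0]]   # hypothetical 2-class linear head
x = [2.0, 0.5]
attr = integrated_gradients(W, x, target=0)
# completeness: sum(attr) ~ p(x)[0] - p(baseline)[0], where p(0) is uniform (0.5)
```

If the attributions were computed on the raw logits instead, the completeness sum would match a logit difference rather than a probability difference, which is exactly the discrepancy the recommendation warns about.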

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 1
  • Comments: 15 (8 by maintainers)

Top GitHub Comments

1 reaction
wangyongjie-ntu commented, Jul 5, 2020

@NarineK Thanks very much for your patient reply.

Why do I want to use DeepLIFT on probabilities? I proposed a draft method that operates on probabilities and selected IG, DeepLIFT, etc. as baselines. To compare performance fairly, I want to compute the importance scores on the probability function.

I just removed the torch.max operation from the snippet above and fed fc directly into nn.Softmax. DeepLIFT now works.
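A minimal sketch of the change described here, assuming the same toy architecture as the snippet in the other comment: the manual max subtraction is dropped and the logits go straight into nn.Softmax (which already subtracts the max internally for stability):

```python
import torch
import torch.nn as nn

# same toy 20 -> 4 classifier as in the other comment (illustrative shapes)
model = nn.Sequential(
    nn.Linear(20, 16), nn.ReLU(),
    nn.Linear(16, 12), nn.ReLU(),
    nn.Linear(12, 4),
)
softmax = nn.Softmax(dim=1)

def forward(x):
    fc = model(x)
    return softmax(fc)  # probabilities; no manual torch.max / renormalization

probs = forward(torch.randn(2, 20))
```

With the intermediate torch.max removed, the forward graph contains only modules that attribution methods like DeepLIFT can propagate through cleanly.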

(screenshot attached in the original comment)

1 reaction
wangyongjie-ntu commented, Jun 30, 2020

@NarineK Do you mean this kind of code?

```python
import torch
import torch.nn as nn

class Mobile(nn.Module):

    def __init__(self):
        super(Mobile, self).__init__()
        self.model = nn.Sequential(
            nn.Linear(20, 16),
            nn.ReLU(),
            nn.Linear(16, 12),
            nn.ReLU(),
            nn.Linear(12, 4),
        )
        self.softmax = nn.Softmax(dim=1)

    def forward(self, x):
        fc = self.model(x)
        # subtract the per-row max before softmax (numerical stability)
        max_factor, _ = torch.max(fc, dim=1)
        max_factor = max_factor.expand(4, len(max_factor)).t()
        normed_fc = fc - max_factor
        prob = self.softmax(normed_fc)
        prob = prob + 10e-10  # keep probabilities strictly positive
        return prob
```


