
Visualise attention for translation

See original GitHub issue

Hi, first of all, thank you for a great tool.

My question is: how do I visualize attention for translation? I would like to see how much each input word attended to, i.e. influenced, the choice of each output word. I am using a Seq2Seq model, and its output has three types of attention values: encoder, decoder, and cross attentions. https://huggingface.co/transformers/main_classes/output.html#seq2seqlmoutput
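
For concreteness, a sketch of those three fields (shapes as described in the linked Transformers docs; each field is a tuple holding one attention tensor per layer):

# out = model(..., output_attentions=True) returns a Seq2SeqLMOutput with:
#   out.encoder_attentions[i]  -> (batch, heads, src_len, src_len) for layer i
#   out.decoder_attentions[i]  -> (batch, heads, tgt_len, tgt_len)
#   out.cross_attentions[i]    -> (batch, heads, tgt_len, src_len)
# "How much did input word j influence output word k?" is a question about
# cross_attentions: index it as [batch, head, k, j].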

So, to plot it, that would mean the words on the right side of head_view come from the translated sequence.

Is that how seq2seq attention can be visualised?

Thank you in advance, even if you can only point me in the right direction. Below is the code I used to get a Seq2SeqLMOutput:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from bertviz import model_view

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-pl-en")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-pl-en")

# "The cat didn't cross the road because it was too wide"
text = "Kot nie przekroczył drogi bo była za szeroka"

inputs = tokenizer.encode(text, return_tensors="pt")
outputs = model.generate(inputs)

tokens = tokenizer.convert_ids_to_tokens(inputs[0])
# Re-run the model with the generated ids as the decoder input so that
# encoder, decoder, and cross attentions are all returned. (Passing the
# source ids as decoder_input_ids would compare the source with itself.)
output = model(inputs, decoder_input_ids=outputs, output_attentions=True)
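
For completeness, the generated tokens on the output side can be recovered the same way:

decoder_tokens = tokenizer.convert_ids_to_tokens(outputs[0])
print(tokenizer.decode(outputs[0], skip_special_tokens=True))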

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 9 (6 by maintainers)

Top GitHub Comments

2 reactions
jessevig commented, May 8, 2021

1 reaction
jessevig commented, Dec 17, 2022

You would just take the output of the generation and plug it into the aforementioned script. Because this is an autoregressive (left-to-right generating) model, this will show you the complete history of attention that would have been used to generate the output. Let me know if that doesn’t make sense.
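
A minimal sketch of what plugging the generation output in could look like, assuming bertviz’s encoder-decoder interface for model_view (the encoder_attention, decoder_attention, and cross_attention keyword arguments; exact names may differ across versions):

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from bertviz import model_view

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-pl-en")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-pl-en")

text = "Kot nie przekroczył drogi bo była za szeroka"
encoder_input_ids = tokenizer.encode(text, return_tensors="pt")

# Generate the translation, then feed the generated ids back in as the
# decoder input so that all three attention types are returned.
decoder_input_ids = model.generate(encoder_input_ids)
outputs = model(encoder_input_ids, decoder_input_ids=decoder_input_ids,
                output_attentions=True)

encoder_tokens = tokenizer.convert_ids_to_tokens(encoder_input_ids[0])
decoder_tokens = tokenizer.convert_ids_to_tokens(decoder_input_ids[0])

# Source tokens label one axis, generated tokens the other; the cross
# attention links output words back to the input words they attended to.
model_view(
    encoder_attention=outputs.encoder_attentions,
    decoder_attention=outputs.decoder_attentions,
    cross_attention=outputs.cross_attentions,
    encoder_tokens=encoder_tokens,
    decoder_tokens=decoder_tokens,
)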

Read more comments on GitHub >

Top Results From Across the Web

Visualizing A Neural Machine Translation Model (Mechanics ...)
A solution was proposed in Bahdanau et al., 2014 and Luong et al., 2015. These papers introduced and refined a technique called “Attention” ...

Neural machine translation with attention | Text - TensorFlow
This tutorial demonstrates how to train a sequence-to-sequence (seq2seq) model for Spanish-to-English translation, roughly based on Effective ...

Visualizing NLP Attention Based Models Using Custom Charts
A quick introduction to using a custom chart to visualize attention models in an NLP application - Neural Machine Translation.

Building and Visualizing Machine Language Translation from ...
Learn and understand how to build machine language translation deep learning models with TensorFlow (using seq2seq models with attention).

Interactive Visualization and Manipulation of Attention-based ...
While neural machine translation (NMT) provides high-quality translation, it is still hard to interpret and analyze its behavior. We present an interactive ...
