Visualise attention for translation
Hi, first of all, thank you for a great tool.
My question is: how can attention be visualized for translation? I would like to see how much each input word influenced the choice of each output word. I am using a seq2seq model, and its output contains three types of attention values: encoder, decoder, and cross attentions. https://huggingface.co/transformers/main_classes/output.html#seq2seqlmoutput
So to plot it, would the words on the right side of head_view come from the translated sequence? Is that how seq2seq attention can be visualized?
Thanks in advance for any pointers.
Below is the code I used to get a Seq2SeqLMOutput:
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from bertviz import model_view

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-pl-en")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-pl-en")

# "The cat did not cross the road because it was too wide"
text = "Kot nie przekroczył drogi bo była za szeroka"
inputs = tokenizer.encode(text, return_tensors="pt")
outputs = model.generate(inputs)

encoder_tokens = tokenizer.convert_ids_to_tokens(inputs[0])
decoder_tokens = tokenizer.convert_ids_to_tokens(outputs[0])

# Feed the generated ids back in as decoder inputs so that the
# encoder, decoder, and cross attentions match the actual translation
output = model(input_ids=inputs, decoder_input_ids=outputs,
               output_attentions=True)
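Before wiring anything into a visualizer, a quick way to see which input words attended each output word is to average the cross attentions over layers and heads into a single (target_len, source_len) matrix. The sketch below uses synthetic tensors with the shapes that `output_attentions=True` produces (each tuple entry is (batch, num_heads, query_len, key_len)); the layer, head, and sequence sizes are made up for illustration.

```python
import torch

def cross_attention_matrix(cross_attentions):
    """Average a tuple of per-layer cross-attention tensors,
    each shaped (batch, num_heads, target_len, source_len),
    into one (target_len, source_len) matrix for batch item 0."""
    stacked = torch.stack(cross_attentions)   # (layers, batch, heads, tgt, src)
    return stacked[:, 0].mean(dim=(0, 1))     # average over layers and heads

# Synthetic stand-in for Seq2SeqLMOutput.cross_attentions:
# 6 layers, batch of 1, 8 heads, 5 target tokens, 9 source tokens
layers = tuple(torch.softmax(torch.randn(1, 8, 5, 9), dim=-1)
               for _ in range(6))

matrix = cross_attention_matrix(layers)
print(matrix.shape)   # torch.Size([5, 9])
# Each row sums to 1: row t shows how output token t distributed
# its attention over the input tokens.
print(torch.allclose(matrix.sum(dim=-1), torch.ones(5), atol=1e-5))   # True
```

With the real model, `cross_attention_matrix(output.cross_attentions)` gives a matrix you can index with `decoder_tokens` (rows) and `encoder_tokens` (columns).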
Issue Analytics
- State:
- Created 2 years ago
- Comments: 9 (6 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Encoder-decoder models are now supported in BertViz 1.1.0: https://github.com/jessevig/bertviz/blob/master/notebooks/head_view_encoder_decoder.ipynb
You would just take the output of the generation and plug it into the aforementioned script. Because this is an autoregressive (left-to-right generating) model, this will show you the complete history of attention that would have been used to generate the output. Let me know if that doesn’t make sense.
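The call from the linked notebook can be sketched as follows. The tensors here are synthetic stand-ins (randomly generated, softmaxed attention with made-up layer, head, and length sizes) so the expected argument shapes are visible without downloading a model; with the real model you would pass `output.encoder_attentions`, `output.decoder_attentions`, and `output.cross_attentions` instead. The import is guarded because BertViz renders inside a Jupyter notebook.

```python
import torch

# Synthetic attentions standing in for a Seq2SeqLMOutput:
# each tuple entry is shaped (batch, heads, query_len, key_len).
num_layers, heads, src_len, tgt_len = 2, 4, 6, 5
enc = tuple(torch.softmax(torch.randn(1, heads, src_len, src_len), dim=-1)
            for _ in range(num_layers))
dec = tuple(torch.softmax(torch.randn(1, heads, tgt_len, tgt_len), dim=-1)
            for _ in range(num_layers))
cross = tuple(torch.softmax(torch.randn(1, heads, tgt_len, src_len), dim=-1)
              for _ in range(num_layers))
encoder_tokens = [f"src_{i}" for i in range(src_len)]  # source sentence
decoder_tokens = [f"tgt_{i}" for i in range(tgt_len)]  # translated sequence

try:
    from bertviz import head_view
    # encoder_tokens label the source sentence, decoder_tokens
    # label the generated translation.
    head_view(encoder_attention=enc,
              decoder_attention=dec,
              cross_attention=cross,
              encoder_tokens=encoder_tokens,
              decoder_tokens=decoder_tokens)
except ImportError:
    print("bertviz not installed; run: pip install bertviz")
```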