
Implementing parameter sharing (Universal Transformers)

See original GitHub issue

What is your question?

I am trying to set up the Universal Transformer in the codebase and wanted some help and insight on it. Basically, Universal Transformers are just normal Transformer models with parameter sharing across layers.

Code

I am changing the following line so that the same initialized layer is reused within the loop in self.layers.extend, for both the encoder and the decoder. Is that the only change required to get it set up? I am not able to get good results with it.
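
For anyone landing here, below is a minimal, standalone PyTorch sketch of the sharing idea described above: build one layer instance and reuse that same object at every depth, so every "layer" points to a single set of parameters. It deliberately does not use the codebase's own classes; SharedLayerEncoder and its arguments are illustrative stand-ins, and nn.TransformerEncoderLayer stands in for whatever layer class the codebase builds.

import torch
import torch.nn as nn

class SharedLayerEncoder(nn.Module):
    """Toy Universal-Transformer-style encoder: one layer applied at every depth."""

    def __init__(self, d_model=512, nhead=8, num_layers=6):
        super().__init__()
        # Build the layer exactly once...
        shared_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # ...then extend with the *same* object (not fresh copies), so every
        # depth shares one set of weights and gradients accumulate into it.
        self.layers = nn.ModuleList()
        self.layers.extend([shared_layer for _ in range(num_layers)])

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

enc = SharedLayerEncoder()
out = enc(torch.randn(2, 10, 512))
print(out.shape)  # torch.Size([2, 10, 512])
# Sanity check: the first and last "layers" hold the exact same tensors.
print(enc.layers[0].self_attn.in_proj_weight is enc.layers[-1].self_attn.in_proj_weight)  # True

Constructing a new layer on every loop iteration would instead give each depth its own parameters, i.e., no sharing at all.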

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 23 (5 by maintainers)

Top GitHub Comments

1 reaction
sarthmit commented, Feb 22, 2021

Thank you very much!

0 reactions
sarthmit commented, Feb 25, 2021

Thanks @takase! That solved the problem!

Read more comments on GitHub >

Top Results From Across the Web

Lessons on Parameter Sharing across Layers in Transformers
The proposed approach relaxes a widely used technique, which shares parameters for one layer with all layers such as Universal Transformers ...
Read more > (a rough sketch of this relaxed-sharing idea appears after this results list)
Lessons on Parameter Sharing across Layers in Transformers
This work proposes a novel parameter sharing method for Transformers that relaxes a widely used technique, which shares the parameters of ...
Read more >
Parameter Sharing Methods for Multilingual Self-Attentional ...
… parameter sharing strategies for the Transformer model using MTL, mainly for one-to-many multilingual translation. Here, we will use the symbol Θ.
Read more >
UNIVERSAL TRANSFORMERS - OpenReview
Then, by applying a transition function (shared across position and time) to the outputs of the self-attention mechanism, independently at each position. As …
Read more >
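
The "relaxed" sharing mentioned in the Lessons on Parameter Sharing across Layers in Transformers results above can be pictured as using M unique parameter sets spread across N layer positions, instead of a single set shared by all layers. The sketch below only illustrates that idea; the helper name and the simple cyclic assignment are assumptions on my part, not the paper's reference implementation.

import torch.nn as nn

def build_cycled_layers(num_layers, num_unique, make_layer):
    # Create num_unique independently parameterized layers and assign them to
    # num_layers positions in a repeating pattern: 0, 1, ..., M-1, 0, 1, ...
    unique = [make_layer() for _ in range(num_unique)]
    return nn.ModuleList([unique[i % num_unique] for i in range(num_layers)])

# 6 layer slots, 3 unique parameter sets -> assignment 0,1,2,0,1,2.
# num_unique=1 recovers the Universal-Transformer case (everything shared);
# num_unique=num_layers recovers the vanilla Transformer (nothing shared).
layers = build_cycled_layers(
    num_layers=6,
    num_unique=3,
    make_layer=lambda: nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True),
)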
