
Composed linear layers?

See original GitHub issue

Hey @tatp22, great repo!

I’m having trouble wrapping my head around the w_q, w_k, and w_v linear layers in the LinearAttentionHead module. Are they needed? There’s no activation between them and the preceding linear layers to_q, to_k, and to_v in MHAttention, so they shouldn’t add any expressivity to the model: multiplying two weight matrices together is equivalent to a single linear layer. The E and F projections also seem to be composed with w_k and w_v without a non-linearity in between.
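
To make the composition point concrete, here’s a minimal sketch in plain PyTorch (not the repo’s actual code; the dimensions and layer names are just placeholders) showing that two stacked Linear layers with no activation in between collapse into a single Linear layer:

```python
# Minimal sketch (plain PyTorch, illustrative names only): two stacked
# Linear layers with no activation in between are equivalent to a single
# Linear layer whose weight is the product of the two weight matrices.
import torch
import torch.nn as nn

torch.manual_seed(0)
d_model, d_head = 8, 4

to_q = nn.Linear(d_model, d_head, bias=False)  # stands in for to_q in MHAttention
w_q = nn.Linear(d_head, d_head, bias=False)    # stands in for w_q in LinearAttentionHead

x = torch.randn(2, d_model)
composed = w_q(to_q(x))                        # w_q applied right after to_q, no activation

merged = nn.Linear(d_model, d_head, bias=False)
with torch.no_grad():
    merged.weight.copy_(w_q.weight @ to_q.weight)  # single collapsed weight matrix

print(torch.allclose(composed, merged(x), atol=1e-6))  # True
```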

Looking at Eq. 7 from the paper, though, your implementation does seem correct.
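
For reference (quoting from memory, so the exact notation may differ), Eq. 7 in the Linformer paper defines each head roughly as

$$\overline{\mathrm{head}_i} = \mathrm{Attention}\big(QW_i^Q,\; E_i K W_i^K,\; F_i V W_i^V\big) = \mathrm{softmax}\!\left(\frac{QW_i^Q\,(E_i K W_i^K)^\top}{\sqrt{d_k}}\right) \cdot F_i V W_i^V ,$$

i.e. the E_i and F_i projections are applied directly after the K/V projections with no non-linearity in between, which matches what the code does.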

Any thoughts on this?

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
tatp22 commented, Jun 28, 2020

No prob 😃 Merged, and the latest version is available as 0.10.0 on pip

1 reaction
apeguero1 commented, Jun 27, 2020

Great sounds good, I’ll take a look!

Read more comments on GitHub >

