
Protein Embedding with last activation layers?

See original GitHub issue

Is it possible to obtain the last activation values using AlphaFold?

Something like what ESM allows with its model.forward method.
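For context, this is roughly what the ESM side of that comparison looks like (a minimal sketch using the fair-esm package; the particular checkpoint and layer index are just one common choice and are not taken from this issue):

    import torch
    import esm

    # Load a pretrained ESM-1b model and its alphabet (requires the fair-esm package).
    model, alphabet = esm.pretrained.esm1b_t33_650M_UR50S()
    batch_converter = alphabet.get_batch_converter()
    model.eval()

    data = [("protein1", "MKTVRQERLKSIVRILERSKEPVSGAQLAEELSVSRQVIVQDIAYLRSLGYNIVAT")]
    labels, strs, tokens = batch_converter(data)

    # Ask the forward pass to also return the hidden states of the final (33rd) layer.
    with torch.no_grad():
        out = model(tokens, repr_layers=[33])

    token_representations = out["representations"][33]  # (batch, seq_len, 1280)
    # Mean-pool over residues (skipping the BOS token) to get one embedding per sequence.
    sequence_embedding = token_representations[0, 1:len(strs[0]) + 1].mean(dim=0)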

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 5
  • Comments: 9

Top GitHub Comments

4 reactions
ricomnl commented, Jul 26, 2021

@xinformatics The first section of the article The AlphaFold2 Method Paper: A Fount of Good Ideas suggests that s_i is the embedding you want to use. This would correspond to the 'single' key in the prediction_result['representations'] dict.

At every step of the process, {s_i} is kept updated, communicating back and forth with {z_{ij}}, so that whatever is built up in {z_{ij}} is made accessible to {s_i}. As a result {s_i} is front and center in all the major modules. And at the end, in the structure module, it is ultimately {s_i}, not {z_{ij}}, that encodes the structure (where the quaternions get extracted to generate the structure). This avoids the awkwardness of having to project the 2D representation onto 3D space.
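If the model run is set up to return representations, pulling that embedding out might look something like the sketch below. This is not code from the thread: whether prediction_result actually contains a 'representations' dict depends on how the AlphaFold model was configured, and the mean-pooling step is just one common way to get a fixed-size embedding.

    import numpy as np

    # Assumes `prediction_result` came from an AlphaFold model run configured to
    # return intermediate representations (not every setup exposes them).
    single_repr = prediction_result['representations']['single']  # per-residue, (num_residues, channels)

    # Mean-pool over the residue dimension to get one sequence-level embedding
    # for downstream tasks.
    sequence_embedding = np.asarray(single_repr).mean(axis=0)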

3 reactions
xinformatics commented, Jul 21, 2021

@tfgg Could you suggest which representation would be a good choice as a protein embedding for downstream tasks, since I get 5 different representations from the prediction result?

Read more comments on GitHub >

Top Results From Across the Web

  • FoldHSphere: deep hyperspherical embeddings for protein ...
    "The last layer performs a linear classification of the 512-dimensional embeddings using K output units. Here, K is the number of fold classes..."
  • Improving protein succinylation sites prediction using ... - Nature
    "Embedding layers in Keras work by treating peptides as documents and individual amino acids within that peptide as words. Initially, each amino..."
  • Application of Sequence Embedding in Protein ... - arXiv
    "Here, we review different approaches of protein sequence embeddings and ... extracting embeddings derived from the hidden state of the last attention layer..."
  • Incorporating Deep Learning With Word Embedding to Identify ...
    "The first method is embedding layer in neural network (Neishi et al., 2017); the essence of embedding layer is a fully connected neural..."
  • Bayesian neural network with pretrained protein embedding ...
    "In the classifier block, the feature vector x passes fully connected layers with ReLU activation to output the final prediction value."
