Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Example of an embedding loader?

See original GitHub issue

Hi,

is there any chance of an example of a pretrained word-embedding loader?

A single example of how to quickly load, say, word2vec or GloVe would be really cool. I guess once people see a common example and use it, it should be straightforward to adapt the loader to other pretrained embeddings.

Thanks a lot 👍

PS - I saw this thread on the opennmt forum, but I couldn’t get it to work.

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 7 (1 by maintainers)

Top GitHub Comments

11 reactions
mataney commented, Apr 27, 2017

This might help as well: assuming you have a model variable that is the word2vec or GloVe embedding lookup dictionary, model is probably a word->vec mapping. Let’s first create an index->vec mapping and call it pretrained_embeddings_matrix (just loop over the word indices and insert each vec at its index location).

Now let’s say you have your embedding in a variable called embedding; you can just do: embedding.weight.data.copy_(torch.from_numpy(pretrained_embeddings_matrix))
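Putting those two steps together, here is a minimal sketch of that recipe. The GloVe filename, the toy vocab, and the random-init fallback for unknown words are illustrative assumptions; only the final copy_ line comes from the comment above.

```python
import numpy as np
import torch
import torch.nn as nn

embedding_dim = 300
vocab = {"the": 0, "cat": 1, "sat": 2}  # hypothetical word -> index mapping

# 1) Load the pretrained vectors into a word -> vec dict (the `model` above),
#    assuming a GloVe-style text file with "word v1 v2 ... vd" per line.
model = {}
with open("glove.6B.300d.txt", encoding="utf-8") as f:
    for line in f:
        parts = line.rstrip().split(" ")
        model[parts[0]] = np.asarray(parts[1:], dtype=np.float32)

# 2) Build the index -> vec matrix; words missing from `model` keep a
#    small random initialization.
pretrained_embeddings_matrix = np.random.normal(
    scale=0.1, size=(len(vocab), embedding_dim)
).astype(np.float32)
for word, idx in vocab.items():
    if word in model:
        pretrained_embeddings_matrix[idx] = model[word]

# 3) Copy the matrix into the nn.Embedding weights in place.
embedding = nn.Embedding(len(vocab), embedding_dim)
embedding.weight.data.copy_(torch.from_numpy(pretrained_embeddings_matrix))
```

From here you can freeze the weights with embedding.weight.requires_grad = False if you don’t want the pretrained vectors fine-tuned during training.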

5 reactions
AjayTalati commented, Apr 29, 2017

Hi Mataney, @mataney

thanks a lot for the hint 😃

I just saw this repo, which seems to do what I want; just thought it would be helpful to post it here:

https://github.com/iamalbert/pytorch-wordemb

Read more comments on GitHub

Top Results From Across the Web

Word embeddings in 2020. Review with code examples
In this article we will study word embeddings, digital representations of words suitable for processing by machine learning algorithms.

Getting Started With Embeddings - Hugging Face
An embedding is a numerical representation of a piece of information, for example, text, documents, images, audio, etc.

Word Embedding and Word2Vec Model with Example - Guru99
1. The first step in implementing any machine learning model or natural language processing is data collection. 2. It is very important...

How to Use Word Embedding Layers for Deep Learning with ...
For example, below we define an Embedding layer with a vocabulary of 200 (e.g. integer-encoded words from 0 to 199, inclusive), a...

Word embeddings | Text - TensorFlow
For text or sequence problems, the Embedding layer takes a 2D tensor of integers, of shape (samples, sequence_length), where each entry is...
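To make the last two snippets concrete, here is a minimal sketch of the Embedding-layer behavior they describe, using tf.keras; the layer sizes and the toy batch are made up for illustration and are not taken from the linked articles.

```python
import numpy as np
import tensorflow as tf

# An Embedding layer with a vocabulary of 200 (integer-encoded words
# 0..199, inclusive), mapping each id to an 8-dimensional vector.
layer = tf.keras.layers.Embedding(input_dim=200, output_dim=8)

# A 2D integer tensor of shape (samples, sequence_length), as described
# in the TensorFlow snippet above: here, 2 samples of 4 token ids each.
batch = np.array([[1, 5, 42, 199],
                  [0, 7, 3, 12]])

vectors = layer(batch)
print(vectors.shape)  # (2, 4, 8): one 8-dim vector per token id
```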
