Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Help Requested for Embeddings Trained on Specific Texts

See original GitHub issue

Hello, I am looking into the embedding model generation, and I was trying to figure out whether there is a way to take this module and train it on a specific body of text, similar to how model = gensim.models.Word2Vec(abc.sents()) works. Then I could build my own model instead of using the general one for all words. Thank you. https://docs.cltk.org/en/latest/_modules/cltk/embeddings/embeddings.html#CLTKWord2VecEmbeddings

Issue Analytics

  • State: closed
  • Created: 9 months ago
  • Reactions: 1
  • Comments: 7 (4 by maintainers)

Top GitHub Comments

1 reaction
clemsciences commented, Dec 20, 2022

I’m closing this issue, but you can still ask questions.

1 reaction
egreven commented, Dec 19, 2022

Thank you very much!

Read more comments on GitHub >

Top Results From Across the Web

Getting Started With Embeddings - Hugging Face
1. Embedding a dataset ... The first step is selecting an existing pre-trained model for creating the embeddings. We can choose a model...
Read more >
Use Pre-trained Word Embedding to detect real disaster tweets
The task consists in predicting whether or not a given tweet is about a real disaster. To address this text classification task we...
Read more >
What Are Word Embeddings for Text?
Word embeddings are in fact a class of techniques where individual words are represented as real-valued vectors in a predefined vector space.
Read more >
Training, Visualizing, and Understanding Word Embeddings
We found a model to create embeddings: We used some example code for the Word2Vec model to help us understand how to create...
Read more >
Sentiment Specific Word Embedding - Kaggle
In simple terms, Word Embedding is a way of converting texts into numbers for the machine to understand that text. When applying one-hot...
Read more >

