
Add unscaled option for embeddings (at least ASE/LSE)

See original GitHub issue

Is your feature request related to a problem? Please describe.

Working on something now where I often want the unscaled version of the ASE/LSE embedding, that is, just the left or right singular vectors without scaling each by sqrt(singular value).
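For concreteness, here is a minimal numpy sketch of the distinction being requested (graspologic's ASE additionally does diagonal augmentation and automatic dimension selection, which are omitted here):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((20, 20))
A = (A + A.T) / 2  # symmetric, adjacency-like matrix

d = 3  # embedding dimension
U, s, Vt = np.linalg.svd(A)
scaled = U[:, :d] * np.sqrt(s[:d])  # current ASE output: U * sqrt(S)
unscaled = U[:, :d]                 # requested option: singular vectors only
```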

Describe the solution you’d like

Would like a scaled=False keyword argument, default kept as it is.

Describe alternatives you’ve considered

It's obviously easy to unscale after getting the current embedding; it's just less convenient. An unscaled option does make the embeddings even closer to a plain SVD, but ASE at least has diag_aug, and LSE has preprocessing to form the Laplacian, so they would still differ.
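The post-hoc workaround can be sketched as a one-liner: divide each embedding column by the square root of its singular value. (In graspologic I believe the fitted estimator exposes the values as `singular_values_`, but treat that attribute name as an assumption; the helper below is pure numpy.)

```python
import numpy as np

def unscale(X, singular_values):
    """Undo the sqrt(singular value) column scaling of a spectral embedding.

    X: (n, d) scaled embedding, e.g. from AdjacencySpectralEmbed().fit_transform(A)
    singular_values: (d,) array of the corresponding singular values
    """
    return X / np.sqrt(singular_values)

# toy round-trip check: scale orthonormal columns, then unscale them
U = np.linalg.qr(np.random.default_rng(1).random((10, 2)))[0]
s = np.array([4.0, 1.0])
X_scaled = U * np.sqrt(s)
X_unscaled = unscale(X_scaled, s)
```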

Provide references (if applicable)

At least one of the very early embedding papers had unscaled as an option: https://arxiv.org/abs/1108.2228

Additional context

This would also make the code in MASE simpler, I think: https://github.com/microsoft/graspologic/blob/dev/graspologic/embed/mase.py#L158 could just call the new ASE, passing down `scaled`.

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Comments: 22 (13 by maintainers)

Top GitHub Comments

3 reactions
rajpratyush commented, Feb 19, 2021

@bdpedigo I don't care whether graspologic is accepted into GSoC or not; I will be actively contributing to it either way. I just hoped that with GSoC I would be able to showcase my work on my resume.

1 reaction
daxpryce commented, Feb 19, 2021

@rajpratyush if there are any other ways we can help you highlight your contributions for your resume, let us know. We really appreciate your contributions and work on this project with us - you've cleaned up so many of our "we want to get to this but don't know when we'll have the time" tasks that we'd be way worse off without all the help you've given.

So if there’s some other way we can assist, please reach out - it’s the least we can do!

