
Hello, thanks for this project. I was able to train a structshot model using the train script. Could you show how to correctly run inference on an input sequence? My understanding is that loading the model would look like this:

import os
import torch

from fewnerd.util.word_encoder import BERTWordEncoder
from fewnerd.model.proto import Proto
from fewnerd.model.nnshot import NNShot

# cache dir
cache_dir = os.getenv("cache_dir", "../../models")
model_path = 'structshot-inter-5-5-seed0.pth.tar'
model_name = 'structshot'
pretrain_ckpt = 'bert-base-uncased'
max_length = 100

# BERT word encoder
word_encoder = BERTWordEncoder(
    pretrain_ckpt,
    max_length)

if model_name == 'proto':
    # use dot instead of L2 distance for proto
    model = Proto(word_encoder, dot=True)
elif model_name == 'nnshot':
    model = NNShot(word_encoder, dot=False)
elif model_name == 'structshot':
    model = NNShot(word_encoder, dot=False)

model.load_state_dict(torch.load(os.path.join(cache_dir, model_path)))

but this way I get errors on the state dict (RuntimeError: Error(s) in loading state_dict for NNShot:)…

Thank you in advance!
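One common cause of this kind of RuntimeError (an assumption worth checking against the actual traceback) is that the training script saved the checkpoint as a wrapper dict, e.g. {'state_dict': ...}, rather than a bare state dict. A minimal sketch of unwrapping such a file, using a toy model in place of NNShot and an in-memory buffer in place of the .pth.tar file:

```python
import io
import torch
import torch.nn as nn

# Toy model standing in for NNShot; only the loading pattern matters here.
model = nn.Linear(4, 2)

# Simulate a checkpoint saved as a wrapper dict, as many training
# scripts do (hypothetical here; inspect your own .pth.tar to confirm).
buffer = io.BytesIO()
torch.save({'state_dict': model.state_dict()}, buffer)
buffer.seek(0)

checkpoint = torch.load(buffer)
# Unwrap before calling load_state_dict; passing the wrapper dict
# directly is a common cause of "Error(s) in loading state_dict".
state_dict = checkpoint.get('state_dict', checkpoint)
model.load_state_dict(state_dict)
```

If the key names still mismatch (e.g. a "module." prefix from DataParallel training), printing checkpoint.keys() and the first few state-dict keys usually reveals the difference.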

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 9

Top GitHub Comments

1 reaction
pratikchhapolika commented, Sep 2, 2022

@cyl628 For a single input sequence, how should we write the inference script?
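As a generic illustration of what such a single-sequence inference step could look like: NNShot-style models label each query token by its nearest support token in embedding space. The sketch below shows that idea in plain PyTorch; it is not the repo's actual API, and the random tensors are stand-ins for BERTWordEncoder outputs:

```python
import torch

torch.manual_seed(0)

# Hypothetical embeddings: 6 support tokens and 4 query tokens, dim 8.
# In practice these would come from the trained word encoder.
support_emb = torch.randn(6, 8)
support_tags = torch.tensor([0, 0, 1, 1, 2, 2])  # token-level labels
query_emb = torch.randn(4, 8)

# Token-level nearest neighbor: squared L2 distance between every
# query token and every support token.
dists = torch.cdist(query_emb, support_emb) ** 2  # shape (4, 6)
nearest = dists.argmin(dim=1)        # index of the closest support token
pred_tags = support_tags[nearest]    # shape (4,) predicted labels
```

The structshot variant additionally applies a Viterbi decoding pass over these token-level scores, which this sketch omits.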

0 reactions
loretoparisi commented, Sep 13, 2022

@loretoparisi were you able to run inference on the trained model?

Thanks for the question; it was a while ago, but I think so, with some tricks. I have to check the notebook for the code…

Ok, thanks. Could you please attach your notebook code and data? @loretoparisi

I have something here; I will add the inference code there!

@loretoparisi Have you uploaded the inference script?

Not at this time; I will put it in the repo, check for updates, thanks.
