
Loading NER model from dump doesn't work

See original GitHub issue

I’m using the EntityRecognizer.load method to load my model. This seems to run successfully, but the resulting EntityRecognizer doesn’t work properly, whereas the original one does.

The original training script at https://github.com/explosion/spaCy/blob/master/examples/training/train_ner.py outputs (columns: text, tag_, ent_type_, ent_iob):

Who WP  2
is VBZ  2
Shaka NNP PERSON 3
Khan NNP PERSON 1
? .  2

The loaded one outputs:

$ python3 load_ner.py
Who WP  2
is VBZ  2
Shaka NNP  2
Khan NNP  2
? .  2

(notice the missing ent_type_!)

I’ve included code to repro here: https://github.com/savvopoulos/spaCy/commit/943f979ce4b80837b873fd3d10091c6db5b4b484

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Comments: 6 (5 by maintainers)

Top GitHub Comments

5 reactions
honnibal commented, Dec 1, 2016

Hey,

Sorry about the lack of clarity on this. I think others have been having trouble with this too.

There are a few problems with your scripts.

  1. EntityRecognizer.load() is a classmethod: it returns a new instance of EntityRecognizer rather than modifying the instance in place. So, your line ner.load() does nothing. What you need is ner = EntityRecognizer.load() (see the sketch after this list).

  2. You should save and load the vocabulary as well as the entity recognition model.

  3. You should either save and load the tagger, or not use the tagger when running the NER.
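
For illustration, here’s a minimal before/after sketch (reusing the arguments from the load script below):

# Does nothing: load() is a classmethod, and the new instance it returns is discarded
ner.load(pathlib.Path('ner'), nlp.vocab)

# Correct: bind the instance that the classmethod returns
ner = EntityRecognizer.load(pathlib.Path('ner'), nlp.vocab, require=True)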

These seem to work for me. I’ll work on getting the docs fixed.

# Train NER

# encoding: utf8
from __future__ import unicode_literals, print_function
import ujson as json
import pathlib
import random

import spacy
from spacy.pipeline import EntityRecognizer
from spacy.gold import GoldParse
from spacy.tagger import Tagger


def train_ner(nlp, train_data, entity_types):
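    # Intern every training word in the vocab up front, so new words get
    # lexeme entries before the entity recognizer is created.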
    for raw_text, _ in train_data:
        doc = nlp.make_doc(raw_text)
        for word in doc:
            _ = nlp.vocab[word.orth]
    ner = EntityRecognizer(nlp.vocab, entity_types=entity_types)
    for itn in range(5):
        random.shuffle(train_data)
        for raw_text, entity_offsets in train_data:
            doc = nlp.make_doc(raw_text)
            gold = GoldParse(doc, entities=entity_offsets)
            ner.update(doc, gold)
    ner.model.end_training()
    return ner


def main(model_dir=None):
    if model_dir is not None:
        model_dir = pathlib.Path(model_dir)
        if not model_dir.exists():
            model_dir.mkdir()
        assert model_dir.is_dir()

    nlp = spacy.load('en', parser=False, entity=False, add_vectors=False)

    # v1.1.2 onwards
    if nlp.tagger is None:
        print('---- WARNING ----')
        print('Data directory not found')
        print('please run: `python -m spacy.en.download --force all` for better performance')
        print('Using feature templates for tagging')
        print('-----------------')
        nlp.tagger = Tagger(nlp.vocab, features=Tagger.feature_templates)

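    # Entity annotations are (start_char, end_char, label) offsets into the
    # raw text; len() of the leading substring gives each offset.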
    train_data = [
        (
            'Who is Shaka Khan?',
            [(len('Who is '), len('Who is Shaka Khan'), 'PERSON')]
        ),
        (
            'I like London and Berlin.',
            [(len('I like '), len('I like London'), 'LOC'),
             (len('I like London and '), len('I like London and Berlin'), 'LOC')]
        )
    ]
    ner = train_ner(nlp, train_data, ['PERSON', 'LOC'])

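    # Try the freshly trained model. The tagger is deliberately skipped here
    # (point 3 above): the same pipeline must be used again at load time.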
    doc = nlp.make_doc('Who is Shaka Khan?')
    #nlp.tagger(doc)
    ner(doc)
    for word in doc:
        print(word.text, word.orth, word.lower, word.tag_, word.ent_type_, word.ent_iob)

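    # Persist everything needed to reload the recognizer: its config, the
    # model weights, and the vocab (lexemes plus the string store).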
    if model_dir is not None:
        with (model_dir / 'config.json').open('w', encoding='utf8') as file_:
            json.dump(ner.cfg, file_)
        ner.model.dump(str(model_dir / 'model'))
        if not (model_dir / 'vocab').exists():
            (model_dir / 'vocab').mkdir()
        ner.vocab.dump(str(model_dir / 'vocab' / 'lexemes.bin'))
        with (model_dir / 'vocab' / 'strings.json').open('w', encoding='utf8') as file_:
            ner.vocab.strings.dump(file_)


if __name__ == '__main__':
    main('ner')
    # Who "" 2
    # is "" 2
    # Shaka "" PERSON 3
    # Khan "" PERSON 1
    # ? "" 2

# Load NER
from __future__ import unicode_literals
import spacy
import pathlib
from spacy.pipeline import EntityRecognizer
from spacy.vocab import Vocab


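# Load English without the default parser and entity recognizer, then restore
# the saved string store and lexemes before loading the NER model (point 2 above).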
nlp = spacy.load('en', parser=False, entity=False, add_vectors=False)
vocab_dir = pathlib.Path('ner/vocab')
with (vocab_dir / 'strings.json').open('r', encoding='utf8') as file_:
    nlp.vocab.strings.load(file_)
nlp.vocab.load_lexemes(vocab_dir / 'lexemes.bin')
ner = EntityRecognizer.load(pathlib.Path("ner"), nlp.vocab, require=True)
doc = nlp.make_doc('Who is Shaka Khan?')
#nlp.tagger(doc)
ner(doc)
for word in doc:
    print(word.text, word.orth, word.lower, word.ent_type_)
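
If the training script above ran and saved into the ner directory, this should now print PERSON as the ent_type_ for Shaka and Khan, instead of the empty values from the original report.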
0 reactions
lock[bot] commented, May 9, 2018

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
