
I have spaCy 3 installed:

$ spacy info

============================== Info about spaCy ==============================

spaCy version    3.0.3                         
Location         /opt/anaconda3/envs/XXX/lib/python3.8/site-packages/spacy
Platform         Linux-5.4.0-65-generic-x86_64-with-glibc2.10
Python version   3.8.5                         
Pipelines        en_core_web_sm (3.0.0)        
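The same versions can be cross-checked from inside the interpreter, which helps rule out a notebook or shell pointing at a different environment. A minimal sketch (spacy.info() returns the same data as the CLI, as a dict):

import spacy

# Confirm the interpreter sees the same installation that `spacy info` reported
print(spacy.__version__)   # expected: 3.0.3
print(spacy.info())        # dict with version, location, pipelines, ...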

I have installed spacy-wordnet:

$ pip install spacy-wordnet

Collecting spacy-wordnet
  Downloading spacy-wordnet-0.0.4.tar.gz (648 kB)
     |████████████████████████████████| 648 kB 6.5 MB/s 
Collecting nltk<3.4,>=3.3
  Downloading nltk-3.3.0.zip (1.4 MB)
     |████████████████████████████████| 1.4 MB 9.8 MB/s 
Requirement already satisfied: six in /opt/anaconda3/envs/syllabus/lib/python3.8/site-packages (from nltk<3.4,>=3.3->spacy-wordnet) (1.15.0)
Building wheels for collected packages: spacy-wordnet, nltk
  Building wheel for spacy-wordnet (setup.py) ... done
  Created wheel for spacy-wordnet: filename=spacy_wordnet-0.0.4-py2.py3-none-any.whl size=650293 sha256=73e3b3c9921a3a9fb841638b1fd3ab4d6442563a7cab510c5c663fa74849e242
  Stored in directory: /home/XXX/.cache/pip/wheels/d5/4a/26/0311c16a5294b36a6e018c0816f9e61a5377287fdd276e0f5c
  Building wheel for nltk (setup.py) ... done
  Created wheel for nltk: filename=nltk-3.3-py3-none-any.whl size=1394469 sha256=75e771dc87340388bb5d475c1732827b2987617b059307b5ff6d123bb75bf656
  Stored in directory: /home/XXX/.cache/pip/wheels/19/1d/3a/0a8c14c30132b4f9ffd796efbb6746f15b3d6bcfc1055a9346
Successfully built spacy-wordnet nltk
Installing collected packages: nltk, spacy-wordnet
Successfully installed nltk-3.3 spacy-wordnet-0.0.4
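One thing pip does not install is the WordNet data itself: spacy-wordnet builds on NLTK's corpora, which the spacy-wordnet README suggests downloading once (assuming the wordnet and omw corpora are all that is needed):

import nltk

# spacy-wordnet queries NLTK's WordNet under the hood; fetch the corpora once
nltk.download("wordnet")
nltk.download("omw")   # Open Multilingual WordNet, used for non-English synsets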

And I tried to use it in Python:

import spacy
from spacy_wordnet.wordnet_annotator import WordnetAnnotator

nlp = spacy.load("en_core_web_sm")
nlp.add_pipe('WordnetAnnotator', after="tagger")
---------------------------------------------------------------------------                          
ValueError                                Traceback (most recent call last)
<ipython-input-13-a2750d534e73> in <module>
----> 1 nlp.add_pipe('WordnetAnnotator', after="tagger")

/opt/anaconda3/envs/XXX/lib/python3.8/site-packages/spacy/language.py in add_pipe(self, factory_name, name, before, after, first, last, source, config, raw_config, validate)
    765                     lang_code=self.lang,
    766                 )                         
--> 767             pipe_component = self.create_pipe(
    768                 factory_name,
    769                 name=name,

/opt/anaconda3/envs/XXX/lib/python3.8/site-packages/spacy/language.py in create_pipe(self, factory_name, name, config, raw_config, validate)
    636                 lang_code=self.lang,
    637             )                             
--> 638             raise ValueError(err)
    639         pipe_meta = self.get_factory_meta(factory_name)
    640         config = config or {}

ValueError: [E002] Can't find factory for 'WordnetAnnotator' for language English (en). This usually happens when spaCy calls `nlp.create_pipe` with a custom component name that's not registered on the current language class. If you're using a Transformer, make sure to install 'spacy-transformers'. If you're using a custom component, make sure you've added the decorator `@Language.component` (for function components) or `@Language.factory` (for class components).                                        

Available factories: attribute_ruler, tok2vec, merge_noun_chunks, merge_entities, merge_subtokens, token_splitter, parser, beam_parser, entity_linker, ner, beam_ner, entity_ruler, lemmatizer, tagger, morphologizer, senter, sentencizer, textcat, textcat_multilabel, en.lemmatizer                          

I suppose that spaCy 3 is not yet supported.
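For context, the E002 message describes the actual change rather than missing support: in spaCy 3, nlp.add_pipe expects the string name of a registered factory, whereas spaCy 2 accepted a component instance. A minimal sketch of the new registration style (the component name my_component is purely illustrative):

import spacy
from spacy.language import Language

@Language.component("my_component")
def my_component(doc):
    # no-op component, shown only to illustrate registration by name
    return doc

nlp = spacy.load("en_core_web_sm")
nlp.add_pipe("my_component", after="tagger")   # add by registered name, not by object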

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 6 (1 by maintainers)

Top GitHub Comments

5 reactions
philipp-kohl commented, Apr 9, 2021

Hi,

spaCy changed how custom pipeline components are registered. A wrapper worked for me (see @Language.component("wordnet") and nlp.add_pipe(...) below):

from spacy import Language
import spacy
from spacy_wordnet.wordnet_annotator import WordnetAnnotator


# Load a spaCy model (supported languages are "es" and "en")
nlp = spacy.load('en_core_web_sm')

spacy_wordnet_annotator = WordnetAnnotator(nlp.lang)
@Language.component("wordnet")
def spacy_wordnet_wrapper(doc):
    return spacy_wordnet_annotator(doc)

nlp.add_pipe("wordnet", after='tagger')
token = nlp('prices')[0]

# The wordnet extension links the spaCy token to the NLTK WordNet interface, giving access to
# synsets and lemmas
token._.wordnet.synsets()
token._.wordnet.lemmas()

# And automatically tags with wordnet domains
token._.wordnet.wordnet_domains()
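A factory is the equivalent spaCy 3 mechanism for class-based components; the sketch below assumes WordnetAnnotator still takes only a language code, as in 0.0.4, and the factory name wordnet_factory is arbitrary:

from spacy.language import Language
from spacy_wordnet.wordnet_annotator import WordnetAnnotator

@Language.factory("wordnet_factory")
def create_wordnet_annotator(nlp, name):
    # spaCy passes the current nlp object and the component name;
    # only the language code is needed to build the annotator
    return WordnetAnnotator(nlp.lang)

nlp.add_pipe("wordnet_factory", after="tagger")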

0 reactions
frascuchon commented, Apr 13, 2021

Fixed in #10
