SpaCy 3 support
I have SpaCy 3 installed:
$ spacy info
============================== Info about spaCy ==============================
spaCy version 3.0.3
Location /opt/anaconda3/envs/XXX/lib/python3.8/site-packages/spacy
Platform Linux-5.4.0-65-generic-x86_64-with-glibc2.10
Python version 3.8.5
Pipelines en_core_web_sm (3.0.0)
I have installed spacy-wordnet:
$ pip install spacy-wordnet
Collecting spacy-wordnet
Downloading spacy-wordnet-0.0.4.tar.gz (648 kB)
|████████████████████████████████| 648 kB 6.5 MB/s
Collecting nltk<3.4,>=3.3
Downloading nltk-3.3.0.zip (1.4 MB)
|████████████████████████████████| 1.4 MB 9.8 MB/s
Requirement already satisfied: six in /opt/anaconda3/envs/syllabus/lib/python3.8/site-packages (from nltk<3.4,>=3.3->spacy-wordnet) (1.15.0)
Building wheels for collected packages: spacy-wordnet, nltk
Building wheel for spacy-wordnet (setup.py) ... done
Created wheel for spacy-wordnet: filename=spacy_wordnet-0.0.4-py2.py3-none-any.whl size=650293 sha256=73e3b3c9921a3a9fb841638b1fd3ab4d6442563a7cab510c5c663fa74849e242
Stored in directory: /home/XXX/.cache/pip/wheels/d5/4a/26/0311c16a5294b36a6e018c0816f9e61a5377287fdd276e0f5c
Building wheel for nltk (setup.py) ... done
Created wheel for nltk: filename=nltk-3.3-py3-none-any.whl size=1394469 sha256=75e771dc87340388bb5d475c1732827b2987617b059307b5ff6d123bb75bf656
Stored in directory: /home/XXX/.cache/pip/wheels/19/1d/3a/0a8c14c30132b4f9ffd796efbb6746f15b3d6bcfc1055a9346
Successfully built spacy-wordnet nltk
Installing collected packages: nltk, spacy-wordnet
Successfully installed nltk-3.3 spacy-wordnet-0.0.4
And tried to use it in Python:
import spacy
from spacy_wordnet.wordnet_annotator import WordnetAnnotator
nlp = spacy.load("en_core_web_sm")
nlp.add_pipe('WordnetAnnotator', after="tagger")
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-13-a2750d534e73> in <module>
----> 1 nlp.add_pipe('WordnetAnnotator', after="tagger")
/opt/anaconda3/envs/XXX/lib/python3.8/site-packages/spacy/language.py in add_pipe(self, factory_name, name, before, after, first, last, source, config, raw_config, validate)
765 lang_code=self.lang,
766 )
--> 767 pipe_component = self.create_pipe(
768 factory_name,
769 name=name,
/opt/anaconda3/envs/XXX/lib/python3.8/site-packages/spacy/language.py in create_pipe(self, factory_name, name, config, raw_config, validate)
636 lang_code=self.lang,
637 )
--> 638 raise ValueError(err)
639 pipe_meta = self.get_factory_meta(factory_name)
640 config = config or {}
ValueError: [E002] Can't find factory for 'WordnetAnnotator' for language English (en). This usually happens when spaCy calls `nlp.create_pipe` with a custom component name that's not registered on the current language class. If you're using a Transformer, make sure to install 'spacy-transformers'. If you're using a custom component, make sure you've added the decorator `@Language.component` (for function components) or `@Language.factory` (for class components).
Available factories: attribute_ruler, tok2vec, merge_noun_chunks, merge_entities, merge_subtokens, token_splitter, parser, beam_parser, entity_linker, ner, beam_ner, entity_ruler, lemmatizer, tagger, morphologizer, senter, sentencizer, textcat, textcat_multilabel, en.lemmatizer
I suppose that SpaCy 3 is not yet supported.
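(For context, spacy-wordnet 0.0.4 targets the spaCy 2.x add_pipe API, which accepted a component instance rather than a registered factory name. A minimal sketch of that older usage, assuming the 0.0.x README style:)

import spacy
from spacy_wordnet.wordnet_annotator import WordnetAnnotator

nlp = spacy.load("en_core_web_sm")
# spaCy 2.x accepted the component object itself, so no factory registration was needed
nlp.add_pipe(WordnetAnnotator(nlp.lang), after="tagger")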
Hi,
spaCy changed its custom pipeline registration API. A wrapper worked for me (see @Language.component("wordnet") and nlp.add_pipe(…)):
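The wrapper code itself isn't reproduced here; a minimal sketch of that idea, assuming spacy-wordnet 0.0.4 (where WordnetAnnotator takes the language code in its constructor and returns the doc when called):

import spacy
from spacy.language import Language
from spacy_wordnet.wordnet_annotator import WordnetAnnotator

# Build the annotator once (assumes the 0.0.4 constructor signature) and wrap it
# in a function component, so spaCy 3's string-based nlp.add_pipe() can find it.
annotator = WordnetAnnotator("en")

@Language.component("wordnet")
def wordnet_component(doc):
    return annotator(doc)

nlp = spacy.load("en_core_web_sm")
nlp.add_pipe("wordnet", after="tagger")
doc = nlp("I like apples")  # tokens should now carry the token._.wordnet extension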
Fixed in #10