MemoryError on spacy.load('en')
See original GitHub issue

The parser model fails to load with the error:
Traceback (most recent call last):
File "run.py", line 11, in <module>
nlp = spacy.load('en')
File "C:\Python27\lib\site-packages\spacy\__init__.py", line 47, in load
return cls(path=path, **overrides)
File "C:\Python27\lib\site-packages\spacy\language.py", line 274, in __init__
if 'parser' not in overrides \
File "C:\Python27\lib\site-packages\spacy\language.py", line 102, in create_parser
return DependencyParser.load(nlp.path / 'deps', nlp.vocab)
File "spacy/syntax/parser.pyx", line 99, in spacy.syntax.parser.Parser.load (spacy/syntax/parser.cpp:5322)
self.model.load(str(path / 'model'))
File "thinc/linear/avgtron.pyx", line 89, in thinc.linear.avgtron.AveragedPerceptron.load (thinc/linear/avgtron.cpp:3092)
self.weights = PreshMap(reader.nr_feat)
File "preshed/maps.pyx", line 23, in preshed.maps.PreshMap.__init__ (preshed/maps.cpp:1200)
map_init(self.mem, self.c_map, initial_size)
File "preshed/maps.pyx", line 90, in preshed.maps.map_init (preshed/maps.cpp:2876)
map_.cells = <Cell*>mem.alloc(length, sizeof(Cell))
File "cymem/cymem.pyx", line 42, in cymem.cymem.Pool.alloc (cymem/cymem.cpp:1091)
raise MemoryError("Error assigning %d bytes" % number * elem_size)
MemoryError: Error assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytesError assigning 16777216 bytes
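The run of repeated messages above is itself a side effect of an operator-precedence bug in the cymem line shown in the traceback: in `"Error assigning %d bytes" % number * elem_size`, the `%` operator binds tighter than `*`, so Python formats the string first and then string-multiplies it by `elem_size`. A minimal sketch of the precedence issue (the values are illustrative, chosen to match the 16 repeats in the traceback):

```python
number, elem_size = 16777216, 16

# cymem's expression: '%' binds tighter than '*', so Python formats the
# string first and then repeats it elem_size times.
buggy = "Error assigning %d bytes" % number * elem_size

# Parenthesizing multiplies the numbers before formatting, which is
# presumably what was intended.
fixed = "Error assigning %d bytes" % (number * elem_size)

print(buggy.count("Error assigning 16777216 bytes"))  # 16
print(fixed)  # Error assigning 268435456 bytes
```

This only garbles the error message; the underlying MemoryError is real.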
Your Environment
- Operating System: Windows 7
- Python Version Used: Python 2.7.12
- spaCy Version Used: 1.2.0
- Environment Information: a virtual environment with only spacy and its dependencies
Issue Analytics
- State: closed
- Created: 7 years ago
- Comments: 12 (5 by maintainers)
Top GitHub Comments
I get the same error with v1.2.0 on Windows with 8 GB of RAM. Judging by the other comments, it seems to be an issue with the latest release.
Any ideas?
UPDATE: I switched my Python installation to 64-bit and that solved it.
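Since the fix reported here was switching to a 64-bit interpreter, a quick way to check which build you are running before loading the model (a hedged sketch; `struct.calcsize("P")` reports the pointer size of the current Python build):

```python
import struct
import sys

# Pointer size in bytes times 8 gives the build width: 32 or 64.
bits = struct.calcsize("P") * 8
print("%d-bit Python" % bits)

# Cross-check: sys.maxsize fits in 32 bits only on a 32-bit build.
print("64-bit build:", sys.maxsize > 2 ** 32)
```

A 32-bit process on Windows can typically address only about 2 GB, so loading the spaCy 1.x parser model can fail there regardless of how much RAM the machine has.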
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.