Dependency version check fails for tokenizers
Environment info
- `transformers` version: 4.5.0
- Platform: Linux-4.15.0-134-generic-x86_64-with-glibc2.10
- Python version: 3.8.5
- PyTorch version (GPU?): 1.8.1 (False)
- Tensorflow version (GPU?): 2.4.0 (False)
- Using GPU in script?: N/A
- Using distributed or parallel set-up in script?: N/A
- `tokenizers` version: 0.10.2 (also checked 0.10.1)
Information
When importing `transformers`, the new dependency version check code (#11061) seems to fail for the `tokenizers` library: `importlib.metadata.version('tokenizers')` returns `None` instead of the version string.
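For context, a dependency check that tolerates this failure mode could fail loudly instead of passing `None` into a version parser. This is a minimal illustrative sketch, not the actual `transformers` implementation; the `check_min_version` helper name is hypothetical:

```python
from importlib.metadata import version, PackageNotFoundError

def check_min_version(dist_name, min_version):
    """Hypothetical sketch: verify an installed distribution's version,
    raising a clear error when the metadata is missing or unreadable."""
    try:
        got = version(dist_name)  # reads the *.dist-info metadata on disk
    except PackageNotFoundError:
        got = None
    if got is None:
        # This is the failure mode in this issue: corrupted dist-info makes
        # version() return None, which would crash a version parser later.
        raise RuntimeError(f"could not determine the version of {dist_name!r}")
    # Naive numeric comparison of the first three dotted components.
    got_tuple = tuple(int(x) for x in got.split(".")[:3])
    want_tuple = tuple(int(x) for x in min_version.split(".")[:3])
    if got_tuple < want_tuple:
        raise RuntimeError(f"{dist_name} >= {min_version} required, found {got}")
    return got
```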
To reproduce
Steps to reproduce the behavior:
```python
>>> import transformers
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/guyrosin/miniconda3/envs/pt/lib/python3.8/site-packages/transformers/__init__.py", line 43, in <module>
    from . import dependency_versions_check
  File "/home/guyrosin/miniconda3/envs/pt/lib/python3.8/site-packages/transformers/dependency_versions_check.py", line 41, in <module>
    require_version_core(deps[pkg])
  File "/home/guyrosin/miniconda3/envs/pt/lib/python3.8/site-packages/transformers/utils/versions.py", line 101, in require_version_core
    return require_version(requirement, hint)
  File "/home/guyrosin/miniconda3/envs/pt/lib/python3.8/site-packages/transformers/utils/versions.py", line 92, in require_version
    if want_ver is not None and not ops[op](version.parse(got_ver), version.parse(want_ver)):
  File "/home/guyrosin/miniconda3/envs/pt/lib/python3.8/site-packages/packaging/version.py", line 57, in parse
    return Version(version)
  File "/home/guyrosin/miniconda3/envs/pt/lib/python3.8/site-packages/packaging/version.py", line 296, in __init__
    match = self._regex.search(version)
TypeError: expected string or bytes-like object
```
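The final `TypeError` can be reproduced without `transformers` at all: `packaging`'s `Version.__init__` passes its argument to `re.Pattern.search`, which rejects `None`. A stdlib-only sketch (the regex below is a simplified stand-in, not packaging's real pattern):

```python
import re

# Simplified stand-in for packaging's version regex: when the dependency
# check passes got_ver=None through, re.Pattern.search rejects it with the
# same "expected string or bytes-like object" TypeError shown above.
pattern = re.compile(r"v?(?P<release>\d+(?:\.\d+)*)")
try:
    pattern.search(None)
except TypeError as exc:
    print(type(exc).__name__)  # TypeError
```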
The root problem is this:

```python
from importlib.metadata import version
version('tokenizers')  # returns None
```
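One defensive workaround is to fall back to the module's own `__version__` attribute when the installed metadata is broken. This is a hypothetical helper for illustration, not something `transformers` or `importlib` provides:

```python
import importlib
from importlib.metadata import version, PackageNotFoundError

def get_version(dist_name, module_name=None):
    """Hypothetical sketch: read a distribution's version from its metadata,
    falling back to the importable module's __version__ attribute when the
    *.dist-info metadata is missing or corrupted."""
    try:
        got = version(dist_name)
    except PackageNotFoundError:
        got = None
    if got is None and module_name is not None:
        mod = importlib.import_module(module_name)
        got = getattr(mod, "__version__", None)
    return got
```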
Expected behavior

`importlib.metadata.version('tokenizers')` should return its version string.
Issue Analytics
- Created 2 years ago
- Comments: 6 (6 by maintainers)
Top GitHub Comments
Yay! Glad it worked, @guyrosin!
OK, it seems like my environment was corrupted indeed - there was a `tokenizers-0.9.4.dist-info` folder inside my env's `site-packages` folder… After deleting it (manually) and reinstalling `tokenizers`, everything works! Thanks a lot for your help @stas00!