ImportError: tokenizers>=0.10.1,<0.11 is required for a normal functioning of this module, but found tokenizers==0.8.1rc1.
Environment info
- transformers version: 4.6.1
- Platform: Linux Mint Tricia 19.3 (Ubuntu 18.04)
- Python version: 3.8.8
- PyTorch version (GPU?): 1.7.0, gpu yes
- Tensorflow version (GPU?):
- Using GPU in script?: yes
- Using distributed or parallel set-up in script?: no
Who can help
tokenizer: @LysandreJik
Information
Model I am using (Bert, XLNet …): GPT2
The problem arises when using:
- my own modified scripts: (give details below)
- my own task or dataset: text generation
After upgrading to 4.6.1 (the same error occurs in 4.6.0), I get an ImportError when I load the tokenizer.
What I have tried
I searched for similar issues and thought this might be a duplicate of an existing one, but applying that solution changed nothing. I also uninstalled the transformers and tokenizers packages and reinstalled them, and the error persists.
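When reinstalling does not help, a common cause is a stale tokenizers wheel left behind in a shadowing site-packages directory (for example, a conda base environment shadowing a pip install). A small standard-library-only sketch to see which versions Python actually resolves:

```python
import importlib.metadata

# Print the versions Python actually resolves; a stale tokenizers wheel in a
# shadowing site-packages directory is a common cause of this mismatch.
for pkg in ("transformers", "tokenizers"):
    try:
        print(pkg, importlib.metadata.version(pkg))
    except importlib.metadata.PackageNotFoundError:
        print(pkg, "is not installed")
```

`importlib.metadata` is available from Python 3.8 onward, so it works on the reporter's 3.8.8 environment.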
To reproduce
Steps to reproduce the behavior:
- Import the tokenizer:

from transformers import (PreTrainedTokenizerFast,
                          GPT2Tokenizer)
Error message
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-5-dc540cd053e1> in <module>
----> 1 from transformers import (PreTrainedTokenizerFast,
2 PreTrainedTokenizer,
3 AutoTokenizer,
4 GPT2Tokenizer,)
5
/opt/conda/lib/python3.8/site-packages/transformers/__init__.py in <module>
41
42 # Check the dependencies satisfy the minimal versions required.
---> 43 from . import dependency_versions_check
44 from .file_utils import (
45 _BaseLazyModule,
/opt/conda/lib/python3.8/site-packages/transformers/dependency_versions_check.py in <module>
39 continue # not required, check version only if installed
40
---> 41 require_version_core(deps[pkg])
42 else:
43 raise ValueError(f"can't find {pkg} in {deps.keys()}, check dependency_versions_table.py")
/opt/conda/lib/python3.8/site-packages/transformers/utils/versions.py in require_version_core(requirement)
118 """require_version wrapper which emits a core-specific hint on failure"""
119 hint = "Try: pip install transformers -U or pip install -e '.[dev]' if you're working with git master"
--> 120 return require_version(requirement, hint)
121
122
/opt/conda/lib/python3.8/site-packages/transformers/utils/versions.py in require_version(requirement, hint)
112 if want_ver is not None:
113 for op, want_ver in wanted.items():
--> 114 _compare_versions(op, got_ver, want_ver, requirement, pkg, hint)
115
116
/opt/conda/lib/python3.8/site-packages/transformers/utils/versions.py in _compare_versions(op, got_ver, want_ver, requirement, pkg, hint)
47 raise ValueError("want_ver is None")
48 if not ops[op](version.parse(got_ver), version.parse(want_ver)):
---> 49 raise ImportError(
50 f"{requirement} is required for a normal functioning of this module, but found {pkg}=={got_ver}.{hint}"
51 )
ImportError: tokenizers>=0.10.1,<0.11 is required for a normal functioning of this module, but found tokenizers==0.8.1rc1.
Try: pip install transformers -U or pip install -e '.[dev]' if you're working with git master
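The `_compare_versions` call at the bottom of the traceback boils down to a `packaging` version comparison, which is why the installed pre-release `0.8.1rc1` fails the pin. A minimal sketch of the same check, with the specifier string taken from the error message:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# The pin transformers 4.6.x enforces, as reported in the ImportError.
required = SpecifierSet(">=0.10.1,<0.11")

print(Version("0.8.1rc1") in required)  # False: the installed pre-release is below 0.10.1
print(Version("0.10.3") in required)    # True: 0.10.3 satisfies the pin
```

So any tokenizers release in the 0.10.1–0.10.x range would satisfy the dependency check, while the pre-installed 0.8.1rc1 cannot.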
Expected behavior
The import should just work, as it did before the upgrade.
Issue Analytics
- State:
- Created 2 years ago
- Comments: 7 (1 by maintainers)
Top GitHub Comments
I have the same problem. I fixed it by updating my Python version from 3.6 to 3.9.

Reinstalling transformers at a compatible version also fixes it. I solved it with the command:
pip install transformers==4.11.3