
ImportError: tokenizers>=0.10.1,<0.11 is required for a normal functioning of this module, but found tokenizers==0.8.1rc1.

See original GitHub issue

Environment info

  • transformers version: 4.6.1
  • Platform: Linux Mint Tricia 19.3 (ubuntu 18.04)
  • Python version: 3.8.8
  • PyTorch version (GPU?): 1.7.0, gpu yes
  • Tensorflow version (GPU?):
  • Using GPU in script?: yes
  • Using distributed or parallel set-up in script?: no

Who can help

tokenizer: @LysandreJik

Information

Model I am using (Bert, XLNet …): GPT2

The problem arises when using:

  • my own modified scripts
  • my own task or dataset: text generation

After upgrading to 4.6.1 (the same error occurs in 4.6.0), I get an error when I load the tokenizer.

What I have tried

I searched for a similar report and thought this might be a duplicate of this issue, but applying that solution changed nothing.

I uninstalled the transformers and tokenizers packages and reinstalled them, but the same issue remains.
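Before reinstalling, it can help to confirm which versions this interpreter actually resolves at import time — a stale copy in another site-packages directory often causes exactly this kind of mismatch. A minimal check using only the standard library (Python 3.8+):

```python
# Print the versions of transformers and tokenizers that this interpreter
# would actually import; "not installed" means the package is missing here.
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg: str) -> str:
    try:
        return version(pkg)
    except PackageNotFoundError:
        return "not installed"

for pkg in ("transformers", "tokenizers"):
    print(pkg, installed_version(pkg))
```

If this prints tokenizers 0.8.1rc1 even after a reinstall, the interpreter is picking up a different environment than the one pip installed into.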

To reproduce

Steps to reproduce the behavior:

  1. Import the tokenizer classes (as below):
from transformers import (PreTrainedTokenizerFast,
                          GPT2Tokenizer,)

Error message

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-5-dc540cd053e1> in <module>
----> 1 from transformers import (PreTrainedTokenizerFast,
      2                           PreTrainedTokenizer,
      3                           AutoTokenizer,
      4                           GPT2Tokenizer,)
      5 

/opt/conda/lib/python3.8/site-packages/transformers/__init__.py in <module>
     41 
     42 # Check the dependencies satisfy the minimal versions required.
---> 43 from . import dependency_versions_check
     44 from .file_utils import (
     45     _BaseLazyModule,

/opt/conda/lib/python3.8/site-packages/transformers/dependency_versions_check.py in <module>
     39                 continue  # not required, check version only if installed
     40 
---> 41         require_version_core(deps[pkg])
     42     else:
     43         raise ValueError(f"can't find {pkg} in {deps.keys()}, check dependency_versions_table.py")

/opt/conda/lib/python3.8/site-packages/transformers/utils/versions.py in require_version_core(requirement)
    118     """require_version wrapper which emits a core-specific hint on failure"""
    119     hint = "Try: pip install transformers -U or pip install -e '.[dev]' if you're working with git master"
--> 120     return require_version(requirement, hint)
    121 
    122 

/opt/conda/lib/python3.8/site-packages/transformers/utils/versions.py in require_version(requirement, hint)
    112     if want_ver is not None:
    113         for op, want_ver in wanted.items():
--> 114             _compare_versions(op, got_ver, want_ver, requirement, pkg, hint)
    115 
    116 

/opt/conda/lib/python3.8/site-packages/transformers/utils/versions.py in _compare_versions(op, got_ver, want_ver, requirement, pkg, hint)
     47         raise ValueError("want_ver is None")
     48     if not ops[op](version.parse(got_ver), version.parse(want_ver)):
---> 49         raise ImportError(
     50             f"{requirement} is required for a normal functioning of this module, but found {pkg}=={got_ver}.{hint}"
     51         )

ImportError: tokenizers>=0.10.1,<0.11 is required for a normal functioning of this module, but found tokenizers==0.8.1rc1.
Try: pip install transformers -U or pip install -e '.[dev]' if you're working with git master
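The traceback ends in transformers' dependency check, which parses the pinned requirement and compares it against the installed version with packaging.version. A minimal, simplified sketch of that comparison (not the library's actual code) shows why the pre-release 0.8.1rc1 fails:

```python
# Simplified sketch of the check in transformers/utils/versions.py:
# parse each (operator, version) constraint from "tokenizers>=0.10.1,<0.11"
# and evaluate it against the installed version with packaging.version.
import operator
from packaging import version

OPS = {">=": operator.ge, "<": operator.lt}

def satisfies(got: str, wanted: dict) -> bool:
    """True if installed version `got` meets every (op, version) constraint."""
    got_ver = version.parse(got)
    return all(OPS[op](got_ver, version.parse(want)) for op, want in wanted.items())

req = {">=": "0.10.1", "<": "0.11"}  # i.e. tokenizers>=0.10.1,<0.11
print(satisfies("0.8.1rc1", req))  # False: 0.8.1rc1 fails the >=0.10.1 bound
print(satisfies("0.10.3", req))    # True
```

So the ImportError is raised deliberately: any tokenizers outside [0.10.1, 0.11) trips the check before the rest of transformers loads.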

Expected behavior

The import should just work, as it did before the upgrade.

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 7 (1 by maintainers)

Top GitHub Comments

3 reactions
dxc0ap commented, Jan 20, 2022

I have the same problem.

I fixed it by updating Python from 3.6 to 3.9.

2 reactions
ccx1997 commented, Jul 28, 2022

Re-installing transformers at a compatible version fixes it. I solved it with: pip install transformers==4.11.3.
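That pinned reinstall can be sketched as below; 4.11.3 is the version from this comment, and uninstalling both packages first avoids a leftover tokenizers build shadowing the one pip resolves for the new pin:

```shell
# Remove both packages so no stale tokenizers build (e.g. 0.8.1rc1)
# shadows the version pip resolves for the pinned transformers release.
pip uninstall -y transformers tokenizers
pip install "transformers==4.11.3"
# Confirm the pair that actually got installed:
pip show transformers tokenizers | grep -iE '^(name|version)'
```

Any transformers release works here as long as pip is allowed to resolve a tokenizers version matching that release's pin.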

Read more comments on GitHub >
