Fail to import transformers.trainer due to libssl.so.10: cannot open shared object file: No such file or directory
System Info
```
Traceback (most recent call last):
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1002, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/site-packages/transformers/trainer.py", line 66, in <module>
    from .data.data_collator import DataCollator, DataCollatorWithPadding, default_data_collator
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/site-packages/transformers/data/__init__.py", line 19, in <module>
    from .data_collator import (
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/site-packages/transformers/data/data_collator.py", line 21, in <module>
    from ...models.bert import BertTokenizer, BertTokenizerFast
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/site-packages/transformers/models/__init__.py", line 19, in <module>
    from . import (
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/site-packages/transformers/models/mt5/__init__.py", line 40, in <module>
    from ...t5.tokenization_t5_fast import T5TokenizerFast
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/site-packages/transformers/models/t5/tokenization_t5_fast.py", line 23, in <module>
    from ...tokenization_utils_fast import PreTrainedTokenizerFast
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 24, in <module>
    import tokenizers.pre_tokenizers as pre_tokenizers_fast
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/site-packages/tokenizers/__init__.py", line 79, in <module>
    from .tokenizers import (
ImportError: libssl.so.10: cannot open shared object file: No such file or directory
```
The above exception was the direct cause of the following exception:
```
Traceback (most recent call last):
  File "test.py", line 3, in <module>
    from transformers import Trainer, TrainingArguments, EvalPrediction
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 992, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/xxie92/anaconda3/envs/sema/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1004, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.trainer because of the following error (look up to see its traceback):
libssl.so.10: cannot open shared object file: No such file or directory
```
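The missing library can be checked independently of transformers. The sketch below is a diagnostic I'm adding (not part of the original report; the glob path is illustrative): it tries to load the exact soname from the error and lists whichever libssl versions are actually installed. On newer distros and in conda environments these are typically `libssl.so.1.1` or `libssl.so.3`, not the older `libssl.so.10`:

```python
import ctypes
import glob

# Attempt to load the exact soname named in the ImportError.
try:
    ctypes.CDLL("libssl.so.10")
    print("libssl.so.10 is loadable")
except OSError as exc:
    print(f"libssl.so.10 not loadable: {exc}")

# List the libssl versions actually present in common system
# locations (conda envs keep theirs under $CONDA_PREFIX/lib).
for path in sorted(glob.glob("/usr/lib*/libssl.so*")):
    print(path)
```

If `libssl.so.10` is missing but a newer version is listed, the installed tokenizers binary was built against OpenSSL 1.0 while the system only provides a newer OpenSSL, which is consistent with the reports below that reinstalling from pip (whose wheels bundle their dependencies differently) makes the error go away.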
Who can help?
No response
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, …)
- My own task or dataset (give details below)
Expected behavior
I installed transformers following the official website. Steps: I created a new env using conda and installed transformers with conda; it gives this error. I also tried installing from pip, and the same error appears.
From the error message, it seems the tokenizers package may be the problem, but I am not sure how to solve it.
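To confirm which package in the chain is actually failing, a small probe (a sketch I'm adding, not from the issue) can attempt each import from the traceback and report the first one that breaks:

```python
import importlib

def try_import(name):
    """Return (module, None) on success or (None, error message) on failure."""
    try:
        return importlib.import_module(name), None
    except ImportError as exc:
        return None, str(exc)

# Walk the import chain from the traceback, innermost dependency first.
for name in ("tokenizers", "transformers", "transformers.trainer"):
    module, error = try_import(name)
    print(name, "OK" if error is None else f"FAILED: {error}")
```

If `tokenizers` alone reproduces the libssl error, the problem is in the tokenizers binary build rather than in transformers itself, which matches where the traceback bottoms out.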
Issue Analytics
- Created: a year ago
- Comments: 6
Top GitHub Comments
Got the same error with `conda install -c huggingface transformers`. And thank you @Utopiah, the pip install works.