Initial install: No module named 'tensorflow.python.keras.engine.keras_tensor'
Environment info
Output of transformers-cli env is an error ending with:
RuntimeError: Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback): No module named 'tensorflow.python.keras.engine.keras_tensor'
- transformers version: 4.12.3
- Platform: Linux CentOS 7
- Python version: 3.6.13
- PyTorch version (GPU?): 1.9.1.post3
- Tensorflow version (GPU?): 2.1.0 (GPU)
Who can help
Library:
- Pipelines: @Narsil
To reproduce
Steps to reproduce the behavior:
Installation with Mamba using the conda recipe for transformers:

```
micromamba create -y -p <path> mamba python=3.6 cudatoolkit=10.0 cudnn=7.6.0 pytorch
micromamba install -y -p <path> pandas seaborn plotly bokeh scikit-learn statsmodels scipy matplotlib simpleitk -c simpleitk
micromamba install -y -p <path> transformers=4.12.3
source <path>/bin/activate base
python -m pip install --upgrade pip
python -m pip install tensorflow-gpu==2.1.0
```
Output of the sample command given in the installation docs:

```
./python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('I hate you'))"
```
```
Traceback (most recent call last):
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2150, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/modeling_tf_utils.py", line 30, in <module>
    from tensorflow.python.keras.engine.keras_tensor import KerasTensor
ModuleNotFoundError: No module named 'tensorflow.python.keras.engine.keras_tensor'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2150, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/pipelines/__init__.py", line 25, in <module>
    from ..models.auto.configuration_auto import AutoConfig
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/models/__init__.py", line 19, in <module>
    from . import (
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/models/layoutlm/__init__.py", line 22, in <module>
    from .configuration_layoutlm import LAYOUTLM_PRETRAINED_CONFIG_ARCHIVE_MAP, LayoutLMConfig
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/models/layoutlm/configuration_layoutlm.py", line 22, in <module>
    from ...onnx import OnnxConfig, PatchingSpec
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/onnx/__init__.py", line 17, in <module>
    from .convert import export, validate_model_outputs
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/onnx/convert.py", line 23, in <module>
    from .. import PreTrainedModel, PreTrainedTokenizer, TensorType, TFPreTrainedModel, is_torch_available
  File "<frozen importlib._bootstrap>", line 1020, in _handle_fromlist
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2140, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2154, in _get_module
    ) from e
RuntimeError: Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback):
No module named 'tensorflow.python.keras.engine.keras_tensor'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "<frozen importlib._bootstrap>", line 1020, in _handle_fromlist
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2140, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2154, in _get_module
    ) from e
RuntimeError: Failed to import transformers.pipelines because of the following error (look up to see its traceback):
Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback):
No module named 'tensorflow.python.keras.engine.keras_tensor'
```
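The missing module can also be confirmed directly against the installed TensorFlow, without going through transformers. A minimal check run in the same environment (it uses only `tf.__version__` and the standard-library `importlib.util.find_spec`):

```
./python -c "import tensorflow as tf; print(tf.__version__)"
# prints 2.1.0 with the install above
./python -c "import importlib.util; print(importlib.util.find_spec('tensorflow.python.keras.engine.keras_tensor'))"
# prints None, i.e. this TensorFlow build does not ship the module transformers imports
```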
Expected behavior
Expected model output ending with: [{'label': 'NEGATIVE', 'score': 0.9991129040718079}]
Hi @james-vincent,

It seems the version of TensorFlow you're using is no longer supported (https://github.com/huggingface/transformers/blob/master/setup.py#L155). You need at least TF 2.3 to use transformers.

Are you able to upgrade your dependency?
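A possible upgrade path, as a rough sketch only: the exact TF release has to match the system CUDA/cuDNN, and tensorflow-gpu>=2.3 generally expects a newer CUDA than the cudatoolkit=10.0 pinned in the install commands above, so treat the version spec as an assumption to adjust.

```
source <path>/bin/activate base
python -m pip uninstall -y tensorflow-gpu
# assumes a TF release >= 2.3 that matches the machine's CUDA/cuDNN is available
python -m pip install "tensorflow-gpu>=2.3"
python -c "import tensorflow as tf; print(tf.__version__)"
python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('I hate you'))"
```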
I met the same problem and upgraded TF 2.1 to TF 2.3, which solved it. Thank you, @Narsil.