Cannot import transformers with TF version 2.1.0
The installation README says that the transformers library requires TensorFlow version >2.0, but I can't seem to import the latest transformers 3.2 release even with TensorFlow 2.1.
>>> import transformers
wandb: WARNING W&B installed but not logged in. Run `wandb login` or set the WANDB_API_KEY env variable.
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/amog/dev/ray/lib/python3.7/site-packages/transformers/__init__.py", line 121, in <module>
from .pipelines import (
File "/Users/amog/dev/ray/lib/python3.7/site-packages/transformers/pipelines.py", line 47, in <module>
from .modeling_tf_auto import (
File "/Users/amog/dev/ray/lib/python3.7/site-packages/transformers/modeling_tf_auto.py", line 45, in <module>
from .modeling_tf_albert import (
File "/Users/amog/dev/ray/lib/python3.7/site-packages/transformers/modeling_tf_albert.py", line 24, in <module>
from .activations_tf import get_tf_activation
File "/Users/amog/dev/ray/lib/python3.7/site-packages/transformers/activations_tf.py", line 53, in <module>
"swish": tf.keras.activations.swish,
AttributeError: module 'tensorflow_core.python.keras.api._v2.keras.activations' has no attribute 'swish'
Upgrading to TF 2.2 works fine, but I think this should be made clearer in the docs.
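As a workaround until the docs are updated, the installed TF version can be checked before importing transformers. This is a minimal sketch; `tf_version_at_least` is a hypothetical helper, not a transformers or TensorFlow API, and the 2.2 minimum comes from the error above:

```python
import re

# Hypothetical helper (not part of transformers): check the installed TF
# version before importing transformers, since tf.keras.activations.swish
# only exists from TF 2.2 onwards.
def tf_version_at_least(version_string, minimum=(2, 2, 0)):
    # Parse the leading numeric components, e.g. "2.2.0-rc1" -> (2, 2, 0).
    parts = tuple(int(n) for n in re.findall(r"\d+", version_string)[:3])
    return parts >= minimum
```

With TensorFlow installed, you would call `tf_version_at_least(tf.__version__)` and, if it returns False, upgrade with `pip install --upgrade "tensorflow>=2.2"` before importing transformers.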
Environment info
- transformers version: 3.2.0
- Platform: Mac OS
- Python version: 3.7.7
- PyTorch version (GPU?):
- Tensorflow version (GPU?): 2.1.0. On CPU only.
- Using GPU in script?:
- Using distributed or parallel set-up in script?:
Who can help
Information
Model I am using (Bert, XLNet …):
The problem arises when using:
- the official example scripts: (give details below)
- my own modified scripts: (give details below)
The task I am working on is:
- an official GLUE/SQUaD task: (give the name)
- my own task or dataset: (give details below)
To reproduce
Steps to reproduce the behavior:
Expected behavior
Issue Analytics
- State:
- Created 3 years ago
- Reactions: 16
- Comments: 8 (3 by maintainers)
Top Results From Across the Web
- Installation — transformers 4.7.0 documentation - Hugging Face: Transformers is tested on Python 3.6+, and PyTorch 1.1.0+ or TensorFlow 2.0+. You should install Transformers in a virtual environment.
- transformers · PyPI: Here is the PyTorch version: >>> from transformers import AutoTokenizer, AutoModel >>> tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased") ...
- cannot import name 'get_config' from 'tensorflow.python.eager ...: I think there is a conflict of keras.models.load_model and the base version of tensorflow you are using. Try running - import tensorflow as...
- ray.rllib.utils.framework — Ray 2.1.0 - the Ray documentation: Args: error: Whether to raise an error if tf cannot be imported. Returns: Tuple containing 1) tf1.x ... 3) The actually installed tf...
- Cannot import nengo_dl on ms azure - Nengo forum: thanks, I was able to import nengo_dl (after run pip install nengo_dl==3.4.0 ) ... Also, if you have installed tensorflow-gpu (version 2.2.0), ...
If you get this error message, it means you don't have at least TF 2.2 installed.
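For context, `swish` is simply `x * sigmoid(x)`; TensorFlow only added it to `tf.keras.activations` in 2.2, which is why the attribute lookup fails on 2.1. A minimal pure-Python sketch of the function transformers is trying to resolve (shown for a single float, whereas the TF version operates elementwise on tensors):

```python
import math

def swish(x):
    # swish(x) = x * sigmoid(x) = x / (1 + e^(-x))
    # Equivalent in spirit to tf.keras.activations.swish (TF >= 2.2),
    # but for a scalar float rather than a tensor.
    return x / (1.0 + math.exp(-x))
```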
This breaks at least a couple of the tutorial notebooks. Even with TF 2.3.0 I get the same error.