
Compatibility with TensorFlow 2.0

See original GitHub issue

I’m trying to create a model using keras-self-attention on Google Colab, and since the default TensorFlow version there is now 2.0, this error is raised:

from keras import models
from keras.layers import Embedding, Bidirectional, LSTM
from keras_self_attention import SeqWeightedAttention

model = models.Sequential()
model.add(Embedding(max_features, 32))
model.add(Bidirectional(LSTM(32, return_sequences=True)))
# adding an attention layer
model.add(SeqWeightedAttention())
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py in _get_default_graph()
     65     try:
---> 66         return tf.get_default_graph()
     67     except AttributeError:

AttributeError: module 'tensorflow' has no attribute 'get_default_graph'

During handling of the above exception, another exception occurred:

RuntimeError                              Traceback (most recent call last)
<ipython-input-7-9c4e625938a2> in <module>()
      3 model.add(Bidirectional( LSTM(32, return_sequences=True)))
      4 # adding an attention layer
----> 5 model.add(SeqWeightedAttention())

/usr/local/lib/python3.6/dist-packages/keras_self_attention/seq_weighted_attention.py in __init__(self, use_bias, return_attention, **kwargs)
     10 
     11     def __init__(self, use_bias=True, return_attention=False, **kwargs):
---> 12         super(SeqWeightedAttention, self).__init__(**kwargs)
     13         self.supports_masking = True
     14         self.use_bias = use_bias

/usr/local/lib/python3.6/dist-packages/keras/engine/base_layer.py in __init__(self, **kwargs)
    130         if not name:
    131             prefix = self.__class__.__name__
--> 132             name = _to_snake_case(prefix) + '_' + str(K.get_uid(prefix))
    133         self.name = name
    134 

/usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py in get_uid(prefix)
     84     """
     85     global _GRAPH_UID_DICTS
---> 86     graph = _get_default_graph()
     87     if graph not in _GRAPH_UID_DICTS:
     88         _GRAPH_UID_DICTS[graph] = defaultdict(int)

/usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py in _get_default_graph()
     67     except AttributeError:
     68         raise RuntimeError(
---> 69             'It looks like you are trying to use '
     70             'a version of multi-backend Keras that '
     71             'does not support TensorFlow 2.0. We recommend '

**RuntimeError: It looks like you are trying to use a version of multi-backend Keras that does not support TensorFlow 2.0. We recommend using `tf.keras`, or alternatively, downgrading to TensorFlow 1.14.**
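The failure above comes from stand-alone multi-backend Keras, which keras-self-attention uses by default. The library can instead run on top of `tf.keras` if the `TF_KERAS` environment variable is set before the library is imported. A minimal sketch of the pattern; the model lines are shown as comments since they require TensorFlow to be installed, and `max_features` is taken from the snippet above:

```python
import os

# keras-self-attention builds its layers on stand-alone multi-backend
# Keras by default, which is what raises the RuntimeError under TF 2.
# Setting TF_KERAS *before* the library is imported switches it to tf.keras.
os.environ['TF_KERAS'] = '1'

# With the flag set, the original model builds against tf.keras, e.g.:
#
#   import tensorflow as tf
#   from keras_self_attention import SeqWeightedAttention
#
#   model = tf.keras.Sequential()
#   model.add(tf.keras.layers.Embedding(max_features, 32))
#   model.add(tf.keras.layers.Bidirectional(
#       tf.keras.layers.LSTM(32, return_sequences=True)))
#   model.add(SeqWeightedAttention())

print(os.environ['TF_KERAS'])  # 1
```

The alternative mentioned in the error message, downgrading to TensorFlow 1.14, also works but locks the project to a deprecated release.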

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 3
  • Comments: 10 (2 by maintainers)

Top GitHub Comments

1 reaction
CyberZHG commented, Jul 14, 2020

@grkhr @Juhong-Namgung

The code is always tested against the latest version of TensorFlow: https://travis-ci.org/github/CyberZHG/keras-self-attention/jobs/707084673#L2342

What was your error log?

0 reactions
Juhong-Namgung commented, Jul 16, 2020

@mohamedScikitLearn My working code is as follows:

import os
os.environ['TF_KERAS'] = '1'  # must be set before keras_self_attention is imported

import tensorflow as tf
from tensorflow.keras.layers import *
from keras_self_attention import SeqSelfAttention

model = tf.keras.Sequential()
model.add(Embedding(input_dim=max_vocab_len, output_dim=emb_dim, input_length=max_len, embeddings_regularizer=W_reg))
model.add(Dropout(0.2))
model.add(LSTM(units=128, return_sequences=True))
model.add(Dropout(0.5))
model.add(Layer(SeqSelfAttention(attention_activation='relu')))
model.add(Flatten())
model.add(Dense(9472, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(21, activation='softmax'))

I think you should add tensorflow.keras.layers.Layer()
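A side note on ordering in the comment above: keras-self-attention decides between multi-backend Keras and tf.keras when it is first imported, so `TF_KERAS` has to be set before that import happens. A stdlib-only sketch of the check; `uses_tf_keras` is an illustrative reimplementation, not the library's actual code:

```python
import os

# Export the flag first; exporting it after the library has already been
# imported has no effect, because the backend choice is made at import time.
os.environ['TF_KERAS'] = '1'

def uses_tf_keras():
    # Mirrors the kind of environment check keras-self-attention performs
    # at import time (illustrative only).
    return os.environ.get('TF_KERAS', '0') != '0'

print(uses_tf_keras())  # True
```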


Top Results From Across the Web

TensorFlow version compatibility
TensorFlow 1.2 might support GraphDef versions 4 to 7. · TensorFlow 1.3 could add GraphDef version 8 and support versions 4 to 8....

Install TensorFlow 2
Learn how to install TensorFlow on your system. Download a pip package, run in a Docker container, or build from source.

TensorFlow 1.x vs TensorFlow 2 - Behaviors and APIs
To be TF2 compatible, your code must be compatible with the full set of TF2 behaviors. During migration, you can enable or disable...

Effective Tensorflow 2 | TensorFlow Core
This guide provides a list of best practices for writing code using TensorFlow 2 (TF2), it is written for users who have recently...

Migrate to TensorFlow 2
Upgrade your training, evaluation and model saving code to TF2 equivalents. (Optional) Migrate your TF2-compatible tf.compat.v1 APIs including ...
