
keras-bert relies on features removed in keras-transformer 0.34.0

See original GitHub issue

Describe the Bug

The get_model function in the keras_bert.bert module passes a use_adapter parameter to the keras_transformer.get_encoders function.

However, the latest commit to keras-transformer, corresponding to the v0.34.0 release, removes this parameter from the get_encoders API.

keras-bert specifies its keras-transformer dependency as >=0.30.0. This means that new or upgraded installs will automatically use 0.34.0 despite the breaking change.

As a result, calling get_model fails with the following error:

TypeError: get_encoders() got an unexpected keyword argument 'use_adapter'
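The failure mode can be reproduced in isolation with stand-in functions. The names below mirror keras-bert and keras-transformer, but the bodies are simplified stubs for illustration, not the real implementations:

```python
# Stand-in for keras_transformer.get_encoders as of v0.34.0: the
# use_adapter parameter has been removed from the signature.
def get_encoders(encoder_num, input_layer, head_num, hidden_dim):
    return input_layer  # the real function builds transformer encoder blocks


# Stand-in for the call keras_bert.get_model still makes: it forwards
# use_adapter, which the new get_encoders no longer accepts.
def get_model(use_adapter=False):
    return get_encoders(
        encoder_num=2,
        input_layer=None,
        head_num=4,
        hidden_dim=64,
        use_adapter=use_adapter,
    )


try:
    get_model()
except TypeError as err:
    print(err)  # get_encoders() got an unexpected keyword argument 'use_adapter'
```

Because the keyword is rejected at call time, the error surfaces on the first get_model call regardless of the arguments the user supplies.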

The problem can be worked around by pinning the keras-transformer version at v0.33.0.
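Assuming a standard pip setup, the pin looks like this (v0.33.0 is the last release whose get_encoders still accepts use_adapter, per the report above):

```shell
# Workaround: pin keras-transformer before (or after) installing
# keras-bert; pip will downgrade the package if a newer one is present.
pip install "keras-transformer==0.33.0"
```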

Version Info

  • I’m using the latest version

Minimal Code To Reproduce

The problem can be seen by running the demo.

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments:7 (2 by maintainers)

Top GitHub Comments

magrathj commented, Jun 1, 2020

Experiencing the same error when calling the predict function. Followed the advice and pinned keras-transformer==0.33.0; it worked after that.


Top Results From Across the Web

Pretraining BERT with Hugging Face Transformers - Keras
BERT makes use of Transformer, an attention mechanism that learns contextual relations between words (or subwords) in a text.
BERT Text Classification using Keras | by Swatimeena - Medium
The BERT (Bidirectional Encoder Representations from Transformers) model was proposed in BERT: Pre-training of Deep Bidirectional Transformers for Language ...
Simple Text Multi Classification Task Using Keras BERT!
BERT relies on a Transformer (the attention mechanism that learns contextual relationships between words in a text).
COMP4901K BERT-BILSTM-CRF-best_0.001 | Kaggle
!pip install keras-bert !pip install bert-for-tf2 !pip install tf2crf ... (from keras-bert) (2.4.3) Collecting keras-transformer>=0.38.0 Downloading ...
Posit AI Blog: BERT from R - RStudio
Today, we're happy to feature a guest post written by Turgut Abdullayev, ... There are several methods to install keras-bert in Python.
