keras-bert relies on features removed in keras-transformer 0.34.0
Describe the Bug
The `get_model` function in the `keras_bert.bert` module passes a `use_adapter` parameter to the `keras_transformer.get_encoders` function:
https://github.com/CyberZHG/keras-bert/blob/26bdfe3c36e77fa0524902f31263a920ccd62efb/keras_bert/bert.py#L124
However, the latest commit to keras-transformer, corresponding to the v0.34.0 release, removes this parameter from the API:
https://github.com/CyberZHG/keras-transformer/commit/60f1a0968caba6adb914a14b743f0ffbdc176a18
keras-bert specifies its keras-transformer dependency as `>=0.30.0`, so new or upgraded installs automatically pick up 0.34.0 despite the breaking change.
As a result, calling `get_model` fails with the following error:
TypeError: get_encoders() got an unexpected keyword argument 'use_adapter'
The problem can be worked around by pinning the keras-transformer version at v0.33.0.
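In a local environment the pin can be applied with `pip install keras-transformer==0.33.0`. On the keras-bert side, the breakage could be avoided by capping the dependency specifier below the breaking release. The sketch below is illustrative only; the actual contents of keras-bert's setup.py are assumed, not copied from the repository:

```python
# Hypothetical setup.py excerpt: keep accepting 0.30.0+ but exclude the
# 0.34.0 release that dropped the use_adapter parameter.
from setuptools import setup, find_packages

setup(
    name='keras-bert',
    packages=find_packages(),
    install_requires=[
        'keras-transformer>=0.30.0,<0.34.0',
    ],
)
```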
Version Info
- I’m using the latest version
Minimal Codes To Reproduce
The problem can be seen by running the load_and_predict.py demo.
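The demo needs a pretrained checkpoint, but the failure can also be triggered without one, since `get_model` forwards `use_adapter` to `get_encoders` unconditionally when building the encoder stack. A minimal sketch, assuming keras-bert is installed alongside keras-transformer 0.34.0 (the `token_num` value is arbitrary):

```python
# Minimal reproduction sketch: constructing any model hits the internal
# get_encoders(..., use_adapter=...) call in keras_bert.bert.get_model.
from keras_bert import get_model

model = get_model(token_num=30000)
# With keras-transformer 0.34.0 installed this raises:
#   TypeError: get_encoders() got an unexpected keyword argument 'use_adapter'
```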
Top GitHub Comments
Experiencing the same error when calling the predict function; followed the advice and pinned keras-transformer==0.33.0, and it worked after this.
@joseph-wakeling-frequenz Added one: https://pypi.org/project/keras-bert/0.81.1/