st.cache fails for TF2 SavedModels (HuggingFace, TensorFlow Hub, etc.)
See original GitHub issue

There is a particular TensorFlow type that `st.cache` fails to cache: `tensorflow.python.util.object_identity._ObjectIdentityWrapper`. This type is apparently very common and is used in TF2 SavedModels, including HuggingFace tokenizers and TensorFlow Hub models.

I tried overriding `hash_funcs` with `id` and with `lambda _: None`, but both resulted in the program hanging. Without overriding `hash_funcs`, I get a Streamlit error saying that Streamlit doesn't know how to cache the type.
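For context, `hash_funcs` maps a type to a function that produces the cache key contribution for arguments of that type. A minimal pure-Python sketch (no Streamlit or TensorFlow required, with a stand-in wrapper class) of the semantics of the two attempted overrides: `id` gives a fresh key for every newly constructed object, while `lambda _: None` maps every object to the same key. This illustrates the override mechanism only; it does not reproduce the reported hang, whose cause is not established in the issue.

```python
class ObjectIdentityWrapper:
    """Stand-in for the unhashable TF wrapper type from the issue."""
    def __init__(self, wrapped):
        self._wrapped = wrapped

def make_key(value, hash_funcs):
    """Look up a custom hasher for value's type, as hash_funcs does."""
    for typ, fn in hash_funcs.items():
        if isinstance(value, typ):
            return fn(value)
    return hash(value)  # default path; raises TypeError for unhashable types

cache = {}

def cached_call(func, arg, hash_funcs):
    """Memoize func(arg), keying on the custom hash of arg."""
    key = (func.__name__, make_key(arg, hash_funcs))
    if key not in cache:
        cache[key] = func(arg)
    return cache[key]
```

With `id` as the hasher, two wrappers produced by separate loads get different keys, so the cached function runs again; with `lambda _: None`, all wrappers collide onto a single key, so distinct models would wrongly share one cache entry.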
Issue Analytics
- State:
- Created: 3 years ago
- Comments: 7 (1 by maintainers)
Top Results From Across the Web

Common issues | TensorFlow Hub
This error frequently arises when loading models in TF1 Hub format with the hub.load() API in TF2. Adding the correct signature should fix...
Read more >

Load model from cache or disk not working - Transformers
I am trying to load a model and tokenizer - ProsusAI/finbert (already cached on disk by an earlier run in ~/.cache/huggingface/transformers/) ...
Read more >

TensorFlow Hub error when Saving model as H5 or ...
It's been a while, but assuming you have migrated to TF2, this can easily be accomplished with the most recent model version...
Read more >
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@sai-krishna-msk I might, but I can't recall how. We eventually moved to exposing an HTTP API that the Streamlit app calls; FastAPI was our choice on the server side and httpx on the client side.
But Streamlit recently released 0.89, which has two new cache decorators that might interest you. See: https://docs.streamlit.io/en/stable/changelog.html#version-0-89-0
Closing since this is resolved by using our newer caching methods.
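As an illustration of the newer caching API the comments refer to, here is a hedged sketch of loading a model once per process with `st.experimental_singleton` (one of the decorators added in 0.89; it was later renamed, so the name may not exist in other versions). The `load_model` body is a placeholder for an expensive load such as a TF Hub or HuggingFace model, and a stdlib fallback is included so the sketch also runs where Streamlit is unavailable:

```python
import functools

try:
    import streamlit as st
    singleton = st.experimental_singleton  # added in Streamlit 0.89
except (ImportError, AttributeError):
    # Fallback so the sketch runs without Streamlit (or on versions
    # lacking this name): cache one result per argument, mimicking
    # the decorator's load-once behavior.
    def singleton(func):
        return functools.lru_cache(maxsize=None)(func)

@singleton
def load_model(name):
    # Placeholder for an expensive load, e.g. a SavedModel or
    # tokenizer; returns a dummy object here.
    return {"name": name}

model_a = load_model("demo")
model_b = load_model("demo")
assert model_a is model_b  # loaded once, the same object is reused
```

Because the decorated function's return value is shared rather than re-hashed per call, this sidesteps the need to hash `_ObjectIdentityWrapper` at all, which is presumably why the maintainers closed the issue as resolved by the newer caching methods.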