Keras model errors out on prediction
Hi,
I’m trying to replicate a CSRNet model for counting people in images. It works well for the first input, but errors out when I switch to the second image.
I did add the `allow_output_mutation=True` flag to `st.cache`, which removed the st.cache warnings:
```python
import streamlit as st
from keras.models import model_from_json


def run_model(path):
    @st.cache(allow_output_mutation=True)
    def load_model():
        # Load the model architecture from JSON and its trained weights from HDF5
        json_file = open('/models/Model.json', 'r')
        loaded_model_json = json_file.read()
        json_file.close()
        loaded_model = model_from_json(loaded_model_json)
        loaded_model.load_weights("CSRNet-keras/weights/weights1.hdf5")
        return loaded_model
```
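The rest of `run_model` isn’t shown above; roughly, it preprocesses the image at `path` and calls `predict` on the cached model. The sketch below is illustrative only: the `preprocess` helper, the array shapes, and reading the count off the density map are assumptions, not code from the issue.

```python
# Illustrative sketch, not code from the issue. Assumes load_model() from the
# snippet above and a hypothetical preprocess() helper.
import numpy as np
from PIL import Image


def preprocess(path):
    # Turn the image at `path` into a (1, H, W, 3) float32 batch.
    img = np.asarray(Image.open(path).convert('RGB'), dtype=np.float32) / 255.0
    return np.expand_dims(img, axis=0)


def predict_count(model, path):
    density_map = model.predict(preprocess(path))  # this is the call that fails on the second image
    return float(density_map.sum())                # CSRNet's count is the sum over the density map
```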
Below is the error I get:
Output Console Log:
Not sure if this is because of caching or if I’m missing something; it works fine in a Jupyter notebook.
Debug info
- Streamlit version: 0.50.2
- Python version: 3.6.5
- Using Conda
- OS version: Ubuntu 16.04
- Browser version: Google Chrome Version 79
Thanks!
Top GitHub Comments
Also, sometimes it breaks with “Tensor X is not a part of the graph…”. I’m guessing this is because of multi-threading in Keras. Calling `loaded_model._make_predict_function()` right after loading the model helps avoid that error:

```python
import streamlit as st
from keras import backend as K
from keras.models import model_from_json


@st.cache(allow_output_mutation=True)
def load_model():
    json_file = open('Model.json', 'r')
    loaded_model_json = json_file.read()
    json_file.close()
    loaded_model = model_from_json(loaded_model_json)
    loaded_model.load_weights("weights/epoch200.hdf5")
    # Build the predict function now, in the same thread/graph the model was loaded in
    loaded_model._make_predict_function()
    # Also grab the TF session the model lives in, so it can be reused at prediction time
    session = K.get_session()
    return loaded_model, session
```
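For reference, here is one way the returned `(model, session)` pair might be used at prediction time. This is a sketch under the assumption of a TF 1.x Keras backend; the exact prediction code isn’t shown in the thread.

```python
# Sketch only: re-attach the cached model's session/graph before predicting, so
# predict() runs against the same graph the weights were loaded into, even when
# Streamlit reruns the script on a different thread.
from keras import backend as K

model, session = load_model()


def predict(batch):
    with session.graph.as_default():
        K.set_session(session)
        return model.predict(batch)
```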
@carolmanderson Works like a charm, thanks a lot! I was trying to reset the session for some reason and that did not work. Just passing the current TF session solved it.