
Export model config

See original GitHub issue

Bug Description

Can’t properly export the config from an AutoKeras classifier.

Bug Reproduction

Code for reproducing the bug:

from sklearn.datasets import load_iris
import autokeras as ak
import tensorflow as tf

X, y = load_iris(return_X_y=True)

# Initialize the structured data classifier.
clf = ak.StructuredDataClassifier(max_trials=2)
# Feed the classifier with training data.
clf.fit(X, y, epochs=2)
clf.evaluate(X, y)
# Export the best pipeline as a Keras Model.
model = clf.export_model()

config = model.get_config()
yaml_config = model.to_yaml()
json_config = model.to_json()

json_model = tf.keras.models.model_from_json(json_config, custom_objects=ak.CUSTOM_OBJECTS)
# yaml_model = tf.keras.models.model_from_yaml(yaml_config, custom_objects=ak.CUSTOM_OBJECTS)
# model = tf.keras.models.model_from_config(config, custom_objects=ak.CUSTOM_OBJECTS)
json_model.compile(loss='categorical_crossentropy', metrics=['accuracy'])

binary_y = tf.keras.utils.to_categorical(y)
json_model.fit(X, binary_y)

Error messages:

Rebuilding the model from YAML is commented out because it failed with the following message:

C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\keras\saving\model_config.py:76: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  config = yaml.load(yaml_string)
Traceback (most recent call last):
  File "C:/PyCharmProjects/datascience-phd-automl/autokeras_test.py", line 20, in <module>
    yaml_model = tf.keras.models.model_from_yaml(yaml_config, custom_objects=ak.CUSTOM_OBJECTS)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\keras\saving\model_config.py", line 76, in model_from_yaml
    config = yaml.load(yaml_string)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\yaml\__init__.py", line 114, in load
    return loader.get_single_data()
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\yaml\constructor.py", line 43, in get_single_data
    return self.construct_document(node)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\yaml\constructor.py", line 52, in construct_document
    for dummy in generator:
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\yaml\constructor.py", line 405, in construct_yaml_map
    value = self.construct_mapping(node)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\yaml\constructor.py", line 210, in construct_mapping
    return super().construct_mapping(node, deep=deep)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\yaml\constructor.py", line 135, in construct_mapping
    value = self.construct_object(value_node, deep=deep)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\yaml\constructor.py", line 92, in construct_object
    data = constructor(self, node)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\yaml\constructor.py", line 421, in construct_undefined
    node.start_mark)
yaml.constructor.ConstructorError: could not determine a constructor for the tag 'tag:yaml.org,2002:python/object/apply:tensorflow.python.training.tracking.data_structures.ListWrapper'
  in "<unicode string>", line 23, column 17:
          encoding: !!python/object/apply:tensorflow ... 
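
For what it’s worth, the ConstructorError seems to come from model_from_yaml() parsing the string with PyYAML’s default loader, which refuses TF-specific tags such as ListWrapper. A minimal sketch, assuming yaml_config is the string produced by model.to_yaml() above, that parses the YAML with an unsafe loader just to inspect it (this does not guarantee the dict can be turned back into a model):

import yaml

# Only use unsafe_load on YAML you generated yourself; it will construct the
# Python objects referenced by the python/object/apply tags instead of rejecting them.
raw_config = yaml.unsafe_load(yaml_config)
print(list(raw_config.keys()))  # expected to include 'class_name' and 'config'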

Rebuilding the model from the config dict is commented out because it failed with the following message:

Traceback (most recent call last):
  File "C:/PyCharmProjects/datascience-phd-automl/autokeras_test.py", line 21, in <module>
    model = tf.keras.models.model_from_config(config, custom_objects=ak.CUSTOM_OBJECTS)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\keras\saving\model_config.py", line 55, in model_from_config
    return deserialize(config, custom_objects=custom_objects)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\keras\layers\serialization.py", line 98, in deserialize
    layer_class_name = config['class_name']
KeyError: 'class_name'
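
My guess (an assumption, not verified against this exact model): model_from_config() expects the wrapped dict that tf.keras.layers.serialize(model) produces, i.e. {'class_name': ..., 'config': ...}, whereas model.get_config() only returns the inner config, hence the missing 'class_name'. A minimal sketch of two possible alternatives, assuming model and config come from the reproduction script above:

import tensorflow as tf
import autokeras as ak

# Option 1: wrap the inner config with the class name, as serialize() would.
wrapped = {"class_name": model.__class__.__name__, "config": config}
rebuilt = tf.keras.models.model_from_config(wrapped, custom_objects=ak.CUSTOM_OBJECTS)

# Option 2: use the direct counterpart of get_config().
rebuilt = tf.keras.Model.from_config(config, custom_objects=ak.CUSTOM_OBJECTS)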

Rebuilding the model from the JSON architecture seems to be OK, but I get the following error message when fitting the model:

 32/150 [=====>........................] - ETA: 1s
Traceback (most recent call last):
  File "C:/PyCharmProjects/datascience-phd-automl/autokeras_test.py", line 25, in <module>
    json_model.fit(X, binary_y)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\keras\engine\training.py", line 819, in fit
    use_multiprocessing=use_multiprocessing)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\keras\engine\training_v2.py", line 342, in fit
    total_epochs=epochs)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\keras\engine\training_v2.py", line 128, in run_one_epoch
    batch_outs = execution_function(iterator)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\keras\engine\training_v2_utils.py", line 98, in execution_function
    distributed_function(input_fn))
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\eager\def_function.py", line 568, in __call__
    result = self._call(*args, **kwds)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\eager\def_function.py", line 632, in _call
    return self._stateless_fn(*args, **kwds)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\eager\function.py", line 2363, in __call__
    return graph_function._filtered_call(args, kwargs)  # pylint: disable=protected-access
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\eager\function.py", line 1611, in _filtered_call
    self.captured_inputs)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\eager\function.py", line 1692, in _call_flat
    ctx, args, cancellation_manager=cancellation_manager))
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\eager\function.py", line 545, in call
    ctx=ctx)
  File "C:\PyCharmProjects\datascience-phd-automl\venv\lib\site-packages\tensorflow_core\python\eager\execute.py", line 67, in quick_execute
    six.raise_from(core._status_to_exception(e.code, message), None)
  File "<string>", line 3, in raise_from
tensorflow.python.framework.errors_impl.UnimplementedError:  Cast double to string is not supported
	 [[node Cast (defined at /PyCharmProjects/datascience-phd-automl/autokeras_test.py:25) ]] [Op:__inference_distributed_function_5726]

Function call stack:
distributed_function
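
A possible explanation (again an assumption, not confirmed here): the exported StructuredDataClassifier graph starts with a preprocessing step that expects string inputs, so feeding the raw float features to the rebuilt model triggers the unsupported double-to-string cast. A minimal sketch of the workaround that is usually suggested, casting the features to strings before fitting:

# Assumption: X, binary_y and json_model come from the reproduction script above.
json_model.fit(X.astype(str), binary_y)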

Expected Behavior

Create an unweighted model with the hyperparameters found by AutoKeras.

Setup Details

Include the details about the versions of:

  • OS type and version: Windows 10
  • Python: 3.7.6
  • autokeras: 1.0.2
  • keras-tuner: 1.0.1
  • scikit-learn: 0.22
  • numpy: 1.18.1
  • pandas: 1.0.1
  • tensorflow: 2.1.0

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Reactions: 1
  • Comments: 7 (2 by maintainers)

Top GitHub Comments

5 reactions
StephenGoedhart commented, Jun 9, 2020

Got the exact same issue.

...
network = ak.StructuredDataClassifier(max_trials=1, num_classes=2) #1 trial to speed the process up for testing purposes
network.fit(x_train, y_train, epochs=1) #1  epoch to speed the process up for testing purposes

network.evaluate(x_test, y_test) #works fine

model = network.export_model()
model.save("model_autokeras.h5")

from tensorflow.keras.models import load_model

loaded_model = load_model("./model_autokeras.h5", custom_objects=ak.CUSTOM_OBJECTS)
loaded_model.evaluate(x_test, y_test) #UnimplementedError: Cast double to string is not supported

2 reactions
Jacobsjj2 commented, Sep 30, 2020

I currently have the same issue. The model seems to work just fine when not exported, but the loaded model complains about a cast involving strings and floats, neither of which is present in my training data after checking.
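
The same string-cast sketch from above may apply to the loaded .h5 model as well (an assumption, not something either commenter confirmed):

# Assumption: x_test, y_test and loaded_model come from the comment above.
loaded_model.evaluate(x_test.astype(str), y_test)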

