
Cannot save TFSwinForImageClassification as SavedModel

See original GitHub issue

System Info

  • transformers version: 4.20.1
  • Platform: Linux-5.4.188+-x86_64-with-Ubuntu-18.04-bionic
  • Python version: 3.7.13
  • Huggingface_hub version: 0.8.1
  • PyTorch version (GPU?): 1.12.0+cu113 (True)
  • Tensorflow version (GPU?): 2.8.2 (True)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: No
  • Using distributed or parallel set-up in script?: No

Who can help?

@Rocketknight1 @sgugger

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, …)
  • My own task or dataset (give details below)

Reproduction

import tensorflow as tf
import requests
from PIL import Image
from transformers import AutoFeatureExtractor, TFSwinForImageClassification

swinModel = "microsoft/swin-tiny-patch4-window7-224"  # checkpoint shown in the traceback below
swin_EXPORT_PATH = "./swin_saved_model"  # any writable export directory

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

feature_extractor = AutoFeatureExtractor.from_pretrained(swinModel)
model = TFSwinForImageClassification.from_pretrained(swinModel)

# Eager inference works as expected
inputs = feature_extractor(images=image, return_tensors="tf")
outputs = model(inputs.pixel_values)
logits = outputs.logits
# model predicts one of the 1000 ImageNet classes
predicted_class_idx = tf.math.argmax(logits, -1).numpy()[0]
print("Predicted class:", model.config.id2label[predicted_class_idx])

# Exporting with a fully dynamic serving signature fails
class MySwin(TFSwinForImageClassification):
    @tf.function(
        input_signature=[
            {
                "pixel_values": tf.TensorSpec((None, None, None, None), tf.float32, name="serving1_pixel_values"),
            }
        ]
    )
    def serving1(self, inputs):
        outputs = self.call(pixel_values=inputs["pixel_values"])
        return self.serving_output(outputs)

myswin = MySwin.from_pretrained(swinModel)
tf.saved_model.save(myswin, swin_EXPORT_PATH, signatures={
    "serving1": myswin.serving1,
    # "serving2": mygpt2.serving2
})


All model checkpoint layers were used when initializing MySwin.

All the layers of MySwin were initialized from the model checkpoint at microsoft/swin-tiny-patch4-window7-224.
If your task is similar to the task the model of the checkpoint was trained on, you can already use MySwin for predictions without further training.
---------------------------------------------------------------------------
OperatorNotAllowedInGraphError            Traceback (most recent call last)
<ipython-input-13-b219bb00369a> in <module>()
      1 myswin = MySwin.from_pretrained(swinModel)
      2 tf.saved_model.save(myswin, swin_EXPORT_PATH, signatures={
----> 3     "serving1": myswin.serving1,
      4     # "serving2": mygpt2.serving2
      5 })

14 frames
/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/func_graph.py in autograph_handler(*args, **kwargs)
   1145           except Exception as e:  # pylint:disable=broad-except
   1146             if hasattr(e, "ag_error_metadata"):
-> 1147               raise e.ag_error_metadata.to_exception(e)
   1148             else:
   1149               raise

OperatorNotAllowedInGraphError: in user code:

    File "<ipython-input-11-84a42b1aca69>", line 10, in serving1  *
        outputs = self.call(pixel_values=inputs["pixel_values"])
    File "/usr/local/lib/python3.7/dist-packages/transformers/modeling_tf_utils.py", line 1426, in run_call_with_unpacked_inputs  *
        return func(self, **unpacked_inputs)
    File "/usr/local/lib/python3.7/dist-packages/transformers/models/swin/modeling_tf_swin.py", line 1439, in call  *
        outputs = self.swin(
    File "/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py", line 67, in error_handler  **
        raise e.with_traceback(filtered_tb) from None

    OperatorNotAllowedInGraphError: Exception encountered when calling layer "swin" (type TFSwinMainLayer).
    
    in user code:
    
        File "/usr/local/lib/python3.7/dist-packages/transformers/modeling_tf_utils.py", line 1426, in run_call_with_unpacked_inputs  *
            return func(self, **unpacked_inputs)
        File "/usr/local/lib/python3.7/dist-packages/transformers/models/swin/modeling_tf_swin.py", line 1142, in call  *
            encoder_outputs = self.encoder(
        File "/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py", line 67, in error_handler  **
            raise e.with_traceback(filtered_tb) from None
    
        OperatorNotAllowedInGraphError: Exception encountered when calling layer "encoder" (type TFSwinEncoder).
        
        in user code:
        
            File "/usr/local/lib/python3.7/dist-packages/transformers/models/swin/modeling_tf_swin.py", line 906, in call  *
                layer_outputs = layer_module(
            File "/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py", line 67, in error_handler  **
                raise e.with_traceback(filtered_tb) from None
        
            OperatorNotAllowedInGraphError: Exception encountered when calling layer "layers.0" (type TFSwinStage).
            
            in user code:
            
                File "/usr/local/lib/python3.7/dist-packages/transformers/models/swin/modeling_tf_swin.py", line 838, in call  *
                    layer_outputs = layer_module(
                File "/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py", line 67, in error_handler  **
                    raise e.with_traceback(filtered_tb) from None
            
                OperatorNotAllowedInGraphError: Exception encountered when calling layer "blocks.0" (type TFSwinLayer).
                
                in user code:
                
                    File "/usr/local/lib/python3.7/dist-packages/transformers/models/swin/modeling_tf_swin.py", line 733, in call  *
                        self.set_shift_and_window_size(input_dimensions)
                    File "/usr/local/lib/python3.7/dist-packages/transformers/models/swin/modeling_tf_swin.py", line 672, in set_shift_and_window_size  *
                        if min(input_resolution) <= self.window_size:
                
                    OperatorNotAllowedInGraphError: using a `tf.Tensor` as a Python `bool` is not allowed: AutoGraph did convert this function. This might indicate you are trying to use an unsupported feature.
                
                
                Call arguments received:
                  • hidden_states=tf.Tensor(shape=(None, None, 96), dtype=float32)
                  • input_dimensions=('tf.Tensor(shape=(), dtype=int32)', 'tf.Tensor(shape=(), dtype=int32)')
                  • head_mask=None
                  • output_attentions=False
                  • training=False
            
            
            Call arguments received:
              • hidden_states=tf.Tensor(shape=(None, None, 96), dtype=float32)
              • input_dimensions=('tf.Tensor(shape=(), dtype=int32)', 'tf.Tensor(shape=(), dtype=int32)')
              • head_mask=None
              • output_attentions=False
              • training=False
        
        
        Call arguments received:
          • hidden_states=tf.Tensor(shape=(None, None, 96), dtype=float32)
          • input_dimensions=('tf.Tensor(shape=(), dtype=int32)', 'tf.Tensor(shape=(), dtype=int32)')
          • head_mask=['None', 'None', 'None', 'None']
          • output_attentions=False
          • output_hidden_states=False
          • return_dict=True
          • training=False
    
    
    Call arguments received:
      • self=tf.Tensor(shape=(None, None, None, None), dtype=float32)
      • pixel_values=None
      • bool_masked_pos=None
      • head_mask=None
      • output_attentions=False
      • output_hidden_states=False
      • return_dict=True
      • training=False


Expected behavior

The code is supposed to produce a SavedModel, but instead it fails with the error above. The SavedModel is needed for TensorFlow Serving.
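
The error comes from set_shift_and_window_size, which compares the input resolution against the window size with a plain Python if; with a fully dynamic TensorSpec the spatial dimensions are symbolic tensors, so the comparison raises OperatorNotAllowedInGraphError. As a possible workaround on 4.20.1 (a sketch only, not verified against that release; the class name, signature name, and export path below are placeholders), one can pin the spatial dimensions in the serving signature so the values reaching that comparison are concrete Python ints, keeping only the batch dimension dynamic:

import tensorflow as tf
from transformers import TFSwinForImageClassification

swinModel = "microsoft/swin-tiny-patch4-window7-224"
STATIC_EXPORT_PATH = "./swin_saved_model_static"  # placeholder export directory

class MyStaticSwin(TFSwinForImageClassification):
    # Pin the spatial dimensions (3 x 224 x 224 for this checkpoint) so the
    # shape values used by the window-size check are Python ints rather than
    # symbolic tensors; the batch dimension stays dynamic.
    @tf.function(
        input_signature=[
            {
                "pixel_values": tf.TensorSpec((None, 3, 224, 224), tf.float32, name="pixel_values"),
            }
        ]
    )
    def serving_static(self, inputs):
        outputs = self.call(pixel_values=inputs["pixel_values"])
        return self.serving_output(outputs)

myswin_static = MyStaticSwin.from_pretrained(swinModel)
tf.saved_model.save(
    myswin_static,
    STATIC_EXPORT_PATH,
    signatures={"serving_default": myswin_static.serving_static},
)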

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 7 (5 by maintainers)

Top GitHub Comments

1 reaction
amyeroberts commented, Jul 18, 2022

@gante Yep - I believe so. I’ve opened a PR here: https://github.com/huggingface/transformers/pull/18153

0 reactions
amyeroberts commented, Jul 22, 2022

Following the merge of #18153, the reproduction snippet runs on main without error.
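
For reference, on a version of transformers that includes #18153, the built-in export path should also cover the TensorFlow Serving use case without a custom signature. A minimal sketch (the export directory below is just a placeholder):

from transformers import TFSwinForImageClassification

model = TFSwinForImageClassification.from_pretrained("microsoft/swin-tiny-patch4-window7-224")

# In addition to the config and H5 weights, this writes a TensorFlow SavedModel
# under ./swin_export/saved_model/1, which TensorFlow Serving can load directly.
model.save_pretrained("./swin_export", saved_model=True)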
