
When using ELWC - OP_REQUIRES failed at example_parsing_ops.cc:91 : Invalid argument: Could not parse example input


Hello Team,

I trained a TF ranking model (basing my training on the following example: https://github.com/tensorflow/ranking/blob/master/tensorflow_ranking/examples/tf_ranking_tfrecord.py) and saved it using estimator.export_saved_model('my_model', serving_input_receiver_fn). The model trained successfully and was saved without any warnings or errors.

I deployed the model to a local TensorFlow ModelServer and made a call to it over HTTP using cURL as described at https://www.tensorflow.org/tfx/serving/api_rest#request_format. Unfortunately I see the following error after making the request:

W external/org_tensorflow/tensorflow/core/framework/op_kernel.cc:1655] OP_REQUIRES failed at example_parsing_ops.cc:91 : Invalid argument: Could not parse example input, value: '\n\035\n\021ctx_f0\022\010\022\006\n\004\000\000\340@'

{ "error": "Could not parse example input, value: \'\n\035\n\021ctx_f0\022\010\022\006\n\004\000\000\340@\'\n\t [[{{node ParseExample/ParseExample}}]]" }

I understand that this is likely a serialization problem, i.e. my input was not properly serialized. However, generating the serving_input_receiver_fn and saving the model with it produced no errors or warnings, so I am not sure where to start looking to resolve this.
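One place to start is the escaped byte string quoted in the error itself, which can be decoded to see what the parser actually received. A minimal sketch using only the standard library; the byte string below is transcribed from the error message, and interpreting the last four bytes as a little-endian float32 is an assumption based on how tf.train.Feature encodes float values:

```python
import struct

# Bytes transcribed from the error message:
# '\n\035\n\021ctx_f0\022\010\022\006\n\004\000\000\340@'
raw = b"\n\x1d\n\x11ctx_f0\x12\x08\x12\x06\n\x04\x00\x00\xe0@"

# The context feature name is present, so the request body did reach the parser.
print(b"ctx_f0" in raw)  # -> True

# The trailing four bytes look like a little-endian float32 payload.
(value,) = struct.unpack("<f", raw[-4:])
print(value)  # -> 7.0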

I am providing some details below; please let me know if you need more information.

Details

TF framework module versions
  • tensorflow-serving-api==2.0.0
  • tensorflow==2.0.0
  • tensorflow-ranking==0.2.0
Some training parameters and functions
  • _CONTEXT_FEATURES = {'ctx_f0'}
  • _DOCUMENT_FEATURES = {'f0', 'f1', 'f2'}
  • _DATA_FORMAT = tfr.data.ELWC
  • _PADDING_LABEL = -1
import tensorflow as tf

def example_feature_columns():
    # Numeric feature columns for the per-document features.
    spec = {}
    for f in _DOCUMENT_FEATURES:
        spec[f] = tf.feature_column.numeric_column(
            f, shape=(1,), default_value=_PADDING_LABEL, dtype=tf.float32)
    return spec

def context_feature_columns():
    # Numeric feature columns for the query-level context features.
    spec = {}
    for f in _CONTEXT_FEATURES:
        spec[f] = tf.feature_column.numeric_column(
            f, shape=(1,), default_value=_PADDING_LABEL, dtype=tf.float32)
    return spec
Creating the serving_input_receiver_fn
import tensorflow_ranking as tfr

context_feature_spec = tf.feature_column.make_parse_example_spec(context_feature_columns().values())
example_feature_spec = tf.feature_column.make_parse_example_spec(example_feature_columns().values())

serving_input_receiver_fn = tfr.data.build_ranking_serving_input_receiver_fn(
    data_format=_DATA_FORMAT,
    list_size=20,
    default_batch_size=None,
    receiver_name="input_ranking_data",
    context_feature_spec=context_feature_spec,
    example_feature_spec=example_feature_spec)

When making a REST API call to a local TensorFlow ModelServer using the following cURL request:

curl -H "Content-Type: application/json" \
-X POST http://192.168.99.100:8501/v1/models/my_model/versions/1587842143:regress \
-d '{"context": {"ctx_f0": 7.2}, "examples":[{"f0":[35.92],"f1":[5.258],"f2":[5.261]},{"f0":[82.337],"f1":[2.06],"f2":[2.068]}]}'

The server then logs the error shown above.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 1
  • Comments: 28 (6 by maintainers)

Top GitHub Comments

6 reactions
azagniotov commented, May 20, 2020

Hi @ramakumar1729 ,

Sorry to comment on this closed issue, but I thought I would share my latest (positive and successful) findings about making requests to the predict REST API. Perhaps others will find them useful:

TL;DR

I was able to successfully POST a serialized, base64-encoded ELWC proto to the predict REST API and get the expected predictions. These predictions exactly match the ones I get when making a gRPC request to the TensorFlow model server with the same ELWC proto.

This gave me confidence in behavior parity: inference requests over HTTP and over gRPC produce consistent results for the same ELWC.
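To make that comparison concrete, the two score lists can be compared with a small tolerance. A minimal sketch with stand-in values (`rest_scores` and `grpc_scores` are hypothetical placeholders for the scores returned by the two endpoints, not real model output):

```python
import math

# Hypothetical score lists returned by the REST and gRPC endpoints.
rest_scores = [0.8231, -1.1077]
grpc_scores = [0.8231, -1.1077]

# Scores should agree up to float round-off if both paths saw the same ELWC.
parity = all(math.isclose(a, b, rel_tol=1e-6) for a, b in zip(rest_scores, grpc_scores))
print(parity)  # -> True
```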

Details

Previously, in https://github.com/tensorflow/ranking/issues/189#issuecomment-620256362, I was creating a tensor_proto out of the serialized ELWC proto, then serializing the tensor_proto and base64-encoding that, which I then POSTed to the API.

I found out that I should not create a tensor_proto out of the serialized ELWC proto string. What needs to be serialized and base64 encoded is the ELWC proto itself. As a result, my cURL looks the same as before; the difference is that the b64 string now holds the serialized ELWC proto:

curl -H "Content-Type: application/json" -X POST \
http://192.168.99.100:8501/v1/models/tf_ranking_v10/versions/1589680164:predict \
-d '{"instances": [{"b64": "CqABCp0BCiQK.... TRUNCATED"}]}'

We can be a little more descriptive by specifying the signature_name and the receiver_name (whatever was defined when creating the serving_input_receiver_fn):

curl -H "Content-Type: application/json" -X POST \
http://192.168.99.100:8501/v1/models/tf_ranking_v10/versions/1589680164:predict \
-d '{"signature_name": "predict", "instances": [{"input_ranking_data": {"b64": "CqABCp0BCiQK.... TRUNCATED"}}]}'

The predictions from the above cURL requests match the prediction returned by the following gRPC request:

import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc
from tensorflow_serving.apis import input_pb2
from google.protobuf import text_format
from google.protobuf.json_format import MessageToDict


# Build the ELWC proto from text format (contents elided here).
EXAMPLE_LIST_WITH_CONTEXT_PROTO = text_format.Parse(
      """
     examples {
          ....
     }
    context {
         .....
    }
    """, input_pb2.ExampleListWithContext())

# Wrap the serialized ELWC in a string tensor of shape [1].
example_list_with_context_proto = EXAMPLE_LIST_WITH_CONTEXT_PROTO.SerializeToString()
tensor_proto = tf.make_tensor_proto(example_list_with_context_proto, dtype=tf.string, shape=[1])

timeout_in_secs = 3
request = predict_pb2.PredictRequest()
request.inputs['input_ranking_data'].CopyFrom(tensor_proto)
request.model_spec.signature_name = 'predict'
request.model_spec.name = 'tf_ranking_v10'

channel = grpc.insecure_channel("0.0.0.0:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

grpc_response = stub.Predict(request, timeout_in_secs)
unpacked_grpc_response = MessageToDict(grpc_response, preserving_proto_field_name=True)

print(unpacked_grpc_response['outputs']['output']['float_val'])
5 reactions
ramakumar1729 commented, May 20, 2020

@azagniotov Thanks for sharing your experience successfully serving the ranking model with serialized ELWC inputs! This will be a useful reference for others trying to do the same.
