When using ELWC - OP_REQUIRES failed at example_parsing_ops.cc:91 : Invalid argument: Could not parse example input
Hello Team,
I trained a TF Ranking model (basing my training on the following example: https://github.com/tensorflow/ranking/blob/master/tensorflow_ranking/examples/tf_ranking_tfrecord.py) and saved it using estimator.export_saved_model('my_model', serving_input_receiver_fn). The model trained successfully and was saved without any warnings or errors.
I deployed the model to a local TensorFlow ModelServer and made a call to it over HTTP using cURL as described in https://www.tensorflow.org/tfx/serving/api_rest#request_format. Unfortunately, I see the following error after making the request:
W external/org_tensorflow/tensorflow/core/framework/op_kernel.cc:1655] OP_REQUIRES failed at example_parsing_ops.cc:91 : Invalid argument: Could not parse example input, value: '\n\035\n\021ctx_f0\022\010\022\006\n\004\000\000\340@'
and the HTTP response body returned to the client:
{ "error": "Could not parse example input, value: \'\n\035\n\021ctx_f0\022\010\022\006\n\004\000\000\340@\'\n\t [[{{node ParseExample/ParseExample}}]]" }
I understand that this may be a serialization problem, where my input was not properly serialized, but saving the model with the generated serving_input_receiver_fn produced no errors or warnings, so I am not sure where to start looking to resolve this.
I am providing some details below; please let me know if you need more information.
Details
TF framework module versions
tensorflow-serving-api==2.0.0
tensorflow==2.0.0
tensorflow-ranking==0.2.0
Some training parameters and functions
_CONTEXT_FEATURES = {'ctx_f0'}
_DOCUMENT_FEATURES = {'f0', 'f1', 'f2'}
_DATA_FORMAT = tfr.data.ELWC
_PADDING_LABEL = -1
def example_feature_columns():
    """Feature columns for the per-document (example) features."""
    spec = {}
    for f in _DOCUMENT_FEATURES:
        spec[f] = tf.feature_column.numeric_column(
            f, shape=(1,), default_value=_PADDING_LABEL, dtype=tf.float32)
    return spec

def context_feature_columns():
    """Feature columns for the context (query-level) features."""
    spec = {}
    for f in _CONTEXT_FEATURES:
        spec[f] = tf.feature_column.numeric_column(
            f, shape=(1,), default_value=_PADDING_LABEL, dtype=tf.float32)
    return spec
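For reference, make_parse_example_spec (used in the next step) should turn each of these columns into a fixed-length float feature with the padding value as its default; roughly:

# Illustration only -- the spec derived from example_feature_columns() is
# equivalent to something like:
# {'f0': tf.io.FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=-1.0),
#  'f1': tf.io.FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=-1.0),
#  'f2': tf.io.FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=-1.0)}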
Creating the serving_input_receiver_fn
context_feature_spec = tf.feature_column.make_parse_example_spec(
    context_feature_columns().values())
example_feature_spec = tf.feature_column.make_parse_example_spec(
    example_feature_columns().values())

serving_input_receiver_fn = tfr.data.build_ranking_serving_input_receiver_fn(
    data_format=_DATA_FORMAT,
    list_size=20,
    default_batch_size=None,
    receiver_name="input_ranking_data",
    context_feature_spec=context_feature_spec,
    example_feature_spec=example_feature_spec)
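The export itself is then the single call from the description above; a minimal sketch, assuming estimator is the trained ranking estimator from the example script:

# `estimator` is the trained tf.estimator built as in tf_ranking_tfrecord.py.
# The receiver fn above exposes one string tensor ("input_ranking_data") that
# expects batches of *serialized* ExampleListWithContext (ELWC) protos.
export_dir = estimator.export_saved_model('my_model', serving_input_receiver_fn)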
I then make a REST API call to the local TensorFlow ModelServer using the following cURL request:
curl -H "Content-Type: application/json" \
-X POST http://192.168.99.100:8501/v1/models/my_model/versions/1587842143:regress \
-d '{"context": {"ctx_f0": 7.2}, "examples":[{"f0":[35.92],"f1":[5.258],"f2":[5.261]},{"f0":[82.337],"f1":[2.06],"f2":[2.068]}]}'
This request produces the error already quoted at the top of this issue (OP_REQUIRES failed at example_parsing_ops.cc:91 : Invalid argument: Could not parse example input).
Hi @ramakumar1729,
Sorry to comment on this closed issue, but I thought I would share my latest (positive & successful) findings regarding making requests to the predict serving REST API. Perhaps others may find them useful.
TL;DR
I was able to successfully POST a serialized & base64-encoded ELWC proto to the predict REST API and get the expected predictions. These predictions exactly match the ones I get when I send the same ELWC proto to the TensorFlow ModelServer over gRPC. This gave me confidence in behavior parity: making inference requests over HTTP vs. gRPC produces consistent results for the same ELWC.
Details
Previously, in https://github.com/tensorflow/ranking/issues/189#issuecomment-620256362, I was creating a tensor_proto out of the serialized ELWC proto, then serializing that tensor_proto & base64-encoding it, and POSTing the result to the API.
I found out that I should not create a tensor_proto out of the serialized ELWC proto string. What needs to be serialized & base64-encoded is the ELWC proto itself. As a result, my cURL request looks the same as before; the difference is that the b64 string now holds the serialized ELWC proto. Roughly, the flow looks like the sketch below.
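(A minimal sketch of that flow, not my exact code: it assumes the feature values from this issue, the Python requests library, and the default "serving_default" signature name; adjust to whatever your export produced.)

import base64
import json
import requests
from tensorflow_serving.apis import input_pb2

# Build the ELWC proto: one context Example plus one Example per document.
elwc = input_pb2.ExampleListWithContext()
elwc.context.features.feature['ctx_f0'].float_list.value.append(7.2)
for f0, f1, f2 in [(35.92, 5.258, 5.261), (82.337, 2.06, 2.068)]:
    example = elwc.examples.add()
    example.features.feature['f0'].float_list.value.append(f0)
    example.features.feature['f1'].float_list.value.append(f1)
    example.features.feature['f2'].float_list.value.append(f2)

# Serialize & base64-encode the ELWC proto itself -- no tensor_proto wrapping.
b64 = base64.b64encode(elwc.SerializeToString()).decode('utf-8')

# Shorthand form: "instances" carries one {"b64": ...} entry per ELWC.
payload = {'instances': [{'b64': b64}]}

# More descriptive form: name the signature and the receiver explicitly
# ("serving_default" is an assumption -- use whatever your export produced).
payload = {
    'signature_name': 'serving_default',
    'inputs': {'input_ranking_data': [{'b64': b64}]},
}

resp = requests.post(
    'http://192.168.99.100:8501/v1/models/my_model'
    '/versions/1587842143:predict',
    data=json.dumps(payload))
print(resp.json())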
We can be a little more descriptive by specifying the signature_name and the receiver_name (whatever was defined when creating the serving_input_receiver_fn); the second payload in the sketch above shows that form. The predictions from these REST requests match the prediction in the gRPC response when making the equivalent gRPC request, sketched below.
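(Again a sketch rather than my exact code; it assumes the standard tensorflow-serving-api gRPC stubs, the default gRPC port 8500, and the elwc proto built above.)

import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

channel = grpc.insecure_channel('192.168.99.100:8500')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = 'my_model'
request.model_spec.version.value = 1587842143
request.model_spec.signature_name = 'serving_default'  # assumption, as above

# Over gRPC the serialized ELWC goes in as raw bytes -- no base64 needed.
request.inputs['input_ranking_data'].CopyFrom(
    tf.make_tensor_proto([elwc.SerializeToString()], dtype=tf.string))

response = stub.Predict(request, timeout=10.0)
print(response.outputs)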
@azagniotov Thanks for sharing your experience in successfully serving the ranking model with serialized ELWC inputs~ This will be a useful reference for others trying to do the same.