
reference custom tflite model from inside graph definition


Plugin Version or Commit ID

6b8c6743f23539f7604e74dc260b01e0f58f1707

Unity Version

2020.3.26f1

Your Host OS

Windows 10 Pro

Target Platform

UnityEditor

Description

I’d like to adapt the HelloWorld Scene with a graph that:

  • accepts a simple float array as input
  • converts the array into a tflite tensor
  • runs a tflite model
  • returns the output of the network (likewise a float array)

While this little toy example runs fine in MediaPipe with some example driver code in C++, I cannot get it working in MediaPipeUnityPlugin.

In C++ I just had to specify a relative path to the tflite model in the TfLiteInferenceCalculator section of the graph definition:

model_path: "mediapipe/models/my_model.tflite"

Obviously this line has to be changed when switching to Unity, but how?
What are the exact steps to reference the tflite model correctly?

Code to Reproduce the issue

The graph definition looks like this:

var configText = @"
        input_stream: ""MATRIX:in""
        output_stream: ""FLOATS:out""

        node 
        {
          calculator: ""TfLiteConverterCalculator""
          input_stream: ""MATRIX:in""
          output_stream: ""TENSORS:image_tensor""
          options: 
          {
              [mediapipe.TfLiteConverterCalculatorOptions.ext]
              {
              zero_center: false
              }
          }
        }

        node 
        {
          calculator: ""TfLiteInferenceCalculator""
          input_stream: ""TENSORS:image_tensor""
          output_stream: ""TENSORS:tensor_features""
          options: 
          {
            [mediapipe.TfLiteInferenceCalculatorOptions.ext] 
            {
              model_path: ""mediapipe/models/my_model.tflite""
            }
          }
        }

        node 
        {
          calculator: ""TfLiteTensorsToFloatsCalculator""
          input_stream: ""TENSORS:tensor_features""
          output_stream: ""FLOATS:out""
        }
      ";
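As far as I understand the plugin, MediaPipe's native file loading is intercepted by the plugin's resource-manager layer, so the graph references the model by the key the asset was registered under rather than by a bazel-relative source path. Assuming the model file was registered under the name `my_model.bytes` (the StreamingAssets copy mentioned under Additional Context below), the inference node's option might then look like this; this is a sketch of the convention, not confirmed for every plugin version:

```
node
{
  calculator: "TfLiteInferenceCalculator"
  input_stream: "TENSORS:image_tensor"
  output_stream: "TENSORS:tensor_features"
  options:
  {
    [mediapipe.TfLiteInferenceCalculatorOptions.ext]
    {
      # key under which the asset was registered, not a bazel path
      model_path: "my_model.bytes"
    }
  }
}
```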

Here is the (not yet tested) driver code for the above graph:

      var graph = new CalculatorGraph(configText);
      var poller = graph.AddOutputStreamPoller<float[]>("out").Value();

      // load tflite model from assets
      // ... (not sure what to do here)

      Debug.Log("StartRun");
      graph.StartRun().AssertOk();
      for (var i = 0; i < 10; i++)
      {
        var floatArray = new float[6] { 0, 1, 2, 3, 4, 5 };
        var input = new FloatArrayPacket(floatArray, new Timestamp(i));
        graph.AddPacketToInputStream("in", input).AssertOk();
      }
      graph.CloseInputStream("in").AssertOk();

      Debug.Log("Poll output");
      var output = new FloatArrayPacket();
      while (poller.Next(output))
      {
        Debug.Log(output.Get());
      }

      graph.WaitUntilDone().AssertOk();
      graph.Dispose();

      Debug.Log("Done");
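A hedged sketch of what the missing model-loading step might look like, based on how other scenes in MediaPipeUnityPlugin prepare their models: the plugin ships resource managers (e.g. `StreamingAssetsResourceManager`) whose `PrepareAssetAsync` copies an asset to a location the native resource provider can read, under a given key. The class and method names here are assumptions taken from the plugin around this commit and may differ in other versions:

```csharp
// Sketch only: StreamingAssetsResourceManager and PrepareAssetAsync are
// assumed API names from MediaPipeUnityPlugin and may differ per version.
using System.Collections;
using Mediapipe.Unity;
using UnityEngine;

public class MyModelLoader : MonoBehaviour
{
  private IEnumerator Start()
  {
    // Copies my_model.bytes out of StreamingAssets and registers it under
    // the key that the graph's model_path refers to.
    var resourceManager = new StreamingAssetsResourceManager();
    yield return resourceManager.PrepareAssetAsync("my_model.bytes", "my_model.bytes");

    // ... then build and run the CalculatorGraph as in the driver code above.
  }
}
```

The key idea is that preparing the asset must happen before `graph.StartRun()`, since the inference calculator resolves `model_path` when the graph is initialized.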

Additional Context

I tried

  • moving my_model.tflite to the StreamingAssets folder (where apparently all the other tflite models reside) and renaming it to my_model.bytes, but it’s still unclear to me how to reference it from there in the graph definition.

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 22 (21 by maintainers)

Top GitHub Comments

4 reactions · mgarbade commented, Jul 13, 2022

Puhh, finally I got things working! 😄 🚀 🍻 Thanks a lot for your help @homuler!

I will try to clean up the code a bit; feel free to comment on the code quality in case you are interested in me creating a pull request out of this.

1 reaction · mgarbade commented, Jul 7, 2022

Oops, sorry. I just fixed the link; it should work now.
