
Questions about ragged batching of higher dimensional tensors


After reading the ragged batching documentation, a couple of things remain unclear to me.

The example provided is quite helpful, but I would like to know how to adapt it to my situation, in which I have higher dimensional tensors. Let us say that, as in the example, I also send 3 requests, but instead of shapes [1, 3], [1, 4], [1, 5] I have shapes [1, 3, 32, 32, 1], [1, 4, 32, 32, 1], [1, 5, 32, 32, 1].

  1. Is ragged batching of higher dimensional tensors like this supported?
  2. How will it then batch the examples? Will it concatenate them along the first (batch) dimension into [12, 32, 32, 1]? Or will it concatenate and flatten all values into one big 1-dimensional tensor of shape [12*32*32*1], so that I should modify my model to reshape the ragged input back to [12, 32, 32, 1]? (Both layouts are sketched after this list.)
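
To make the second question concrete, here is a minimal numpy sketch of the two layouts being asked about. It assumes, consistent with the docs' [1, N] example producing a flat [12] tensor, that ragged requests are concatenated element-wise rather than stacked; all shapes come from the scenario above, and nothing here is taken from Triton's actual implementation.

```python
import numpy as np

# Three requests with a variable second dimension, as described above.
requests = [np.zeros((1, n, 32, 32, 1), dtype=np.float32) for n in (3, 4, 5)]

# Assumption: with ragged batching, the requests are concatenated
# element-wise into one flat tensor rather than stacked, mirroring how
# the docs' [1, N] example becomes a [12] tensor.
ragged = np.concatenate([r.reshape(-1) for r in requests])
print(ragged.shape)  # (12288,) == (3 + 4 + 5) * 32 * 32 * 1

# Because only one dimension varies, the model itself can restore the
# stacked layout from the fixed per-item size.
item_size = 32 * 32 * 1
num_items = ragged.size // item_size  # 12
stacked = ragged.reshape(num_items, 32, 32, 1)
print(stacked.shape)  # (12, 32, 32, 1)
```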

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 1
  • Comments: 7 (4 by maintainers)

Top GitHub Comments

1 reaction
GuanLuo commented, Feb 24, 2022

I agree that your suggestion can solve this particular higher dimensional tensor scenario with the existing ragged batch expressiveness, but, as you mentioned, it is not a generic solution, since it cannot properly describe tensors with multiple variable dimensions. And since there is already a plan to add the new kind of batch input, I would be inclined to keep the existing behavior and let the new batch input address multi-dimensional ragged batching.
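
As a rough illustration of what the existing ragged batch expressiveness buys in this particular scenario, the sketch below shows the model-side splitting that the workaround implies. The variable names and values are hypothetical, and the use of per-request accumulated element counts (the kind of value the BATCH_ACCUMULATED_ELEMENT_COUNT batch input described in the ragged batching docs supplies) is an assumption for illustration, not code from the issue.

```python
import numpy as np

# Hypothetical inputs, for illustration only. Assume the model receives
# the flat concatenated tensor plus accumulated element counts: each
# request of shape [1, n, 32, 32, 1] contributes n * 1024 elements.
ragged = np.arange(12 * 32 * 32 * 1, dtype=np.float32)
accumulated = np.array([3 * 1024, 7 * 1024, 12 * 1024])

# Split the flat tensor back into per-request chunks and restore the
# fixed trailing dimensions, recovering the original variable shapes.
chunks = np.split(ragged, accumulated[:-1])
restored = [c.reshape(-1, 32, 32, 1) for c in chunks]
print([t.shape for t in restored])
# [(3, 32, 32, 1), (4, 32, 32, 1), (5, 32, 32, 1)]
```

Because the counts only describe one flattened size per request, this trick works only when a single dimension varies, which is exactly why a new batch input kind is needed for tensors with multiple variable dimensions.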

0 reactions
stengoes commented, Feb 28, 2022

OK, thanks for sharing your insights and your plans for future support of ragged batching of higher dimensional tensors.

I will close this issue now.


