
Make DiagLazyTensor work with any batch size

See original GitHub issue

Currently, DiagLazyTensor assumes its input is either a vector or a matrix, which leads to the following inconsistent behavior:

>>> d1 = DiagLazyTensor(torch.ones(4))
>>> d1.ndimension(), d1.shape
2, (4, 4)

>>> d2 = DiagLazyTensor(torch.ones(3, 4))
>>> d2.ndimension(), d2.shape
3, (3, 4, 4)

>>> d3 = DiagLazyTensor(torch.ones(2, 3, 4))
>>> d3.ndimension(), d3.shape
2, (4, 4)

Given @gpleiss’s previous work on supporting general batch shapes, I suggest we modify DiagLazyTensor to support input tensors of general shapes. The last dimension will be interpreted as the diagonal, and the previous dimensions as batches. That is, a DiagLazyTensor(torch.ones(2, 3, 4)) would have 4 dimensions and a batch shape of [2, 3].
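
For illustration, here is a minimal sketch of the proposed semantics, using a hypothetical BatchDiagSketch class rather than the actual GPyTorch implementation: the last dimension of the input tensor holds the diagonal, and every preceding dimension is treated as a batch dimension.

import torch

class BatchDiagSketch:
    """Sketch only: the last dim of `diag` is the diagonal; all leading dims are batch dims."""

    def __init__(self, diag):
        self.diag = diag  # shape: (*batch_shape, n)

    def ndimension(self):
        # One extra dimension, since the diagonal expands to an n x n block.
        return self.diag.dim() + 1

    @property
    def shape(self):
        n = self.diag.shape[-1]
        return torch.Size(self.diag.shape[:-1] + (n, n))  # (*batch_shape, n, n)

    def matmul(self, rhs):
        # Multiplying by a diagonal matrix just scales the rows of rhs (broadcasts over batches).
        return self.diag.unsqueeze(-1) * rhs

    def evaluate(self):
        # Dense equivalent via PyTorch's batched diag_embed.
        return torch.diag_embed(self.diag)

Under these semantics, the third example above would behave as requested:

>>> d3 = BatchDiagSketch(torch.ones(2, 3, 4))
>>> d3.ndimension(), d3.shape
4, (2, 3, 4, 4)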

I need this for implementing Heteroskedastic Likelihoods for batched Multitask GPs.

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 8 (3 by maintainers)

Top GitHub Comments

1 reaction
jacobrgardner commented, Nov 9, 2018

I’m in agreement with this change for sure: a DiagLazyTensor created with a 2 x 3 x 4 tensor should represent a 2 x 3 x 4 x 4 matrix.

Do our other LazyTensors already support multiple batch dimensions? I remember @gpleiss working on this, but wasn’t 100% sure whether it was fully done.
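
For reference, plain PyTorch’s torch.diag_embed already follows exactly these batch semantics; the snippet below is only an illustration of the target shape, not the LazyTensor API:

>>> torch.diag_embed(torch.ones(2, 3, 4)).shape
torch.Size([2, 3, 4, 4])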

0 reactions
Balandat commented, Nov 15, 2018

This was done in #362.

