Initialization of num_data parameter in mll.VariationalELBO
In the Large-Scale Stochastic Variational GP Regression (CUDA) (w/ KISS-GP) notebook (https://gpytorch.readthedocs.io/en/latest/examples/05_Scalable_GP_Regression_Multidimensional/SVDKL_Regression_GridInterp_CUDA.html), shouldn’t the num_data parameter be initialized with batch_size rather than train_y.size(0)?
mll = gpytorch.mlls.VariationalELBO(likelihood, model.gp_layer, num_data=train_y.size(0), combine_terms=False)
Issue Analytics
- State:
- Created 5 years ago
- Comments: 13 (5 by maintainers)
Top GitHub Comments
@Akella17 If it helps, here’s an explanation of the normalization that is happening:
The ELBO for stochastic optimization is something like

    (num_data / num_batch) * E_{q(f)}[log p(y | f)] - KL[q(u) || p(u)]

We divide the ELBO by the total number of data points, leading to

    (1 / num_batch) * E_{q(f)}[log p(y | f)] - (1 / num_data) * KL[q(u) || p(u)]

num_data should be the total number of data points you would pass over in an epoch, whereas num_batch is the number of data points in a single minibatch.

The short answer is no.
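The normalization above can be checked with a short numeric sketch (plain Python, no GPyTorch; the stand-in values for the expected log likelihood and the KL term are made up for illustration):

```python
# Dividing the stochastic ELBO by num_data turns the data-term weight
# num_data/num_batch into 1/num_batch and the KL weight 1 into 1/num_data.
num_data, num_batch = 1000, 50
exp_log_lik = -1.2   # stand-in for E_{q(f)}[log p(y | f)] on one minibatch
kl = 3.4             # stand-in for KL[q(u) || p(u)]

elbo = (num_data / num_batch) * exp_log_lik - kl
normalized = elbo / num_data

# Same value computed with the per-term weights from the comment above:
expected = (1 / num_batch) * exp_log_lik - (1 / num_data) * kl
assert abs(normalized - expected) < 1e-12
```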
The long answer: I’m not sure I totally understand the setup. From the GP-based reinforcement learning I’ve done, here’s the setup I used:
Setting num_train to be the episode length (or approximate episode length) implies that your model is only being trained on a single episode, which is probably not what you want.