[Docs] Multitask GP & classification with missing values
Hi, I am new to this framework and to GPs in general, so I am not sure if this is the right place to ask this kind of question.
I am trying to perform online classification on a variable-length multivariate time-series dataset with missing values. I am following the method of this paper: https://arxiv.org/abs/1706.04152 - i.e. my framework is a multitask GP feeding into a PyTorch model.
I have followed the documentation closely & defined my interpolation model this way:
```python
class MTGPInterpolationModel(gpytorch.models.ExactGP):
    def __init__(self, input_dim, train_x, train_y, likelihood):
        super(MTGPInterpolationModel, self).__init__(train_x, train_y, likelihood)
        ...
```
Then, in a parent model, I want to do inference with something like

```python
output = self.predictor_model(self.interpolation_model(vital_features))
```
for a single multivariate sequence described by `vital_features`, and backprop a classification loss through the entire framework. My questions are:
- Is something like this possible? I want to backprop through the interpolation module, but do I still need to define a likelihood loss?
- Do I have to instantiate a new interpolation model/add to the existing model's data for each iteration/observation of my training procedure? (I guess this might be the purpose of `get_fantasy_model`, but it seems to have issues with the multitask GP [/issues/800].)
- I could not find any documentation on dealing with irregularly sampled observations for a multitask GP. What is the recommended approach, and what should the representation be for `train_x` (currently a multivariate temporal sequence with missing values)?
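On the last question, here is a minimal sketch (my own illustration, not from the issue or the paper) of one common representation for irregular multitask data: convert a `(T, D)` array with `np.nan` for missing values into "long" format, one row per observed `(time, task)` pair. The function name `to_long_format` is hypothetical.

```python
import numpy as np

def to_long_format(times, values):
    """times: (T,) timestamps; values: (T, D) with np.nan where unobserved.
    Returns (x, task_idx, y): one entry per *observed* (time, task) pair,
    which is the input shape a Hadamard-style multitask GP expects."""
    t_idx, task_idx = np.nonzero(~np.isnan(values))  # indices of observed entries
    x = times[t_idx]                                 # timestamp of each observation
    y = values[t_idx, task_idx]                      # observed value
    return x, task_idx, y

times = np.array([0.0, 0.5, 1.3])
vals = np.array([[1.0, np.nan],
                 [np.nan, 2.0],
                 [3.0, 4.0]])
x, i, y = to_long_format(times, vals)
# x -> [0.0, 0.5, 1.3, 1.3], i -> [0, 1, 0, 1], y -> [1.0, 2.0, 3.0, 4.0]
```

With this layout, missing entries are simply absent rows rather than placeholder values, so the GP never sees them.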
Sorry for the basic questions. I really appreciate your help!
Issue Analytics
- State:
- Created 4 years ago
- Comments: 5 (1 by maintainers)
Top GitHub Comments
You can use a Hadamard-style GP for non-regular X: https://github.com/cornellius-gp/gpytorch/blob/master/examples/03_Multitask_GP_Regression/Hadamard_Multitask_GP_Regression.ipynb
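To illustrate the idea behind that notebook (a hedged numpy sketch, not the notebook's code): the covariance between observations `(t, i)` and `(t', j)` is the elementwise (Hadamard) product of a data kernel over timestamps and a task covariance `B[i, j]`, so irregularly sampled tasks pose no problem. The function name `hadamard_kernel` is hypothetical.

```python
import numpy as np

def hadamard_kernel(x, tasks, lengthscale=1.0, B=None):
    """x: (N,) timestamps; tasks: (N,) integer task indices;
    B: (num_tasks, num_tasks) task covariance matrix. Defaults to the
    identity (independent tasks) for illustration."""
    if B is None:
        B = np.eye(int(tasks.max()) + 1)
    sq_dist = (x[:, None] - x[None, :]) ** 2
    k_time = np.exp(-0.5 * sq_dist / lengthscale**2)  # RBF kernel over timestamps
    k_task = B[tasks[:, None], tasks[None, :]]        # task-task covariance lookup
    return k_time * k_task                            # Hadamard (elementwise) product
```

In GPyTorch this corresponds to multiplying a data kernel (e.g. `RBFKernel`) by an `IndexKernel` in the model's `forward`, with the task indices passed alongside the inputs, as the linked notebook does.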
Thanks a lot!