
[Feature Request] Input Transformations


🚀 Feature Request

Essentially, upstream BoTorch’s InputTransformation from https://github.com/pytorch/botorch/blob/master/botorch/models/transforms/input.py to GPyTorch. This would make it possible to add either fixed or learnable transformations that are automatically applied when training models.
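
For reference, a minimal sketch of the kind of transform module this refers to (the class and API here are illustrative, not BoTorch’s actual code; a learnable transform would register nn.Parameters instead of buffers):

```python
import torch
from torch import nn

class Normalize(nn.Module):
    """Illustrative fixed (non-learnable) input transform that min-max
    scales inputs to the unit cube, given known bounds."""

    def __init__(self, bounds: torch.Tensor):
        super().__init__()
        # bounds: 2 x d tensor of per-dimension (lower, upper) bounds
        self.register_buffer("bounds", bounds)

    def forward(self, X: torch.Tensor) -> torch.Tensor:
        lower, upper = self.bounds[0], self.bounds[1]
        return (X - lower) / (upper - lower)
```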

Motivation

This makes it possible to normalize inputs, but also to combine the GP with a learnable transformation, which simplifies model setup. We currently have this in BoTorch, where we essentially apply the transform in the forward methods.

Additional context

We recently worked on having input transformations that can change the shape of the input (https://github.com/pytorch/botorch/pull/819), which caused some headaches about how best to set this up without a ton of boilerplate code. We were hoping to do this in the __call__ rather than the forward method, but this collides with some of GPyTorch’s assumptions. Moving this functionality upstream into GPyTorch would allow us to solve these challenges more organically.

Describe alternatives you’ve considered

You could keep doing this the way we do it right now, but then one would have to add boilerplate transformation code to every implementation of forward (a sketch follows below).
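
To make the boilerplate concrete, here is a rough sketch of the status quo (a hypothetical model, not actual BoTorch code):

```python
import gpytorch

class TransformedExactGP(gpytorch.models.ExactGP):
    """Every model currently has to remember to apply the transform
    itself at the top of its forward method."""

    def __init__(self, train_x, train_y, likelihood, input_transform):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())
        self.input_transform = input_transform

    def forward(self, x):
        x = self.input_transform(x)  # boilerplate repeated in every model
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)
```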

Issue Analytics

  • State: open
  • Created: 2 years ago
  • Reactions: 3
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
saitcakmak commented, Jun 14, 2021

I’ll just add concrete examples of what we tried and where it fails.

Apply one-to-many transforms at model.forward(): for a batch x q x d-dim input, these return a batch x new_q x m-dim output. This fails at the reshape operation here: https://github.com/cornellius-gp/gpytorch/blob/a0d8cd2d379742fd2c72a22fe9fcc16e43b3d843/gpytorch/models/exact_prediction_strategies.py#L43
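
For illustration, a hypothetical one-to-many transform of this kind (loosely modeled on the input perturbation idea in pytorch/botorch#819; here m = d for simplicity):

```python
import torch
from torch import nn

class AppendPerturbations(nn.Module):
    """Hypothetical one-to-many transform: appends n_w perturbed copies
    of each point, mapping a batch x q x d input to batch x (q * n_w) x d."""

    def __init__(self, perturbations: torch.Tensor):
        super().__init__()
        self.register_buffer("perturbations", perturbations)  # n_w x d

    def forward(self, X: torch.Tensor) -> torch.Tensor:
        # batch x q x 1 x d + n_w x d -> batch x q x n_w x d
        expanded = X.unsqueeze(-2) + self.perturbations
        # flatten the q and n_w dimensions: batch x (q * n_w) x d
        return expanded.reshape(*X.shape[:-2], -1, X.shape[-1])
```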

Define a wrapper around ExactGP.__call__ and apply the transforms before calling ExactGP.__call__: this leads to https://github.com/cornellius-gp/gpytorch/blob/a0d8cd2d379742fd2c72a22fe9fcc16e43b3d843/gpytorch/models/exact_gp.py#L256. We could maybe get around this by wrapping the call with the debug(False) context manager, but that also breaks tests, so it is probably not a good idea.
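
Roughly what that wrapper looks like (illustrative, reusing the TransformedExactGP sketch from above); the transformed inputs no longer match the stored, untransformed training inputs, which is what trips the check linked above:

```python
class TransformWrappedGP(TransformedExactGP):
    """Illustrative wrapper: transform inputs before delegating to
    ExactGP.__call__ instead of inside forward."""

    def forward(self, x):
        # no transform here; it happens in __call__ below
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)

    def __call__(self, *args, **kwargs):
        args = [self.input_transform(a) for a in args]
        return super().__call__(*args, **kwargs)
```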

The current proposal in pytorch/botorch#819 is to apply the transforms at model.forward at training time and at the posterior call in eval mode to get around these issues (and to do this in each model’s forward and each posterior). This is admittedly not a good design, so upstreaming the input transforms would make it much cleaner.

Edit: Just realized that this actually breaks things, so we don’t have a proper way of applying one-to-many transforms currently.

Edit 2: With some changes to the input transforms (storing the train inputs as transformed), applying them in both model.forward and posterior now works.
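
For concreteness, the pattern described above might look roughly like the following (a hypothetical subclass reusing the TransformedExactGP sketch from earlier; BoTorch’s real posterior() additionally wraps the output in a Posterior object):

```python
class ProposedModel(TransformedExactGP):
    """Sketch of the pytorch/botorch#819 pattern: transform inside
    forward at training time, and inside posterior in eval mode."""

    def forward(self, x):
        if self.training:  # train-time transform happens here
            x = self.input_transform(x)
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)

    def posterior(self, X):
        # eval-time transform happens here instead (simplified sketch)
        self.eval()
        return self(self.input_transform(X))
```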

0 reactions
wjmaddox commented, Aug 10, 2021

After some discussion with @saitcakmak, it looks like placing the input transforms in ApproximateGP.__call__ for variational GPs might be the only feasible option.

If it’s done in the forward pass, then the inducing points would be estimated on the raw, untransformed scale because of the call here.

I have a version of the current BoTorch transforms + variational GPs in https://github.com/pytorch/botorch/pull/895, but it only works for fixed, non-learnable transforms (e.g. not learnable input warping), because it forcibly sets the “training inputs”, and thus the model inputs, to be on the transformed scale, at least when using model fitting utilities such as botorch...fit_gpytorch_torch.
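
A rough sketch of what that option could look like (illustrative, not the actual PR code):

```python
import gpytorch

class TransformedApproximateGP(gpytorch.models.ApproximateGP):
    """Illustrative variational GP that applies the input transform in
    __call__, so the variational strategy (and hence the learned
    inducing points) only ever sees the transformed scale."""

    def __init__(self, inducing_points, input_transform):
        variational_distribution = gpytorch.variational.CholeskyVariationalDistribution(
            inducing_points.size(-2)
        )
        variational_strategy = gpytorch.variational.VariationalStrategy(
            self, inducing_points, variational_distribution,
            learn_inducing_locations=True,
        )
        super().__init__(variational_strategy)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())
        self.input_transform = input_transform

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

    def __call__(self, inputs, **kwargs):
        # transform before dispatching to the variational strategy
        inputs = self.input_transform(inputs)
        return super().__call__(inputs, **kwargs)
```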
