Multi-output regression problem
Hi,
My model has several outputs from the forward method:
def forward(self, x):
    # ... (model body elided in the original post) ...
    return ClCd, angle
This returns a tuple, which LR finder does not like. I get the following error message:
if not (target.size() == input.size()):
AttributeError: 'tuple' object has no attribute 'size'
Is there a way for LR finder to work with tuples? Alternatively, should I be structuring the output from my forward method differently (i.e. using a single output tensor)? I tried outputting a single tensor with two columns from my forward method (each column representing an output), but this gave significantly worse results in training.
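For context, the traceback line above comes from the loss function rather than from the LR finder itself: standard PyTorch losses expect a tensor, so handing them the tuple returned by forward is what fails. A minimal sketch that reproduces the situation (the TwoHeadModel name, layer sizes, and the MSELoss choice are illustrative assumptions, not taken from the original issue):

import torch
import torch.nn as nn

class TwoHeadModel(nn.Module):
    # Toy stand-in for a model whose forward returns two tensors.
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(4, 8)
        self.head_clcd = nn.Linear(8, 1)
        self.head_angle = nn.Linear(8, 1)

    def forward(self, x):
        h = torch.relu(self.backbone(x))
        return self.head_clcd(h), self.head_angle(h)  # tuple output

model = TwoHeadModel()
criterion = nn.MSELoss()
x = torch.randn(16, 4)
target = torch.randn(16, 2)

output = model(x)  # a tuple of two tensors, not a single tensor
# criterion(output, target) would raise:
# AttributeError: 'tuple' object has no attribute 'size'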
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
This worked perfectly. Thank you very much to the both of you for all your time, patience and clear instructions!
@davidtvs Agree with that. Making a wrapper for multiple loss functions is way more elegant for such a case!
@Stoops-ML As shown in the link provided by @davidtvs, a loss function is also a subclass of nn.Module. Therefore, you can simply implement a wrapper like the following one. See also this post for the reason why it works.
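The wrapper snippet from that comment did not survive on this page, but the idea it describes is that the loss can itself be an nn.Module whose forward unpacks the model's tuple output and combines per-output losses. A minimal sketch under that assumption (the MultiOutputLoss name, the MSE choice, and the equal weighting are illustrative, not the original code):

import torch.nn as nn

class MultiOutputLoss(nn.Module):
    # Combines per-output losses for a model that returns (ClCd, angle).
    def __init__(self):
        super().__init__()
        self.clcd_loss = nn.MSELoss()
        self.angle_loss = nn.MSELoss()

    def forward(self, outputs, targets):
        # outputs is the tuple returned by the model's forward;
        # targets is assumed to carry the two ground-truth tensors in the same order.
        clcd_pred, angle_pred = outputs
        clcd_true, angle_true = targets
        return self.clcd_loss(clcd_pred, clcd_true) + self.angle_loss(angle_pred, angle_true)

An instance of such a wrapper can then be passed to the LR finder wherever a plain criterion is expected, provided the DataLoader yields targets with the matching structure; how the two terms are weighted relative to each other is a modelling choice.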