Building a multiclass classification model
Hi,
I’m opening this issue as a continuation of #1001. I now understand how SoftmaxLikelihood works (thanks a lot!), but I haven’t been able to perform multiclass classification because I’m encountering the following error during a call to the marginal log likelihood:
Traceback (most recent call last):
File "multiclass_classifier_for_issue.py", line 91, in <module>
log_lik, kl_div, log_prior = mll(output, y_batch,combine_terms=False)
File "/home/mgarort/housekeeping/virtualenv/ml2/lib/python3.6/site-packages/torch/tensor.py", line 421, in __iter__
raise TypeError('iteration over a 0-d tensor')
TypeError: iteration over a 0-d tensor
I think I’m using SoftmaxLikelihood correctly. I’ve created a short, clean script that reproduces the error:
multiclass_classifier_for_issue.zip
For convenience, I have also created a Colab notebook where you can execute the code directly: https://colab.research.google.com/drive/1Sg8kiqWajy6fyeu5G0sgZhR1Pqdkj8tS
Do you think you could take a look at it and let me know what the problem might be? In the hyperparameters section, you can set mixing_weights to True or False depending on whether a mixing matrix is desired in the softmax likelihood. If True, the GP model’s number of tasks is 100; if False, it is the number of classes (10 in this case, for MNIST). In both cases I obtain the same error.
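For context, the two configurations look roughly like this (a minimal sketch; the values and names are illustrative assumptions, not copied from the attached script):

import gpytorch

num_classes = 10  # MNIST

# mixing_weights=True: the GP outputs 100 latent functions, which the
# likelihood mixes into num_classes logits via a learned mixing matrix
likelihood = gpytorch.likelihoods.SoftmaxLikelihood(
    num_features=100, num_classes=num_classes, mixing_weights=True
)

# mixing_weights=False: no mixing matrix, so the GP outputs one latent
# function per class
likelihood = gpytorch.likelihoods.SoftmaxLikelihood(
    num_classes=num_classes, mixing_weights=False
)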
Thanks a lot in advance.
Top GitHub Comments
Thanks for the Colab notebook! It looks like you might have some code from an older version of the tutorial. By default, the output of the mll is a scalar. If you pass combine_terms=False to the mll object, you instead get the different components of the variational ELBO (log_lik, kl_div, log_prior) on their own; this is mostly useful for debugging purposes. To fix this, either pass the combine_terms=False argument when constructing the mll, or keep the scalar output and use the training loop from the tutorial, which is correct. Both options are sketched below.
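A minimal sketch of both options, assuming an approximate GP model and variables named model, likelihood, x_batch, y_batch, and train_y (none of these are taken from the attached script):

import gpytorch

# Option 1: request the separate ELBO terms by passing combine_terms=False
# when constructing the ELBO, then unpack them in the training loop
mll = gpytorch.mlls.VariationalELBO(likelihood, model, num_data=train_y.size(0), combine_terms=False)
output = model(x_batch)
log_lik, kl_div, log_prior = mll(output, y_batch)
loss = -(log_lik - kl_div + log_prior)
loss.backward()

# Option 2: keep the default combine_terms=True and treat the output as a
# single scalar, as the current tutorial does
mll = gpytorch.mlls.VariationalELBO(likelihood, model, num_data=train_y.size(0))
output = model(x_batch)
loss = -mll(output, y_batch)
loss.backward()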
@KeAWang Thank you for the guidance, it’s clear to me now.