Error when running inference using softmax observed values

See original GitHub issue

Hi!

I am trying to model observed softmax data using Infer.NET, and I get an error no matter which algorithm I use. My actual model is more complicated, but I have reduced the issue to the simplified Gaussian-softmax example below:

    using System;
    using Microsoft.ML.Probabilistic.Algorithms;
    using Microsoft.ML.Probabilistic.Math;
    using Microsoft.ML.Probabilistic.Models;

    class Program
    {
        static void Main(string[] args)
        {
            // Five observed probability vectors over four categories (each sums to 1).
            var testData = new Vector[]
            {
                Vector.FromArray(0.1, 0.3, 0.5, 0.1),
                Vector.FromArray(0.05, 0.5, 0.2, 0.25),
                Vector.FromArray(0.2, 0.4, 0.3, 0.1),
                Vector.FromArray(0.15, 0.2, 0.35, 0.3),
                Vector.FromArray(0.15, 0.3, 0.4, 0.15)
            };
            Model(testData);
            Console.ReadLine();
        }

        static void Model(Vector[] data)
        {
            // Hierarchical prior: a latent VectorGaussian whose mean and precision are themselves random.
            var priorMean = Variable.VectorGaussianFromMeanAndPrecision(Vector.Constant(4, 0.0), PositiveDefiniteMatrix.Identity(4)).Named("priorMean");
            var priorCov = Variable.WishartFromShapeAndRate(4.0, PositiveDefiniteMatrix.Identity(4)).Named("priorCov");
            var prior = Variable.VectorGaussianFromMeanAndPrecision(priorMean, priorCov).Named("prior");

            // One softmax-transformed copy of the latent vector per document.
            var numDocs = Variable.New<int>().Named("numDocs");
            var docR = new Range(numDocs);
            var arrvals = Variable.Array<Vector>(docR).Named("arrvals");
            arrvals[docR] = Variable.Softmax(prior).ForEach(docR);

            // Observations: the softmax outputs themselves are observed.
            numDocs.ObservedValue = data.Length;
            arrvals.ObservedValue = data;

            // var alg = new ExpectationPropagation();
            // var alg = new VariationalMessagePassing();
            var alg = new GibbsSampling();
            var ieng = new InferenceEngine(alg);
            var compAlg = ieng.GetCompiledInferenceAlgorithm(priorMean, priorCov);
            Console.WriteLine(compAlg.Marginal(priorMean.NameInGeneratedCode));
        }
    }

The errors I get for each algorithm are as follows. For expectation propagation and Gibbs sampling, the compiler reports that the model is unsupported because of MMath.Softmax. For example, the exception for Gibbs sampling:

Error 0: This model is not supported with GibbsSampling due to MMath.Softmax(Vector softmax, IList<double> x). Try using a different algorithm or expressing the model differently Gibbs Sampling requires the conditionals to be conjugate in
MMath.Softmax(prior_rep_uses[0][index0])
Details: System.MissingMethodException: xAverageConditional not found in SoftmaxOp_Bouchard_Sparse,SoftmaxOp_Bouchard,SoftmaxOp_BL06_LBFGS,SoftmaxOp_Bohning,SoftmaxOp_Taylor,SoftmaxOp_KM11,SoftmaxOp_KM11_Sparse2,SoftmaxOp_KM11_Sparse,SoftmaxOp_BL06,VectorSoftmaxOp_KM11,SoftmaxOp_KM11_LBFGS,SoftmaxOp_KM11_LBFGS_Sparse,GammaSoftmaxOp using parameter types: [softmax] Vector,[x] PlaceHolder,[to_x] VectorGaussian,[result] VectorGaussian

For VMP, the model compiles successfully, but I get a NullReferenceException:

Exception thrown: 'System.NullReferenceException' in Microsoft.ML.Probabilistic.dll
An unhandled exception of type 'System.NullReferenceException' occurred in Microsoft.ML.Probabilistic.dll
Object reference not set to an instance of an object.

Any idea whether I am misspecifying something here, or whether there is a workaround for this issue?

Thank you very much in advance!

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 7 (4 by maintainers)

Top GitHub Comments

1 reaction
tminka commented, Jan 21, 2019

Doubling the shape of priorCov seems to work.
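
For reference, a minimal sketch of that change against the model above (the Wishart shape parameter goes from 4.0 to 8.0; everything else stays the same):

    // Suggested tweak: double the shape of the Wishart prior on the precision.
    var priorCov = Variable.WishartFromShapeAndRate(8.0, PositiveDefiniteMatrix.Identity(4)).Named("priorCov");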

0 reactions
tminka commented, Jan 21, 2019

Unfortunately, when I look at the code for Softmax, it is not meant to work when the output is directly observed (as is the case for many factors under VMP). The code should be throwing an exception here. I will fix that. Meanwhile, you shouldn’t trust the output of this program.
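
Not from the thread, but for context: the pattern the softmax factor is typically used for is feeding the softmax output into a discrete observation (a multinomial-logit style construction), rather than observing the softmax vector itself. A rough sketch of that formulation under VMP, with hypothetical observed labels standing in for the observed probability vectors (the data and variable names here are made up for illustration):

    // Hypothetical reformulation: observe discrete draws through the softmax
    // rather than observing the softmax output directly.
    var k = new Range(4).Named("k");   // number of categories
    var prior = Variable.VectorGaussianFromMeanAndPrecision(
        Vector.Constant(4, 0.0), PositiveDefiniteMatrix.Identity(4)).Named("prior");
    var probs = Variable.Softmax(prior).Named("probs");

    var numDocs = Variable.New<int>().Named("numDocs");
    var docR = new Range(numDocs).Named("doc");
    var labels = Variable.Array<int>(docR).Named("labels");
    labels[docR] = Variable.Discrete(k, probs).ForEach(docR);

    // Hypothetical observed category indices, one per document.
    numDocs.ObservedValue = 5;
    labels.ObservedValue = new[] { 2, 1, 1, 2, 2 };

    var engine = new InferenceEngine(new VariationalMessagePassing());
    Console.WriteLine(engine.Infer(prior));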

