network error rate is unexpected NaN

See original GitHub issue


What is wrong?

I get the error "network error rate is unexpected NaN" when training an RNN on the data below.

How do we replicate the issue?

Training data

const brain = require('brain.js');

// Paste the training array from the linked dataset here.
const trainingData = [/* paste array from link here */];

const net = new brain.recurrent.RNN();
net.train(trainingData);

console.log(net.run(trainingData[0].input));

Expected behavior (i.e. solution)

The network should train successfully, and the output should be ~0.48213740458015264.

Other Comments

The dataset's values are percentages from -100 to 100, normalized to a scale from 0 to 1. I get the same error when I leave out the normalization.
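
For reference, here is a minimal sketch of that normalization (the helper names are illustrative, not from the original issue):

// Map a percentage in [-100, 100] onto [0, 1], and back.
const normalize = (pct) => (pct + 100) / 200;
const denormalize = (value) => value * 200 - 100;

console.log(normalize(-100)); // 0
console.log(normalize(0));    // 0.5
console.log(normalize(100));  // 1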

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Reactions: 2
  • Comments: 10 (7 by maintainers)

Top GitHub Comments

3 reactions
perkyguy commented, Jul 27, 2018

@c4tz and @Connum, the issue with this set is that the training set's outputs don't express much of a swing. They range from ~0.344 to ~0.655, with the majority of results falling in the 0.4-0.5 range. Therefore, we could either re-normalize the output values to span 0-1 (a sketch follows this comment), or get more training data to exercise the full range of output.

I have a little visualization of this net with re-normalized output (inputs are left untouched); you can see that it has a much better swing: https://jsfiddle.net/mubaidr/dw0cL6hj/6/ . You can move the input bars, and the dot represents the output value for those new inputs. This is the normalized output value, so you would have to do a little translation to get it back to the original scale (trivial, though). If you think of the output as confined between the original bounds, you can see how there won't be much variance, and how hard it will be to get it out of that ~0.5 +/- 0.05 range.

As to why it didn’t shift when using the second element, this might have been just an outlier for that set.
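
A minimal sketch of the output re-normalization described above, assuming trainingData is an array of { input, output } pairs whose outputs are arrays of numbers (inputs are left untouched; the function name is hypothetical):

// Min-max scale every output value so the smallest becomes 0 and the largest 1.
function renormalizeOutputs(trainingData) {
  const values = trainingData.flatMap((sample) => sample.output);
  const min = Math.min(...values);
  const max = Math.max(...values);
  return trainingData.map((sample) => ({
    input: sample.input,
    output: sample.output.map((v) => (v - min) / (max - min)),
  }));
}

// A prediction can be translated back to the original scale with v * (max - min) + min.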

3 reactions
Connum commented, Jun 26, 2018

Why isn’t anything happening here? Did you ever get it to work?

I'm currently trying to get the gist of brain.js, but no matter which example I try, I'm always getting NaN. I have some training data quite similar to yours, all normalized to values between 0 and 1, and all I get is NaN. (I'm using NeuralNetwork instead of recurrent.RNN, however.)

I also tried different examples available on the web, cloned their repos, didn't change a single value, ran them, and they failed with NaN. It seems to me like brain.js is currently not working at all, but only a few people seem to complain, and those who do don't get any answer at all.
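
For comparison, here is a minimal feedforward sketch of the kind of setup Connum describes, with inputs and outputs already scaled to 0-1 (the data below is made up purely for illustration):

const brain = require('brain.js');

// Tiny made-up dataset, already in the 0-1 range.
const data = [
  { input: [0.1, 0.9], output: [0.2] },
  { input: [0.4, 0.6], output: [0.5] },
  { input: [0.8, 0.2], output: [0.7] },
];

const net = new brain.NeuralNetwork({ hiddenLayers: [3] });
net.train(data);

console.log(net.run([0.5, 0.5]));

If training data like this still yields NaN, one common culprit (echoed in the results below) is a NaN or undefined value hiding somewhere in the inputs or outputs.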

Read more comments on GitHub.

Top Results From Across the Web

  • How do I fix the NaN training error in Brain.js? - Stack Overflow
    I've tried so far: to use for training only data with output 0 or 1; to normalize data; to flatten the input. Here...

  • Common Causes of NANs During Training
    Common Causes of NANs During Training · Gradient blow up · Bad learning rate policy and params · Faulty Loss function · Faulty...

  • brain-js/Lobby - Gitter
    Hm, what does "network error rate is unexpected NaN, check network configurations and try again" in brain.recurrent.RNN mean? I set hiddenLayers to [2,3] ...

  • Cost function turning into nan after a certain number of iterations
    Well, if you get NaN values in your cost function, it means that the input is outside of the function domain. E.g. the...

  • I am getting a NaN error. What does this mean? - Codecademy
    log(square(2))" I get a logical out put: 4. So I know that part of my code works. However, when I change it to...
