
Not able to adjust the NeuralNetwork parameters


Summary

I am using the “Extracting the configuration into a file” approach for training and running the code, but I have not been able to adjust the neural network options (such as errorThresh) through the settings in the conf.json file.

Simplest Example to Reproduce

Here is the code I am running (from the quickstart):

const { dockStart } = require('@nlpjs/basic');

(async () => {
  // dockStart() with no arguments picks up the configuration from conf.json
  const dock = await dockStart();
  const nlp = dock.get('nlp');
  await nlp.train();
  const response = await nlp.process('en', 'some query');
  console.log(response);
})();

and I am using these settings in the conf.json file:

{
  "settings": {
    "nlp": {
      "corpora": ["./model/corpus.json"],
      "threshold": 0.8,
      "nlu": {
        "log": true,
        "errorThresh": 0.0000005,
        "deltaErrorThresh": 0.00000001
      }
    }
  },
  "use": ["Basic", "LangEn"]
}

I looked throughout the documentation and the code, but there was no definitive example of how to structure the input. There was one hint that the structure shown above might be the way. However, no matter what values I change the errorThresh or deltaErrorThresh parameters to, the training log does not change. I also tried several other ways to specify these parameters in conf.json, with no effect on the training log. For all the combinations I tried, the training results were identical and no errors were reported.

How should the conf.json be structured to change these parameters? Thanks in advance!

Current Behavior

I am not able to affect the training log with any of the inputs I have tried.

Context

My model is training very quickly, but I don’t know if the training is stopping prematurely. That is why I would like to adjust these training parameters.

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
aigloss commented, Nov 24, 2022

Hi @sjscotti, @Apollon77,

The parameters errorThresh and deltaErrorThresh are actually used by the NeuralNetwork when training the corpus:

https://github.com/axa-group/nlp.js/blob/66d556cca6b766568c9f87c3e19defac6ddd0dc2/packages/neural/src/neural-network.js#L172-L173

To let those parameters reach the NeuralNetwork you’ll have to add them in the ‘nlu-<language>’ section of your settings, which would make your configuration look like this:

{
  "settings": {
    "nlp": {
      "corpora": ["./corpus-en.json"]
    },
    "nlu-??": {
      "log": true,
      "errorThresh": 0.0000005,
      "deltaErrorThresh": 0.00000001
    }
  },
  "use": ["Basic", "ConsoleConnector"]
}

Note the use of -??. That is a common pattern used in this project to act as a wildcard. So, in the example above, those settings would be applied for any NeuralNetwork. If, on the other hand, you’d like to target the NLU for a specific language, you could just use something like nlu-en.
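
As an adapted sketch (not a snippet from this thread), combining the corpus path and plugins from the original question with the nlu-en naming pattern described above, a language-specific conf.json could look like this:

{
  "settings": {
    "nlp": {
      "corpora": ["./model/corpus.json"]
    },
    "nlu-en": {
      "log": true,
      "errorThresh": 0.0000005,
      "deltaErrorThresh": 0.00000001
    }
  },
  "use": ["Basic", "LangEn"]
}

With log enabled, the status lines printed during nlp.train() should reflect the new thresholds, which is a quick way to verify that the settings are actually reaching the NeuralNetwork.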

Be careful when tweaking those values, though, as lower values might lead to a much longer training time with little improvement in accuracy.

0 reactions
Apollon77 commented, Nov 24, 2022

Looking forward to hearing from you 😃
