
Working of feature/add-dropout

See original GitHub issue

Feature request

The dropout feature is already in development. I wanted to contact @Optimox to ask whether the feature already works or needs more development.

What is the expected behavior?

I expect dropout to be an argument either to TabNetRegressor/TabNetClassifier or to clf.fit. However, when I tried the development branch feature/add-dropout, I got the error: __init__() got an unexpected keyword argument 'dropout'. I checked the code, but it would be useful to understand better why it is failing.
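
For reference, this is roughly the call that triggers the error (a minimal sketch; the exact dropout argument name and placement on the feature/add-dropout branch are assumptions based on the error message):

```python
from pytorch_tabnet.tab_model import TabNetRegressor

# Reproduction of the failure described above: passing dropout at
# construction time, as the branch name suggests.
try:
    clf = TabNetRegressor(dropout=0.2)  # hypothetical kwarg from the branch
except TypeError as e:
    print(e)  # __init__() got an unexpected keyword argument 'dropout'

# The other expectation mentioned above was a fit-time argument (also
# hypothetical): clf.fit(X_train, y_train, dropout=0.2)
```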

What is the motivation or use case for adding/changing the behavior?

Dropout can be useful in some situations.

How should this be implemented in your opinion?

I think that it would be awesome to have dropout in the new develop branch.

Are you willing to work on this yourself? Yes, I will keep digging into the code and try to add the dropout layer in the tabnet_network.py file on the develop branch, but it would be useful to know about @Optimox's experience.
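
As a sketch of what that change could look like (generic PyTorch only; the class and attribute names below are stand-ins, not the actual contents of tabnet_network.py):

```python
import torch
import torch.nn as nn

class TabNetLikeBlock(nn.Module):
    """Stand-in for a block inside tabnet_network.py, showing where an
    nn.Dropout layer would typically be inserted."""

    def __init__(self, input_dim, hidden_dim, dropout=0.0):
        super().__init__()
        self.fc = nn.Linear(input_dim, hidden_dim)
        self.dropout = nn.Dropout(p=dropout)  # no-op when dropout=0.0

    def forward(self, x):
        x = torch.relu(self.fc(x))
        return self.dropout(x)  # active in train() mode, identity in eval()
```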

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 9

Top GitHub Comments

2 reactions
csetraynor commented, May 6, 2021

@Optimox thanks for your reply.

  • I realised why the feature/add-dropout branch wasn't working for me! I forgot to modify the imports in tabnet_model to match the name I gave to the folder of this branch. The feature is actually working, so this thread may be closed if you see fit.
  • My goal is more to obtain some uncertainty in my predictions than to improve performance, though improved performance is always a plus.
  • Following your advice, I just added dropout directly on the input features, which makes sense given the aims of the exercise (a sketch of this placement follows below the figure).
  • Now I see some improvement with dropout on the input features (applied after the embeddings in the forward method of TabNet); see the figure below. But it may be a fluke, because this exercise is a simulation rather than a real-world benchmark. I will run real-world benchmarks next week and can keep you folks updated.

[screenshot: results showing the improvement with input-feature dropout]
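
A sketch of the placement described above (stand-in names; the real forward method in pytorch-tabnet is more involved):

```python
import torch.nn as nn

class TabNetWithInputDropout(nn.Module):
    """Wraps an embedding step and the rest of the network, applying
    dropout to the embedded input features. 'embedder' and 'network'
    are placeholders for TabNet's actual components."""

    def __init__(self, embedder, network, dropout=0.1):
        super().__init__()
        self.embedder = embedder
        self.input_dropout = nn.Dropout(p=dropout)
        self.network = network

    def forward(self, x):
        x = self.embedder(x)          # embeddings first, as described above
        x = self.input_dropout(x)     # dropout on the input features
        return self.network(x)
```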

Thanks again!

1 reaction
csetraynor commented, May 30, 2021

Yes, the only difference is the dataset. The dataset where dropout helps comes from a simulation using multivariate normal distributions; the one where it doesn't help is the census example dataset. I say more realistic because, at >70% dropout, I wasn't expecting it to help. Yes, some improvement from 10% to 50% dropout would be nice; it still isn't improving there, but a 10% dropout is good enough for my goal. I will try to use it in the final mapping. Thank you for the suggestion.
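
Since the goal mentioned above is uncertainty rather than accuracy, one standard way to get it from a dropout-equipped model (an assumption about the intended workflow, not something stated in the thread) is Monte Carlo dropout: keep the dropout layers active at inference and aggregate several stochastic forward passes:

```python
import torch
import torch.nn as nn

def mc_dropout_predict(model, x, n_samples=30):
    """Return mean and standard deviation over stochastic forward passes.
    Works for any nn.Module that contains nn.Dropout layers."""
    model.eval()                      # keep batch-norm in inference mode
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.train()                 # re-enable only the dropout layers
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    model.eval()                      # restore full eval mode
    return preds.mean(dim=0), preds.std(dim=0)  # point estimate, spread
```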

Read more comments on GitHub >

