Working of feature/add-dropout
Feature request
The dropout feature is already in development. I wanted to contact @Optimox to discuss whether the feature already works or needs more development.
What is the expected behavior?
I expect dropout to be an argument either to TabNetRegressor/TabNetClassifier or to clf.fit.
However, when I tried the development branch feature/add-dropout, I got the error:
__init__() got an unexpected keyword argument 'dropout'
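For context, this TypeError is what Python raises whenever a keyword argument is passed to an __init__ that does not declare it. The class below is a hypothetical stand-in used only to reproduce the error mechanism; it is not the actual TabNet code:

```python
# Hypothetical stand-in class (NOT the real TabNetRegressor) that only
# accepts n_steps, so passing dropout=... triggers the same TypeError.
class FakeRegressor:
    def __init__(self, n_steps=3):
        self.n_steps = n_steps

try:
    FakeRegressor(dropout=0.2)  # 'dropout' is not a declared parameter
except TypeError as e:
    message = str(e)

print(message)  # mentions the unexpected keyword argument 'dropout'
```

This suggests the branch's estimator classes simply do not (yet) expose a dropout parameter in their signatures, rather than the feature failing deeper in the network code.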
I checked the code, but it would be useful to better understand why it is failing.
What is the motivation or use case for adding/changing the behavior?
Dropout can be useful in some situations.
How should this be implemented in your opinion?
I think that it would be awesome to have dropout in the new develop branch.
Are you willing to work on this yourself?
Yes, I will keep digging into the code or try to add the dropout layer in the tabnet_network.py file within the develop branch, but it would be useful to hear about @Optimox's experience.
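As a reference for what such a layer would do, here is a plain-Python sketch of inverted dropout (the variant PyTorch's nn.Dropout implements): each element is zeroed with probability p during training, and survivors are scaled by 1/(1-p) so the expected value is unchanged. This is an illustrative sketch of the mechanism only, not pytorch-tabnet code:

```python
import random

def inverted_dropout(values, p, training=True, rng=None):
    """Inverted dropout on a list of floats.

    During training, zero each element with probability p and scale the
    survivors by 1/(1-p); at inference time, return the input unchanged.
    """
    if not training or p == 0.0:
        return list(values)
    rng = rng or random.Random()
    keep = 1.0 - p
    return [v / keep if rng.random() < keep else 0.0 for v in values]

# With p=0.5, every surviving value is doubled and the rest are zeroed.
rng = random.Random(0)
out = inverted_dropout([1.0, 2.0, 3.0, 4.0], p=0.5, rng=rng)
```

In the TabNet network this would sit after a feature transformer block and be toggled by the module's training flag, so no extra handling is needed at inference time.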
Issue Analytics
- Created 2 years ago
- Comments: 9
Top GitHub Comments
@Optimox thanks for your reply.
Thanks again!
Yes, the only difference is the dataset. The one where dropout helps comes from a simulation using multivariate normal distributions; the one where it doesn't help is the census example dataset. I call the latter more realistic because, at >70% dropout, I wasn't expecting it to help. Some improvement between 10% and 50% dropout would have been nice; it still isn't improving, but a 10% dropout is good enough for my goal. I will try to use it in the final mapping. Thank you for the suggestion.