
Neural Network Wrappers


This can be an ongoing discussion on what to include, but convenient wrappers for the neural network functionality would be really nice. Things like optimizers to handle the bookkeeping of all your variable updates, layers that register parameters, and so on. Think PyTorch but documented and clean (so not really PyTorch at all 😛). Feel free to add to the list below, or comment on anything. This can serve as a discussion and a board of ideas.

Probably not all of these are critical (for example, who really needs all those optimizers?), but I think they’d be nice to have to call this fully fledged.

We should certainly discuss how adding these may change the design of MyGrad and keep in mind design decisions while implementing these.

Optimizers

These should take the parameters of a model and perform optimization over those parameters with respect to some loss. Learning rate schedulers may be included under here.

  • (Batch) SGD [with momentum, maybe with Nesterov]
  • Adam
  • Adadelta
  • Adagrad
  • (L-)BFGS
  • RMSProp
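None of this is existing MyGrad API — but as a sketch of the bookkeeping such a wrapper would hide, here's a minimal NumPy version of batch SGD with momentum, where `Param` is a hypothetical stand-in for a trainable tensor (data plus accumulated gradient):

```python
import numpy as np

class Param:
    """Hypothetical stand-in for a trainable tensor: data plus accumulated gradient."""
    def __init__(self, data):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)

class SGD:
    """Batch SGD with momentum over a registered list of parameters."""
    def __init__(self, params, lr=0.1, momentum=0.9):
        self.params = list(params)
        self.lr = lr
        self.momentum = momentum
        # one velocity buffer per parameter -- the bookkeeping a wrapper hides
        self.velocities = [np.zeros_like(p.data) for p in self.params]

    def step(self):
        for p, v in zip(self.params, self.velocities):
            v *= self.momentum     # decay the old velocity in place
            v -= self.lr * p.grad  # fold in the current gradient
            p.data += v            # apply the update

    def zero_grad(self):
        for p in self.params:
            p.grad = np.zeros_like(p.data)

# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w = Param(0.0)
opt = SGD([w], lr=0.1, momentum=0.5)
for _ in range(60):
    w.grad = 2 * (w.data - 3.0)
    opt.step()
```

A Nesterov variant and learning-rate schedulers could hang off the same `step`/`zero_grad` interface.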

Convenience Layers

My thought here is to have classes that handle all of the parameter registration necessary for each layer in a network. For example, a Conv2D layer may take K, R, C and create the necessary weight Tensor, then register its parameters with an optimizer for easy updates.

  • ConvNd (N ∈ {1, 2, 3} probably at the least)
  • BatchNormNd
  • Dense layer
  • Any activations that need a dedicated layer (adaptive layers like PReLU)
  • Recurrent layers (plain RNN, LSTM, GRU)
  • Dropout?
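As a sketch of the registration idea (plain NumPy; `Dense` and its `parameters` property are hypothetical names, not existing MyGrad API), a dense layer would build its own weight and bias and expose them for an optimizer to pick up:

```python
import numpy as np

class Dense:
    """Fully connected layer that creates and exposes its own parameters."""
    def __init__(self, in_dim, out_dim, seed=None):
        rng = np.random.default_rng(seed)
        # Glorot-uniform style initialization keeps activation variance stable
        limit = np.sqrt(6.0 / (in_dim + out_dim))
        self.weight = rng.uniform(-limit, limit, size=(in_dim, out_dim))
        self.bias = np.zeros(out_dim)

    @property
    def parameters(self):
        # an optimizer iterates over this to register its updates
        return (self.weight, self.bias)

    def __call__(self, x):
        return x @ self.weight + self.bias

layer = Dense(4, 3, seed=0)
out = layer(np.ones((2, 4)))  # batch of 2, input dimension 4
```

A Conv2D wrapper would follow the same pattern, with K, R, C determining the shape of the weight tensor it creates.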

More Losses

Should be self-explanatory.

  • L1 and L2
  • Negative log-likelihood
  • KL divergence
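For reference, here is a plain-NumPy sketch of the textbook definitions (not existing MyGrad functions):

```python
import numpy as np

def l2_loss(pred, target):
    """Mean squared (L2) loss."""
    return np.mean((pred - target) ** 2)

def nll_loss(log_probs, targets):
    """Mean negative log-likelihood, given per-class log-probabilities of
    shape (batch, classes) and integer class labels of shape (batch,)."""
    return -np.mean(log_probs[np.arange(len(targets)), targets])

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions along the last axis."""
    return np.sum(p * (np.log(p) - np.log(q)), axis=-1)
```

The autograd versions would be written in terms of Tensor operations so gradients flow through them automatically.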

Initializers

Very handy to be able to pass a Tensor to a function that will initialize it according to some method.
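A sketch of what such a function might look like (plain NumPy, hypothetical name; He initialization chosen as the example method) — it fills an existing array in place based on its fan-in:

```python
import numpy as np

def he_normal_(arr, seed=None):
    """Fill `arr` in place with He-normal values: zero mean, std = sqrt(2 / fan_in).

    Suited to layers followed by ReLU, which zeroes out half the variance.
    """
    rng = np.random.default_rng(seed)
    fan_in = arr.shape[0]
    arr[...] = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=arr.shape)
    return arr

w = np.empty((512, 256))
he_normal_(w, seed=0)
```

The trailing underscore (borrowed from PyTorch's convention) signals that the array is modified in place rather than copied.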

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Reactions: 1
  • Comments: 13 (7 by maintainers)

Top GitHub Comments

1 reaction
davidmascharka commented, Mar 26, 2018

For anyone following, we’re (by which I mean Ryan is) keeping MyGrad a pure autograd library. The neural network functionality will be forked off into its own repo with a dependency on MyGrad. This will hopefully keep everything in both code bases a little cleaner. I’ll be heading up the neural network project while Ryan maintains MyGrad (RyGrad). If you’re interested in contributing to the neural network package, let me know and I’ll get you added as a collaborator on the repo. I’ll leave this open for the next few days, then close this as it’s branched off into the new repo.

0 reactions
davidmascharka commented, Apr 10, 2018

I’m going to close this. This can be an ongoing discussion (either in this thread, the other repo, or hangouts/in person/whatever), but there’s no need to keep an issue open. If a merge needs to happen in the future, that can be its own project.
