Types for activation functions used in Sequential?
Is there interest in a pull request adding a few model layers representing activations for use in Sequential models? Many PyTorch examples use Sequential, and this could be helpful for beginners trying to translate PyTorch models to DiffSharp.
I recognize that you may not want this, because it increases the maintenance burden and the --> composition operator already covers many Sequential use cases. Hence the question.
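For context, the --> style reads like this. A minimal sketch, assuming the usual open DiffSharp / open DiffSharp.Model namespaces, that the --> combinator accepts both Model values and plain Tensor -> Tensor functions such as dsharp.relu, and that dsharp.view [-1; 28*28] yields such a function, as in the DiffSharp samples:

open DiffSharp
open DiffSharp.Model

// Activations enter the pipeline as plain functions,
// so no dedicated ReLU model type is needed in this style.
let net =
    dsharp.view [-1; 28*28]   // flatten each 28x28 input to a 784-vector
    --> Linear(28*28, 512)
    --> dsharp.relu
    --> Linear(512, 512)
    --> dsharp.relu
    --> Linear(512, 10)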
For example, the pytorch.org quickstart model definition uses Sequential, and I believe this is the correct translation:
// ReLU as a Model subtype, so it can be used inside Sequential
type ReLU() =
    inherit Model()
    override _.ToString() = sprintf "ReLU()"
    override m.forward(value) =
        dsharp.relu(value)

// Flatten mirroring torch.nn.Flatten, whose startDim defaults to 1
type Flatten(?startDim:int, ?endDim:int) =
    inherit Model()
    let startDim = defaultArg startDim 1
    override _.ToString() = sprintf "Flatten()"
    override m.forward(value) =
        dsharp.flatten(value, startDim, ?endDim=endDim)

// Translation of the PyTorch quickstart NeuralNetwork
type NeuralNetwork() =
    inherit Model()
    let flatten = Flatten()
    let linear_relu_stack =
        Sequential([
            Linear(28*28, 512)
            ReLU()
            Linear(512, 512)
            ReLU()
            Linear(512, 10)
        ])
    do base.addModel([linear_relu_stack])
    // Expose a single Flatten instance rather than allocating one per property access
    member _.flatten = flatten
    override self.forward(x) =
        let x = self.flatten.forward(x)
        let logits = linear_relu_stack.forward(x)
        logits

let model = NeuralNetwork()
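A quick smoke test of the translation might look like this (a sketch; the random tensor stands in for a batch of one 28x28 image, as in the PyTorch quickstart):

let x = dsharp.randn([1; 28; 28])   // a batch containing one fake "image"
let logits = model.forward(x)
printfn "%A" logits.shape           // expect [|1; 10|]: ten logits per input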
Top GitHub Comments
Thank you for the thoughtful response. I am not aware of Torch’s history, so that is helpful.
I prefer the --> syntax of model3, but given how frequently I have seen Sequential in PyTorch examples, I think that your function-to-model wrapper is a worthy addition. It is an elegant solution.
This context may help explain my motivations. I want to be able to tell my students that a line-by-line translation of whatever PyTorch.org has as the quickstart is possible; I think there's value in that, even if it is more common for people to use --> once they're familiar with the library.
Thank you for adding Model(...). This looks fantastic, and the two versions (model4 and let ReLu() = Model(dsharp.relu)) are sufficient to cover my use case. I'll close this issue as addressed. With the addition of Model(...), I no longer feel the need to submit a PR adding Model objects for activation functions. Thanks again for this great library.
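To make the resolution concrete, here is the wrapper style assembled from the snippets quoted above. A sketch, assuming Model(...) wraps any Tensor -> Tensor function, as the comment describes:

// An activation "layer" becomes a one-liner via the function-to-model wrapper
let ReLu() = Model(dsharp.relu)

let stack =
    Sequential([
        Linear(28*28, 512)
        ReLu()
        Linear(512, 512)
        ReLu()
        Linear(512, 10)
    ])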