Using a special type of activation function
Hi,
I am pretty new to neurodiffeq; thank you very much for an excellent library. I want to use a special activation function, namely
f(x) = cos(1.75 x) * exp(-x**2 / 2)
How should I define the class for this activation function, and how do I pass it in this line?
fcnn = FCNN(hidden_units=(50, 50), actv=nn.Tanh)
Thank you in advance.
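One common approach (a sketch, not taken from the maintainers' reply): define the activation as a small `torch.nn.Module` subclass and pass the class itself to `FCNN` via `actv`, exactly as `nn.Tanh` is passed in the line above. The class name `GaussianCosine` is made up for illustration.

```python
import torch
import torch.nn as nn

class GaussianCosine(nn.Module):
    """Hypothetical name for the activation f(x) = cos(1.75 x) * exp(-x**2 / 2)."""
    def forward(self, x):
        return torch.cos(1.75 * x) * torch.exp(-x ** 2 / 2)

# Pass the class (not an instance), mirroring actv=nn.Tanh in the question:
# fcnn = FCNN(hidden_units=(50, 50), actv=GaussianCosine)
```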
Issue Analytics
- State:
- Created 2 years ago
- Comments: 9 (4 by maintainers)
I haven’t heard of the butterfly optimization algorithm. If you really want to try it out, you will probably have to implement your own Optimizer. Here is an article.
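For context, a minimal sketch of what "implementing your own Optimizer" means in PyTorch: subclass `torch.optim.Optimizer` and implement `step()`. The SGD-style update below is only a placeholder; an actual butterfly optimization algorithm would need its own update rule, and the class name `MyOptimizer` is made up.

```python
import torch
from torch.optim import Optimizer

class MyOptimizer(Optimizer):
    """Hypothetical skeleton of a custom optimizer (not a butterfly implementation)."""
    def __init__(self, params, lr=1e-2):
        super().__init__(params, dict(lr=lr))

    @torch.no_grad()
    def step(self, closure=None):
        loss = closure() if closure is not None else None
        for group in self.param_groups:
            for p in group['params']:
                if p.grad is not None:
                    # Placeholder update rule: plain gradient descent.
                    p.add_(p.grad, alpha=-group['lr'])
        return loss
```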
Can you be a little more specific about what you mean by "loss and validation graph"? In general, you can inject behavior, including custom plotting and logging, using the callback feature.
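A hedged sketch of what such a callback might look like, assuming that callbacks are plain callables invoked with the solver after each epoch and that the solver exposes a `metrics_history` dict; both assumptions should be checked against the neurodiffeq version you use. The helper name `make_loss_logger` is made up.

```python
def make_loss_logger(log):
    """Return a callback that records the latest training loss.

    Assumes the solver object passed in has a ``metrics_history`` dict
    with a 'train_loss' list (an assumption about neurodiffeq's API).
    """
    def callback(solver):
        log.append(solver.metrics_history['train_loss'][-1])
    return callback

# Hypothetical usage with a neurodiffeq solver:
# losses = []
# solver.fit(max_epochs=1000, callbacks=[make_loss_logger(losses)])
```

The recorded values can then be plotted with any plotting library after (or during) training.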
exp(0.5) is just a constant. You can use np.exp or math.exp instead of torch.exp, which only works for PyTorch tensors (similar to numpy arrays).
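To illustrate the point: for a plain Python scalar, math.exp and np.exp return the same value directly, while torch.exp only accepts tensors, so a scalar must be wrapped first.

```python
import math
import numpy as np
import torch

c = math.exp(0.5)  # plain Python float, computed once
assert np.isclose(np.exp(0.5), c)
# torch.exp requires a tensor argument; wrap the scalar first:
assert torch.isclose(torch.exp(torch.tensor(0.5)), torch.tensor(c))
```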