Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Using a special type of activation function

See original GitHub issue

Hi,

I am pretty new to neurodiffeq; thank you very much for an excellent library. I want to use a special type of activation function, in particular f(x) = cos(1.75x) * exp(-x**2 / 2). How do I define a class for this activation function, and how do I pass it in the line fcnn = FCNN(hidden_units=(50, 50), actv=nn.Tanh)? Thank you in advance.
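One way to do this (a minimal sketch, assuming FCNN is imported from neurodiffeq.networks and that actv accepts any torch.nn.Module subclass passed as a class, the same way nn.Tanh is passed here):

```python
import torch
import torch.nn as nn
from neurodiffeq.networks import FCNN  # assumed import path; adjust to your neurodiffeq version

class GaussianCosine(nn.Module):
    """Elementwise f(x) = cos(1.75 x) * exp(-x**2 / 2)."""
    def forward(self, x):
        return torch.cos(1.75 * x) * torch.exp(-x ** 2 / 2)

# Pass the class itself (not an instance), just like actv=nn.Tanh
fcnn = FCNN(hidden_units=(50, 50), actv=GaussianCosine)
```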

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 9 (4 by maintainers)

Top GitHub Comments

1 reaction
shuheng-liu commented, Dec 20, 2021

I haven’t heard of the butterfly optimization algorithm. If you really want to try it out, you probably have to implement your own Optimizer. Here is an article.
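As a rough illustration of what implementing your own Optimizer involves, here is a hypothetical skeleton of a torch.optim.Optimizer subclass; the update rule inside step() is a placeholder, not butterfly optimization, which you would substitute with the actual population-based search logic:

```python
import torch
from torch.optim import Optimizer

class ToyPerturbationOptimizer(Optimizer):
    """Skeleton of the Optimizer interface. The update below (random
    perturbation scaled by lr) is a placeholder, NOT butterfly optimization."""
    def __init__(self, params, lr=1e-3):
        defaults = dict(lr=lr)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group["params"]:
                # Replace this line with the real search/update rule
                p.add_(torch.randn_like(p), alpha=group["lr"])
        return loss
```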

Can you be a little bit more specific as to “Loss and validation graph”? In general, you can inject behavior, including custom plotting and logging, using the callback feature.
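For the plotting part, here is a hedged sketch of a custom callback; the ActionCallback import path and the metrics_history attribute are assumptions, so check the callbacks section of the neurodiffeq docs for your version:

```python
import matplotlib.pyplot as plt
from neurodiffeq.callbacks import ActionCallback  # assumed import path

class PlotLossCallback(ActionCallback):
    """Assumes the solver passed in keeps running losses in
    solver.metrics_history['train_loss'] / ['valid_loss']."""
    def __call__(self, solver):
        plt.plot(solver.metrics_history['train_loss'], label='train loss')
        plt.plot(solver.metrics_history['valid_loss'], label='valid loss')
        plt.yscale('log')
        plt.xlabel('epoch')
        plt.legend()
        plt.show()
```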

1 reaction
shuheng-liu commented, Dec 10, 2021

exp(0.5) is just a constant. You can use np.exp or math.exp instead of torch.exp, which only works on PyTorch tensors (which are similar to NumPy arrays).
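A trivial sketch of the distinction:

```python
import math
import numpy as np
import torch

math.exp(0.5)                  # plain Python float: 1.6487...
np.exp(0.5)                    # works on scalars and NumPy arrays
torch.exp(torch.tensor(0.5))   # torch.exp needs a tensor argument
# torch.exp(0.5)               # TypeError: expected a Tensor, not a float
```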

Read more comments on GitHub >

Top Results From Across the Web

Activation Functions in Neural Networks [12 Types & Use Cases]
The primary role of the Activation Function is to transform the summed weighted input from the node into an output value to be...
Read more >
Activation Functions | Fundamentals Of Deep Learning
Popular types of activation functions and when to use them · 1. Binary Step Function · 2. Linear Function · 3. Sigmoid ·...
Read more >
Activation Functions in Neural Networks - Towards Data Science
Activation Functions in Neural Networks · Linear or Identity Activation Function · Non-linear Activation Function · 1. Sigmoid or Logistic Activation Function ·...
Read more >
How to Choose an Activation Function for Deep Learning
The activation function used in hidden layers is typically chosen based on the type of neural network architecture. Modern neural network models ...
Read more >
Activation functions in Neural Networks - GeeksforGeeks
Linear Function · Sigmoid Function · Tanh Function · RELU Function · Softmax Function · Please Login to comment... · Improve your Coding...
Read more >
