Getting learning_phase at the activation functions level
I modified the ReLU activation function to print the value of the learning phase, expecting that it would differ between training (fit) and testing (evaluate, predict). According to the documentation, the value should be 1 during training and 0 during testing.
```python
from keras import backend as K

def relu(x, alpha=0., max_value=None):
    print(K.learning_phase())
    return K.relu(x, alpha=alpha, max_value=max_value)
```
During the training I get:
Tensor("activation_1/keras_learning_phase:0", dtype=bool)
and during testing I don’t get anything at all.
How does this make sense? JM.
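What the printout shows: `K.learning_phase()` returns a symbolic boolean placeholder, not a Python 0 or 1. The Python body of the activation function runs exactly once, while Keras builds the computation graph, which is why the tensor object is printed during the first `fit` call; `evaluate` and `predict` execute the already-built graph and never re-enter the Python function, so nothing is printed. A minimal pure-Python analogy of this trace-once behavior (no Keras required; `Placeholder` and the names below are illustrative, not Keras API):

```python
class Placeholder:
    """Stands in for a symbolic tensor such as keras_learning_phase."""
    def __init__(self, name):
        self.name = name

    def __repr__(self):
        return f"Tensor({self.name!r}, dtype=bool)"

# Record what the activation body observes, and how often it runs.
calls = []

learning_phase = Placeholder("keras_learning_phase:0")

def relu(x):
    # This body runs once, at graph-construction time, so it sees the
    # symbolic placeholder -- never a concrete 0 or 1.
    calls.append(repr(learning_phase))
    return ("relu", x)

graph = relu("inputs")  # "training" build: the body fires here, once

# Later fit/evaluate/predict calls would re-execute `graph`, not the
# Python function, so `calls` never grows past one entry.
print(calls)
```

The same reasoning explains the silence at test time: by then the graph is already built, and only the compiled operations run.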
Issue Analytics
- State:
- Created 6 years ago
- Comments:8
Thanks!
activation._uses_learning_phase was the key 😉
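For context on why that attribute matters: Keras only feeds the learning-phase flag into a layer's output at run time if that output tensor is tagged with `_uses_learning_phase = True`; `K.in_train_phase(train_expr, test_expr)` then selects a branch from the flag. A hedged pure-Python sketch of that selection logic (`in_train_phase` and `noisy_relu` here are stand-ins to show the idea, not the Keras implementation):

```python
def in_train_phase(train_fn, test_fn, training):
    """Mimic K.in_train_phase: pick a branch from the phase flag
    (1/True = training, 0/False = inference)."""
    return train_fn() if training else test_fn()

def noisy_relu(x, training):
    # A train-time branch might scale or add noise; the test-time
    # branch is the plain activation.
    relu = max(x, 0.0)
    return in_train_phase(lambda: relu * 0.9, lambda: relu, training)

print(noisy_relu(2.0, training=1))  # 1.8  (training branch)
print(noisy_relu(2.0, training=0))  # 2.0  (inference branch)
```

In graph-mode Keras both branches exist in the graph simultaneously, and the boolean placeholder printed above is exactly the value that switches between them at execution time.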
@mahnerak Thanks! I noticed that I had edited it incorrectly. The original error reported is:
Thank you for your attention.