
Getting learning_phase at the activation functions level

See original GitHub issue

I modified the ReLU activation function to print the value of the learning phase, expecting it to differ between training (fit) and testing (evaluate, predict). According to the documentation, the value should be 1 during training and 0 during testing.

```python
from keras import backend as K

def relu(x, alpha=0., max_value=None):
    print(K.learning_phase())
    return K.relu(x, alpha=alpha, max_value=max_value)
```

During training I get:

```
Tensor("activation_1/keras_learning_phase:0", dtype=bool)
```

and during testing I don't get anything at all.

How does this make sense? JM.
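For context on the symptom described above: `K.learning_phase()` returns a symbolic placeholder tensor, so a `print()` inside the activation fires once while Keras builds the graph, not once per batch. The following is a minimal pure-Python sketch of that tracing behaviour, using hypothetical stand-in classes rather than Keras internals:

```python
# Hypothetical sketch (plain Python, not Keras internals) of why print()
# inside a symbolic activation fires once, at graph-construction time:
# the Python body only runs while the graph is traced, and later
# fit/evaluate calls replay compiled ops without re-entering Python.

class Placeholder:
    """Stands in for a symbolic tensor such as keras_learning_phase."""
    def __init__(self, name):
        self.name = name

    def __repr__(self):
        return f'Tensor("{self.name}", dtype=bool)'

trace_log = []

def relu_with_print(x, learning_phase):
    trace_log.append(repr(learning_phase))  # runs only while tracing
    return max(x, 0.0)

def build_graph():
    # "Tracing": the activation body runs once with a placeholder,
    # just as it does while Keras builds the model.
    phase = Placeholder("activation_1/keras_learning_phase:0")
    relu_with_print(0.0, phase)

build_graph()
print(trace_log)  # exactly one symbolic entry, recorded at build time
# Subsequent training/evaluation replays the compiled graph and never
# calls relu_with_print again, so nothing else is printed.
```

This matches what the reporter saw: one symbolic `Tensor(...)` line during model construction, and silence afterwards.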

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 8

Top GitHub Comments

1 reaction
jmlipman commented, Aug 14, 2017

Thanks!

`activation._uses_learning_phase` was the key 😉
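A rough sketch of what that flag does, using hypothetical stand-in classes rather than the real Keras `Tensor` type: in graph-mode Keras, a concrete 0/1 learning-phase value is only fed into computations whose output tensor is flagged as depending on it.

```python
# Hypothetical stand-in classes (not the real Keras Tensor type)
# sketching the _uses_learning_phase mechanism: the framework feeds a
# concrete 0/1 phase value only into outputs flagged as depending on it.

class Tensor:
    def __init__(self, fn):
        self.fn = fn                       # how to compute this tensor
        self._uses_learning_phase = False  # flag inspected by the "framework"

def in_train_phase(train_fn, test_fn):
    t = Tensor(lambda phase: train_fn() if phase == 1 else test_fn())
    t._uses_learning_phase = True  # mark: this output needs the phase fed
    return t

def run(tensor, learning_phase):
    # The "framework": only flagged tensors receive the phase value.
    if tensor._uses_learning_phase:
        return tensor.fn(learning_phase)
    return tensor.fn(None)

out = in_train_phase(lambda: "training branch", lambda: "test branch")
print(run(out, 1))  # training branch
print(run(out, 0))  # test branch
```

If the flag is never set, the phase is never fed, which is one reason a custom activation can appear to ignore training vs. testing entirely.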

0 reactions
zhhongzhi commented, Sep 6, 2017

@mahnerak Thanks! I noticed that I had edited it incorrectly. The original error reported is:

```
Using TensorFlow backend.
Traceback (most recent call last):
  File "/mydata/wp/SimpleQAMine/fullSimpleQA/relation_reranking/learn_get_intern_valuse.py", line 60, in <module>
    y = Dense(units=10, activation=custom_activation) (h)
  File "/home/hongzhi/anaconda2/lib/python2.7/site-packages/Keras-2.0.2-py2.7.egg/keras/engine/topology.py", line 578, in __call__
    output = self.call(inputs, **kwargs)
  File "/home/hongzhi/anaconda2/lib/python2.7/site-packages/Keras-2.0.2-py2.7.egg/keras/layers/core.py", line 845, in call
    output = self.activation(output)
  File "/mydata/wp/SimpleQAMine/fullSimpleQA/relation_reranking/learn_get_intern_valuse.py", line 54, in custom_activation
    activation = x + K.in_train_phase(0.0, 1000.0)
  File "/home/hongzhi/anaconda2/lib/python2.7/site-packages/Keras-2.0.2-py2.7.egg/keras/backend/tensorflow_backend.py", line 2451, in in_train_phase
    x = switch(training, x, alt)
  File "/home/hongzhi/anaconda2/lib/python2.7/site-packages/Keras-2.0.2-py2.7.egg/keras/backend/tensorflow_backend.py", line 2410, in switch
    else_expression_fn)
  File "/home/hongzhi/anaconda2/lib/python2.7/site-packages/tensorflow/python/ops/control_flow_ops.py", line 1741, in cond
    orig_res, res_t = context_t.BuildCondBranch(fn1)
  File "/home/hongzhi/anaconda2/lib/python2.7/site-packages/tensorflow/python/ops/control_flow_ops.py", line 1668, in BuildCondBranch
    real_v = self._ProcessOutputTensor(v)
  File "/home/hongzhi/anaconda2/lib/python2.7/site-packages/tensorflow/python/ops/control_flow_ops.py", line 1626, in _ProcessOutputTensor
    if val.name not in self._values:
AttributeError: 'float' object has no attribute 'name'
```
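The `AttributeError` above comes from passing raw Python floats to `K.in_train_phase(0.0, 1000.0)`: the underlying cond/switch machinery expects both branches to be graph tensors (objects with a `.name`), so wrapping the constants first (e.g. `K.constant(0.0)`) should fix it. A small pure-Python sketch of the failure mode, with illustrative stand-in names rather than real TensorFlow internals:

```python
# Sketch of the failure: the cond/switch machinery assumes both branches
# are tensors (objects with a .name attribute), so raw Python floats
# break it. Names below are illustrative, not TensorFlow internals.

class Tensor:
    def __init__(self, value, name):
        self.value = value
        self.name = name

def as_tensor(x):
    """Wrap a plain Python number into a graph tensor (like K.constant)."""
    return x if isinstance(x, Tensor) else Tensor(x, f"Const_{x}:0")

def switch(condition, then_t, else_t):
    # Mimics the internal check that raised the AttributeError: it reads
    # .name on each branch's output.
    for branch in (then_t, else_t):
        branch.name  # raw float -> AttributeError: no attribute 'name'
    return then_t if condition else else_t

# Failing call, analogous to K.in_train_phase(0.0, 1000.0):
try:
    switch(True, 0.0, 1000.0)
except AttributeError as e:
    print("reproduced:", e)

# Fix: wrap the constants first. In Keras this would be something like
# K.in_train_phase(K.constant(0.0), K.constant(1000.0)).
result = switch(True, as_tensor(0.0), as_tensor(1000.0))
print(result.value)  # 0.0
```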

Thank you for your attention.


