Activation function not working properly

See original GitHub issue
import tensorflow as tf
import tf_encrypted as tfe
import numpy as np

val = 6

x = tf.constant(value=val, dtype=tf.float32)

def asdf():
    global x
    return x

def qwer():
    return tf.random_normal(shape=(1, 5), dtype=tf.float32)

def fdsa(x):
    return tf.print(x)

with tf.Session() as sess:
    z1 = sess.run(x)

tfe_x = tfe.define_private_input("a", asdf)
W = tfe.define_private_input("a", qwer)
r = tfe_x * W
relu_r = tfe.relu(r)
op = tfe.define_output("a", r, fdsa)
relu_op = tfe.define_output("a", relu_r, fdsa)

with tfe.Session() as sess:
    sess.run(tfe.global_variables_initializer())
    sess.run(op)
    sess.run(relu_op)

I wrote my code like this to check that relu works correctly.

[[6.2624599908550529 -4.1710105166895293 -0.20941929583904892 10.096936442615455 8.4179240969364422]]
[[-0.15759792714525225 4.9518366102728244 -0.032464563328760861 0.425849718030788 -2.7239750038103949]]

and this is the output I get.

The result for r is roughly what I expected, but the result for relu_r is not. I would like to know what I am missing in my code.
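For reference, an exact relu would simply zero out the negative entries of the r matrix shown above. A quick plain-numpy check (outside tf-encrypted; numbers copied from the output and truncated for readability):

import numpy as np

# First matrix from the output above (r), truncated to 6 decimal places.
r = np.array([[6.26246, -4.171011, -0.209419, 10.096936, 8.417924]])

# An exact relu keeps the positive entries and maps the negative ones to 0,
# unlike the relu_r values actually printed above.
print(np.maximum(r, 0.0))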

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

1 reaction
jvmncs commented, May 22, 2019

This is the same problem as with the approximate relu – for sigmoid there is no exact protocol, so we approximate it with polynomials. The polynomial approximation gets worse outside of a thin interval (something like -4 to 4), so the values start to explode outside of the normal range when you get outside of that interval.

If you switch val to something like 1 or -1 you should see better results.
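To illustrate the general point (using relu for concreteness), here is a rough sketch in plain numpy; it is only a stand-in for whatever polynomial tf-encrypted actually uses internally. It fits a low-degree polynomial to relu on [-4, 4] and then evaluates it outside that interval.

import numpy as np

# Least-squares degree-6 fit to relu on [-4, 4], as a stand-in for the real
# approximation used by the protocol.
xs = np.linspace(-4.0, 4.0, 401)
coeffs = np.polynomial.polynomial.polyfit(xs, np.maximum(xs, 0.0), deg=6)

for v in (1.0, 4.0, 6.0, 10.0):
    approx = np.polynomial.polynomial.polyval(v, coeffs)
    print(f"x={v:5.1f}  exact relu={max(v, 0.0):6.2f}  polynomial={approx:9.2f}")

# Inside [-4, 4] the two columns roughly agree; at 6 and 10 the polynomial
# drifts away from the true relu, which matches the blown-up values above.

Switching val to something like 1 or -1, as suggested, keeps the inputs inside the well-approximated interval, which is why the results look better there.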

1 reaction
mortendahl commented, May 21, 2019

Hi @ByungCheolShin, thanks a lot for giving it a try!

The problem is that by default we’re using an approximation to ReLU for better performance. Simply add tfe.set_protocol(tfe.protocol.SecureNN()) directly after your imports and you should be good.

Note that you are computing r and relu_r on two different values since every sess.run will re-run tf.random_normal; if this is not what you intended then simply use op = tfe.define_output("a", [r, relu_r], fdsa) instead of the two define_output you have now (changing the signature of fdsa to match).
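Putting both suggestions together, a sketch of the adjusted script could look like this (untested; it assumes the same TF 1.x / tf-encrypted API as the snippet above, and the helper names are just illustrative renames of asdf, qwer, and fdsa):

import tensorflow as tf
import tf_encrypted as tfe

# Use the exact SecureNN protocol instead of the default relu approximation.
tfe.set_protocol(tfe.protocol.SecureNN())

x = tf.constant(value=6, dtype=tf.float32)

def provide_x():
    return x

def provide_w():
    return tf.random_normal(shape=(1, 5), dtype=tf.float32)

def print_both(r, relu_r):
    # Signature matches the two values passed to define_output below.
    return tf.print(r, relu_r)

tfe_x = tfe.define_private_input("a", provide_x)
W = tfe.define_private_input("a", provide_w)
r = tfe_x * W
relu_r = tfe.relu(r)

# A single define_output over both values, so r and relu_r come from the same
# sample of W in one sess.run.
op = tfe.define_output("a", [r, relu_r], print_both)

with tfe.Session() as sess:
    sess.run(tfe.global_variables_initializer())
    sess.run(op)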

Read more comments on GitHub >

Top Results From Across the Web

  • My Neural Network isn't working! What should I do?
  • neural networks - Why is step function not used in activation ...
  • Everything you need to know about “Activation Functions” in ...
  • Keras custom activation function (not training) - Stack Overflow
  • Activation Functions in Neural Networks [12 Types & Use Cases]
