
Deprecation: gelu activation has been migrated to core TensorFlow, and will be deprecated in Addons 0.13.

See original GitHub issue

While working on #9078 I encountered a warning while training from the default project:

DeprecationWarning('gelu activation has been migrated to core TensorFlow, and will be deprecated in Addons 0.13.')

For some reason this does not show when running through the CLI; it may be suppressed there. However, this should be fixed, as it is also blocking the tests where I assert programmatically that no warnings are raised.
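
For reference, a minimal sketch of such a programmatic no-warnings assertion; train_default_project() is a hypothetical placeholder for the actual training entry point:

```python
import warnings

def test_training_raises_no_deprecation_warnings():
    # Record every warning emitted during training, overriding any
    # filters that might otherwise hide DeprecationWarning.
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        train_default_project()  # hypothetical stand-in for the real call
    deprecations = [w for w in caught if issubclass(w.category, DeprecationWarning)]
    assert not deprecations, [str(w.message) for w in deprecations]
```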

Definition of Done

  • Deprecation warning no longer occurs while training the default project

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 10 (10 by maintainers)

Top GitHub Comments

1 reaction
joejuzl commented, Oct 15, 2021

Feel free to close this if/when it’s been fixed by TF 2.6

1 reaction
samsucik commented, Jul 23, 2021

@alopez yes, but with the entire TF 2.5 thing potentially taking long to be closed off, we might tick this one off separately. Afaik it’s just about replacing tfa.activations.gelu with tf.nn.gelu in the code 🙂 I think it might be an effort:research/0.5 😄
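
For reference, a minimal sketch of the swap described above. Note that tfa.activations.gelu defaults to approximate=True while tf.nn.gelu defaults to approximate=False, so the flag is passed explicitly here to keep behaviour identical:

```python
import tensorflow as tf
import tensorflow_addons as tfa  # only needed for the old call

x = tf.constant([-1.0, 0.0, 1.0])

# Old: triggers the Addons deprecation warning
y_old = tfa.activations.gelu(x)  # approximate=True by default

# New: core TensorFlow (available since TF 2.4); pass approximate=True
# explicitly to match the Addons default
y_new = tf.nn.gelu(x, approximate=True)
```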

Read more comments on GitHub >

Top Results From Across the Web

tfa.activations.gelu | TensorFlow Addons
tfa.activations.gelu(x: tfa.types.TensorLike, approximate: bool = True) -> tf.Tensor. Computes the Gaussian error linear (GELU) activation.
Read more >
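
For context, the approximate flag in the signature above toggles between the exact GELU, x·Φ(x), and its tanh approximation; a minimal scalar sketch (not the library implementation):

```python
import math

def gelu_exact(x: float) -> float:
    # Exact GELU: x * Phi(x), with Phi the standard normal CDF
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    # tanh approximation, i.e. what approximate=True computes
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

print(gelu_exact(1.0))  # ~0.8413
print(gelu_tanh(1.0))   # ~0.8412
```
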
tensorflow.nn.gelu Example - Program Talk
Learn how to use the Python API tensorflow.nn.gelu. ... "gelu activation has been migrated to core TensorFlow, " "and will be deprecated in Addons...
Read more >
NEWS.md - apache/mxnet - Sourcegraph
MXNet Extensions: custom operators, partitioning, and graph passes ... NumPy has long been established as the standard math library in Python, ...
Read more >
Accelerating ReLu and GeLu Activation Functions, and ...
GeLU for INT8 I/O, INT32 Tensor Core compute kernels. Support for Batched Sparse GEMM: Single sparse matrix / Multiple dense matrices (Broadcast) ...
Read more >
