Deprecation: gelu activation has been migrated to core TensorFlow, and will be deprecated in Addons 0.13.
While working on #9078 I encountered a warning while training from the default project:
```
DeprecationWarning('gelu activation has been migrated to core TensorFlow, and will be deprecated in Addons 0.13.')
```
For some reason this does not show when running it through the CLI; it may be suppressed there. Regardless, this should be fixed, and it is also blocking the tests where I assert programmatically that no warnings occur.
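As an aside, this is a minimal sketch of the kind of programmatic no-warnings assertion meant here; `train_default_project` is a hypothetical stand-in, not the actual training call from #9078:

```python
import warnings

def train_default_project():
    # Hypothetical stand-in for training the default project.
    pass

def test_training_emits_no_warnings():
    # Record every warning raised during training and fail if any occurred,
    # including the Addons gelu DeprecationWarning.
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        train_default_project()
    assert not caught, f"unexpected warnings: {[str(w.message) for w in caught]}"
```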
**Definition of Done**
- Deprecation warning no longer occurs while training default project
Feel free to close this if/when it’s been fixed by TF 2.6
@alopez yes, but with the entire TF 2.5 thing potentially taking long to be closed off, we might tick this one off separately. Afaik it's just about replacing `tfa.activations.gelu` with `tf.nn.gelu` in the code 🙂 I think it might be an `effort:research/0.5` 😄
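For illustration, a minimal sketch of that swap (the input tensor is made up; note that `tfa.activations.gelu` defaults to `approximate=True` while `tf.nn.gelu` defaults to `approximate=False`, so the flag is passed explicitly to keep the numerics identical):

```python
import tensorflow as tf

# Before (emits the Addons deprecation warning):
#   import tensorflow_addons as tfa
#   out = tfa.activations.gelu(x)  # approximate=True by default

# After: core TensorFlow implementation, matching the old default explicitly.
x = tf.constant([-1.0, 0.0, 1.0])
out = tf.nn.gelu(x, approximate=True)
```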