optimizer.applyGradients missing from API documentation
See original GitHub issue
TensorFlow.js version
1.7
Browser version
Chrome 80
Describe the problem or feature request
I’m trying to figure out how to manually apply gradients and adjust weights for a GAN that I am building. I noticed that in the cart-pole example there is a call to 'optimizer.applyGradients':
// Add the scaled gradients to the weights of the policy network. This
// step makes the policy network more likely to make choices that lead
// to long-lasting games in the future (i.e., the crux of this RL
// algorithm).
optimizer.applyGradients(
    scaleAndAverageGradients(allGradients, normalizedRewards));
Here is a link to the example above: https://github.com/tensorflow/tfjs-examples/blob/master/cart-pole/index.js#L170
This sounds like what I need to do for my GAN, but I cannot find any reference to the applyGradients function in the API documentation. Is there a reason why this is not documented?
I’m looking at the 1.7 docs here: https://js.tensorflow.org/api/1.7.0/
Code to reproduce the bug / link to feature request
N/A
Issue Analytics
- State:
- Created 4 years ago
- Comments: 6 (2 by maintainers)
Top Results From Across the Web
tf.keras.optimizers.Optimizer | TensorFlow v2.11.0
This optimizer class is tf.distribute.Strategy aware, which means it automatically sums gradients across all replicas. To aggregate gradients yourself, call ...

Optimizers - Keras
Core Optimizer API. These methods and attributes are common to all Keras optimizers. [source]. apply_gradients method.

Summary - Flax - Read the Docs
This FLIP proposes to replace our current flax.optim API (referred to as previous API in this document) with Optax, DeepMind's optimizer library.

python - Tensorflow GradientTape "Gradients does not exist ...
If missing gradients are expected, this warning can be suppressed by this workaround: optimizer.apply_gradients( (grad, var) for (grad, ...

modulus.hydra.optimizer - NVIDIA Documentation Center
[docs]@dataclass class OptimizerConf: _target_ = MISSING _params_: Any = field( default_factory=lambda: { "compute_gradients": "adam_compute_gradients", ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Thanks for working on this!
The related PR has been merged, thank you @freeman-g @dhirensr