
optimizer.applyGradients missing from API documentation

See original GitHub issue

TensorFlow.js version

1.7

Browser version

Chrome 80

Describe the problem or feature request

I’m trying to figure out how to manually apply gradients and adjust weights for a GAN that I am building. I noticed that in the cart-pole example there is a call to optimizer.applyGradients:

      // Add the scaled gradients to the weights of the policy network. This
      // step makes the policy network more likely to make choices that lead
      // to long-lasting games in the future (i.e., the crux of this RL
      // algorithm).
      optimizer.applyGradients(
          scaleAndAverageGradients(allGradients, normalizedRewards));

Here is a link to the example above: https://github.com/tensorflow/tfjs-examples/blob/master/cart-pole/index.js#L170

This sounds like what I need to do for my GAN, but I cannot find any reference to the applyGradients function in the API documentation. Is there a reason why this is not documented?

I’m looking at the 1.7 docs here: https://js.tensorflow.org/api/1.7.0/

Code to reproduce the bug / link to feature request

N/A

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 6 (2 by maintainers)

Top GitHub Comments

2 reactions
freeman-g commented, Mar 18, 2020

Thanks for working on this!

0 reactions
rthadur commented, Jul 17, 2020

Related PR has been merged, thank you @freeman-g @dhirensr

Read more comments on GitHub >

Top Results From Across the Web

tf.keras.optimizers.Optimizer | TensorFlow v2.11.0
This optimizer class is tf.distribute.Strategy aware, which means it automatically sums gradients across all replicas. To aggregate gradients yourself, call ...

Optimizers - Keras
Core Optimizer API. These methods and attributes are common to all Keras optimizers. [source]. apply_gradients method.

Summary - Flax - Read the Docs
This FLIP proposes to replace our current flax.optim API (referred to as previous API in this document) with Optax, DeepMind's optimizer library.

python - Tensorflow GradientTape "Gradients does not exist ...
If missing gradients are expected, this warning can be suppressed by this workaround: optimizer.apply_gradients( (grad, var) for (grad, ...

modulus.hydra.optimizer - NVIDIA Documentation Center
[docs]@dataclass class OptimizerConf: _target_ = MISSING _params_: Any = field( default_factory=lambda: { "compute_gradients": "adam_compute_gradients", ...
