
Improve customizability of Bop

See original GitHub issue

Feature motivation

Currently, the Bop optimizer updates the weights of all quantized layers: https://github.com/larq/larq/blob/b32830b4cca4a69ac1daf1d176239971112620e4/larq/optimizers_v2.py#L83-L84

This can be problematic if one wants to use a quantized layer with no kernel_quantizer or a quantizer with higher precision.
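For example (a sketch, assuming larq’s QuantDense layer and its default quant_dense layer names): in the model below only the second layer has a binary kernel, but since Bop’s current check only looks for quant_ and /kernel in the variable name, the full-precision kernel of the first layer would also receive binary weight updates.

    import tensorflow as tf
    import larq as lq

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        # Quantized inputs but a full-precision kernel; its variable is still
        # named "quant_dense.../kernel", so Bop would pick it up anyway.
        lq.layers.QuantDense(256, input_quantizer="ste_sign",
                             kernel_quantizer=None),
        # Truly binary kernel; this is the one Bop should train.
        lq.layers.QuantDense(256, input_quantizer="ste_sign",
                             kernel_quantizer="ste_sign",
                             kernel_constraint="weight_clip"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])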

Feature description

It would be good to have more fine-grained control over which layers are trained with Bop and which are trained at a different precision.

Feature implementation

One possible implementation would be to add a fake lq.quantizers.bop function that doesn’t change the forward pass, but marks the kernel so that Bop handles its weight updates. This could be achieved by using a specific name scope or by adding an attribute that Bop recognises. Another possibility would be to explicitly pass a list of layers or variables to Bop. Can you think of a better way to handle this?
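As a sketch of the first idea (everything here is hypothetical, not an existing larq API): the marker “quantizer” would be an identity function that only tags the kernel it is applied to, and Bop would check for that tag instead of parsing variable names.

    import tensorflow as tf

    # Hypothetical marker quantizer: identity in the forward pass, but tags the
    # kernel so Bop can recognise it. Neither `bop` nor `_bop_target` exists in
    # larq today; they are placeholders for the proposed mechanism.
    def bop(x):
        if isinstance(x, tf.Variable):
            x._bop_target = True  # the attribute Bop would look for
        return x

A layer would then be built with kernel_quantizer=bop, and Bop could use getattr(var, "_bop_target", False) in place of its current name check.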

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 8 (8 by maintainers)

Top GitHub Comments

1 reaction
leonoverweel commented, Nov 12, 2019

Reopening for now because we haven’t addressed the is_binary() part yet.

1 reaction
leonoverweel commented, Oct 31, 2019

@lgeiger and I chatted about this earlier today. Outcome of our discussion:

Optimizer API

We have the following three choices:

  1. Current setup: Bop has an optimizer parameter to which we pass a real-valued optimizer. Bop takes care of whether it or the real-valued optimizer trains a given variable.
  2. Subclassing keras.models.Model so that model.compile can take more than one optimizer. Forking the compile or fit functions would then take care of which optimizer trains which weights.
  3. Creating an OptimizerGroup that subclasses tf.keras.optimizers.Optimizer, holds the binary and real-valued optimizers as attributes, and is passed as the optimizer to model.compile. The group then takes care of which optimizer trains which weights.

We decided to go with option 3, for several reasons. One is separation of concerns: the OptimizerGroup will be able to take care of the complexities of selecting which optimizer trains which weights, so that each optimizer just has to do “optimizer stuff.” It will also be able to take care of things like the following, which Bop currently needs to do:

    def __getattr__(self, name):
        if name == "lr":
            return self.fp_optimizer.lr
        return super().__getattr__(name)

Another reason is that, for now, option 2 looks like too much work: model.compile and model.fit do a lot of different things, and updating and maintaining them to handle multiple optimizers would be a significant effort.
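To make the idea concrete, here is a rough sketch of what such an OptimizerGroup could look like. This is illustrative only, not the eventual larq implementation: the constructor arguments and the is_bop_var predicate are made up for this example.

    import tensorflow as tf

    class OptimizerGroup(tf.keras.optimizers.Optimizer):
        """Dispatches each trainable variable to the binary or the fp optimizer."""

        def __init__(self, bop_optimizer, fp_optimizer, is_bop_var,
                     name="OptimizerGroup", **kwargs):
            super().__init__(name=name, **kwargs)
            self.bop_optimizer = bop_optimizer
            self.fp_optimizer = fp_optimizer
            self.is_bop_var = is_bop_var  # callable: tf.Variable -> bool

        def apply_gradients(self, grads_and_vars, name=None, **kwargs):
            # Split the (gradient, variable) pairs between the two optimizers.
            bop_pairs, fp_pairs = [], []
            for grad, var in grads_and_vars:
                (bop_pairs if self.is_bop_var(var) else fp_pairs).append((grad, var))
            ops = []
            if bop_pairs:
                ops.append(self.bop_optimizer.apply_gradients(bop_pairs))
            if fp_pairs:
                ops.append(self.fp_optimizer.apply_gradients(fp_pairs))
            return tf.group(*ops)

        def __getattr__(self, item):
            # Let group.lr resolve to the real-valued optimizer's learning rate,
            # replacing the workaround that Bop currently carries (quoted above).
            if item == "lr":
                return self.fp_optimizer.lr
            raise AttributeError(item)

For what it’s worth, later larq releases ship essentially this dispatching idea as lq.optimizers.CaseOptimizer.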

Binary check

To make is_binary more robust, we’re going to try setting an _is_binary (or similar) attribute on the variables of the layers that need to be optimized by Bop. We’ll replace calls to this:

    @staticmethod
    def is_binary(var):
        return "/kernel" in var.name and "quant_" in var.name

With just a hasattr check on the variable (e.g. hasattr(var, "_is_binary")). Setting the attribute explicitly should be more robust than checking the generated names of layers.
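A minimal sketch of that change, assuming the attribute ends up being spelled _is_binary and is set explicitly on the kernels Bop should train (both names are placeholders until the implementation lands):

    import tensorflow as tf

    def mark_binary(var: tf.Variable) -> tf.Variable:
        # Set explicitly on kernel variables that Bop should update.
        var._is_binary = True
        return var

    def is_binary(var: tf.Variable) -> bool:
        # Replaces the name-based heuristic quoted above with an explicit marker.
        return hasattr(var, "_is_binary")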


I’ll begin by implementing the OptimizerGroup and then move on to the is_binary check.
