Constraint on subset of weight matrix

See original GitHub issue

I’ve got an MLP with an input layer connecting into a Dense layer. I currently have 3 inputs, the last of which must have only non-negative weights, but for the other 2 covariates I don’t want any restrictions.

Currently, my Dense layer is defined as:

from keras.layers import Dense
from keras.constraints import non_neg
from keras.initializers import RandomUniform
Dense(2, activation='relu',
      kernel_constraint=non_neg(),
      kernel_initializer=RandomUniform(minval=0, maxval=2))

However, I’d ideally only apply this initialiser and constraint to the last input, i.e. the last row of the weight matrix. Is this possible?

Issue Analytics

  • State: closed
  • Created 6 years ago
  • Reactions: 1
  • Comments: 5

Top GitHub Comments

1 reaction
stulacy commented, Jul 28, 2017

I had to separate the row of weights that I’m interested in and concatenate it with the rest of the weight matrix, rather than modify it in place.

from keras import backend as K
from keras.constraints import Constraint

class NonNegLast(Constraint):
    def __call__(self, w):
        # Zero out any negative entries in the last row of the kernel.
        last_row = w[-1, :] * K.cast(K.greater_equal(w[-1, :], 0.), K.floatx())
        last_row = K.expand_dims(last_row, axis=0)
        # Reattach the constrained last row to the untouched remaining rows.
        full_w = K.concatenate([w[:-1, :], last_row], axis=0)
        return full_w
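
For reference, a minimal sketch of how this custom constraint could be plugged back into the layer from the question (the initializer is unchanged from the original snippet and still applies to every row; it is shown only for context):

from keras.layers import Dense
from keras.initializers import RandomUniform

# Only the last input row of the kernel is forced to be non-negative.
Dense(2, activation='relu',
      kernel_constraint=NonNegLast(),
      kernel_initializer=RandomUniform(minval=0, maxval=2))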

0 reactions
shaifugpt commented, Jul 2, 2018

@raghakot How can we define such a custom constraint in a Bidirectional wrapper? We need two separate custom kernel constraints: one for the forward layer and one for the backward layer.
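
No reply was posted for this follow-up, but one possible approach in more recent Keras/TensorFlow versions (a sketch based on the tf.keras API, not from this thread) is to build the forward and backward layers explicitly and pass the backward one to Bidirectional via its backward_layer argument, so each direction carries its own kernel constraint. NonNegLast is the custom constraint defined above (under tf.keras it would subclass tf.keras.constraints.Constraint):

import tensorflow as tf

# Each direction gets its own constraint; the backward layer must set go_backwards=True.
forward = tf.keras.layers.LSTM(32, return_sequences=True,
                               kernel_constraint=NonNegLast())
backward = tf.keras.layers.LSTM(32, return_sequences=True, go_backwards=True,
                                kernel_constraint=tf.keras.constraints.NonNeg())
bidir = tf.keras.layers.Bidirectional(forward, backward_layer=backward)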

Read more comments on GitHub >

Top Results From Across the Web

Combining Subset Selection and Parameter Constraints in ...
This paper takes a different approach, by considering only a subset of the parameters to be in error. The critical decision is then...
Read more >
Better Training using Weight-Constrained Stochastic ... - arXiv
We provide a general approach to efficiently in- corporate constraints into a stochastic gradient. Langevin framework, allowing enhanced explo-.
Read more >
Subset of matrix rows with half of column sums
If we put all of the constraints in a big matrix B, it will be (1+2m+n)×m. Now I claim that the rank of...
Read more >
Mean-Variance Optimization in Practice: Subset Resampling ...
Gillen shows that combining the subset mean-variance efficient portfolios by averaging their weights is an ex-ante optimal weighting scheme.
Read more >
A new algorithm for optimal 2-constraint satisfaction and its ...
When the constraints have arbitrary weights, there is a (1 + )-approximation ... when k = 3) or matrix multiplication over GF (2)...
Read more >
