Constraint on subset of weight matrix
I’ve got an MLP with an input layer connecting into a Dense layer. I currently have 3 inputs, the last of which must only have non-negative weights, but for the other 2 covariates I don’t want any restrictions.
Currently, my Dense layer is defined as:

```python
from keras.layers import Dense
from keras.constraints import non_neg
from keras.initializers import RandomUniform

Dense(2, activation='relu',
      kernel_constraint=non_neg(),
      kernel_initializer=RandomUniform(minval=0, maxval=2))
```
However, I’d ideally only apply this initialiser and constraint to the last input, i.e. the last row of the weight matrix. Is this possible?
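One way to get this behaviour, assuming the Keras functional API is an option (this is a sketch, not from the original thread), is to split the input: route the first two covariates through an unconstrained Dense layer and the last covariate through a constrained one, then sum the two partial outputs before the activation. Since W_free·x[:2] + b + W_pos·x[2:] is exactly one Dense(2) layer over all three inputs, this reproduces the original layer:

```python
from keras.layers import Input, Lambda, Dense, Add, Activation
from keras.models import Model
from keras.constraints import non_neg
from keras.initializers import RandomUniform

inputs = Input(shape=(3,))

# Slice the covariates into an unconstrained part and a constrained part.
free_part = Lambda(lambda x: x[:, :2])(inputs)   # first two covariates
pos_part = Lambda(lambda x: x[:, 2:])(inputs)    # last covariate

free_out = Dense(2)(free_part)                   # unconstrained weights (and bias)
pos_out = Dense(2, use_bias=False,
                kernel_constraint=non_neg(),
                kernel_initializer=RandomUniform(minval=0, maxval=2))(pos_part)

# Summing the partial products and then applying the activation
# recovers a single Dense(2, activation='relu') over all three inputs.
hidden = Activation('relu')(Add()([free_out, pos_out]))
model = Model(inputs, hidden)
```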
I had to separate the row of weights that I’m interested in and concatenate it with the rest of the weight matrix, rather than modify it in place.
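A minimal sketch of that slice-and-concatenate idea as a custom constraint (the class name `LastRowNonNeg` is mine, not from the thread):

```python
from keras import backend as K
from keras.constraints import Constraint

class LastRowNonNeg(Constraint):
    """Apply non-negativity to the last row of the kernel only.

    The kernel tensor can't be modified in place, so the last row is
    sliced off, clipped to >= 0, and concatenated back onto the rest.
    """
    def __call__(self, w):
        unconstrained = w[:-1, :]          # rows for the unrestricted covariates
        non_negative = K.relu(w[-1:, :])   # last row forced to be non-negative
        return K.concatenate([unconstrained, non_negative], axis=0)
```

This can then be passed to the original layer as `kernel_constraint=LastRowNonNeg()`.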
@raghakot How can we define such a custom constraint in a Bidirectional wrapper? We’d need two separate custom kernel constraints: one for the forward layer and another for the backward layer.
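For what it’s worth, later Keras/tf.keras releases (newer than this issue) let you pass an explicit `backward_layer` to the Bidirectional wrapper, so each direction can carry its own constraint. A minimal sketch, assuming tf.keras and using the built-in `non_neg` as a stand-in for the two custom constraints:

```python
from tensorflow.keras.layers import LSTM, Bidirectional
from tensorflow.keras.constraints import non_neg

# Each direction gets its own layer instance, so each can carry its own
# kernel_constraint. The backward layer must set go_backwards=True.
forward = LSTM(32, return_sequences=True,
               kernel_constraint=non_neg())    # forward-direction constraint
backward = LSTM(32, return_sequences=True, go_backwards=True,
                kernel_constraint=non_neg())   # swap in the backward constraint here
bidi = Bidirectional(forward, backward_layer=backward)
```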