Cost function that isn't just y_pred, y_true?
I have a cost function that, in addition to using the overall network output, needs to multiply it by another function of the network weights (specifically, the partial derivative of the output with respect to one of the inputs). Custom cost functions are parameterised as f(y_true, y_pred), so they cannot be used to provide this second function of the weights.
I’ve seen a similar issue where @shamidreza states that they had to use Theano for this functionality.
Is it still the case that this is the best option? I’ve only used Keras in R, so I have no experience with either TensorFlow or Theano; would either be suitable from R?
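For reference, the two-argument shape of a Keras custom loss can be sketched as follows; the loss body here is just an illustrative mean squared error, not the cost function from the question:

```python
import tensorflow as tf

# The standard Keras custom-loss signature: only y_true and y_pred
# are passed in, so there is no slot for the model's inputs or weights.
def my_loss(y_true, y_pred):
    return tf.reduce_mean(tf.square(y_true - y_pred))

# Usable as model.compile(loss=my_loss, ...), but a term such as
# d y_pred / d x cannot be formed from these two arguments alone.
```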
Issue Analytics
- State:
- Created 6 years ago
- Reactions: 2
- Comments: 5 (2 by maintainers)
Top Results From Across the Web
- How to define custom cost function that depends on input ... — use two models: an inner model (the original model that predicts the desired outputs) and an outer model that takes y_true data as inputs and takes features as ...
- How to Create a Custom Loss Function | Keras, by Shiva Verma — the loss function should take only 2 arguments, the target value (y_true) and predicted value (y_pred).
- A Guide To Logistic Regression With Tensorflow 2.0 | Built In — the cost function is the element that deviates the path from linear to logistic; in linear regression, the output is a continuously valued ...
- Introduction to Neural Networks in TensorFlow 2 — just like numpy, TensorFlow supports broadcasting; e.g. def mse(y_true, y_pred), which TensorFlow has built in under tf.keras.losses.
- Programming a neural network from scratch — Ritchie Vink — the second layer seems to repeat the same math operation, only with ... functions can thus be replaced by one linear function and ...
It sounds like this example implements what you need. It’s not ideal in my opinion; it feels a bit like a workaround, but it appears to work.
I’m going to close this issue, as the custom Layer method suggested by @hgaiser works for my use case. Essentially, rather than adding a loss function explicitly, you create a custom layer that calculates the loss. When using the network for prediction, you create a new model that uses an earlier layer as the output of interest. The K.gradients function also provided the differentiation that I required.
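A minimal sketch of the same idea in modern tf.keras, using tf.GradientTape in place of the TF1-era K.gradients; the architecture, shapes, and loss form here are illustrative assumptions, not the exact code from the issue:

```python
import tensorflow as tf

# Illustrative network; the layer sizes and input shape are assumptions.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(1,)),
    tf.keras.layers.Dense(8, activation="tanh"),
    tf.keras.layers.Dense(1),
])

def loss_with_input_gradient(model, x, y_true):
    # Record operations on x so the output can be differentiated w.r.t. it.
    with tf.GradientTape() as tape:
        tape.watch(x)
        y_pred = model(x)
    dy_dx = tape.gradient(y_pred, x)  # partial derivative of output w.r.t. input
    # Hypothetical loss that multiplies the prediction by dy/dx:
    # exactly the kind of term a plain f(y_true, y_pred) loss cannot express.
    return tf.reduce_mean(tf.square(y_true - y_pred * dy_dx))

x = tf.constant([[0.5], [1.0]])
y = tf.constant([[0.2], [0.4]])
loss = loss_with_input_gradient(model, x, y)
```

To actually train on this loss, wrap the call in an outer GradientTape over model.trainable_variables, or move the computation into a custom layer's call and register it with self.add_loss, which is essentially the custom-layer workaround described above.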