Adding a custom loss function for a variational autoencoder in Keras
I am trying to add a custom loss function to a variational autoencoder. Along with the reconstruction loss and the KL divergence, I wish to add a loss based on the difference between the Hamming distances of pairs of inputs and outputs.
The problem I am having is that with or without this extra loss, the results are the same. Could anyone point out what I should be doing to correct it? Is it something to do with dimensions, or something else?
Here is my code snippet:
from keras import backend as K
from keras.layers import Lambda

def ham_loss(y_true, y_pred):
    # Pairwise distance matrix over the batch, from differences of y_pred probabilities
    pairwise_diff_pred = K.expand_dims(y_pred, 0) - K.expand_dims(y_pred, 1)
    pairwise_distance_pred = K.sum(pairwise_diff_pred, axis=-1)
    # Pairwise distance matrix for the inputs
    pairwise_diff_true = K.expand_dims(y_true, 0) - K.expand_dims(y_true, 1)
    pairwise_distance_true = K.sum(pairwise_diff_true, axis=-1)
    # Difference between the distance matrices of y_true and y_pred
    # (`differences` is a helper defined elsewhere; not shown in this snippet)
    hamm_sum = Lambda(differences)([pairwise_distance_true, pairwise_distance_pred])
    print(hamm_sum)
    return K.sum(hamm_sum, axis=-1)

def vae_loss(y_true, y_pred):
    """Loss = reconstruction loss + KL loss + Hamming loss for each sample in the minibatch."""
    # E[log P(X|z)]
    recon = K.sum(K.binary_crossentropy(y_true, y_pred), axis=1)
    # D_KL(Q(z|X) || P(z)); computed in closed form as both distributions are Gaussian
    kl = 0.5 * K.sum(K.exp(z_log_var) + K.square(z_mean) - 1. - z_log_var, axis=1)
    hamming_loss = ham_loss(y_true, y_pred)
    return recon + kl + hamming_loss
Any help much appreciated! Thanks in advance.
python keras autoencoder loss-function
asked Nov 18 '18 at 1:31 by Anil Gaddam, edited Nov 19 '18 at 10:31 by Matthieu Brucher
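For context, a custom loss like vae_loss is attached to the model at compile time; because it closes over the graph tensors z_mean and z_log_var, this pattern matches TF1-era Keras. A minimal sketch follows; the vae model object, the optimizer choice, and the x_train data are assumptions for illustration, not code from the question:

# Sketch: wiring a custom loss into Keras. Only vae_loss itself comes from
# the question; `vae` (the assembled VAE model) and `x_train` (binary input
# data, used as both input and target) are assumptions.
vae.compile(optimizer='rmsprop', loss=vae_loss)
vae.fit(x_train, x_train, epochs=50, batch_size=128)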
Have you tried return recon + kl + 100*hamming_loss? Perhaps your Hamming cost is just several orders of magnitude lower than the rest of your cost? – Matthieu Brucher, Nov 19 '18 at 10:32
Yeah, I did, but it didn't really change the result in a big way! – Anil Gaddam, Dec 6 '18 at 20:19
Probably because of the difference in scale between the costs! – Matthieu Brucher, Dec 6 '18 at 20:49
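One way to act on this suggestion (a sketch, not from the thread) is to log the magnitude of each term during training and put an explicit, tunable weight on the Hamming term; hamming_weight is a hypothetical knob, and z_mean, z_log_var, and ham_loss are taken from the question's snippet:

hamming_weight = 100.0  # hypothetical: tune until the printed terms are comparable in scale

def vae_loss_weighted(y_true, y_pred):
    recon = K.sum(K.binary_crossentropy(y_true, y_pred), axis=1)
    kl = 0.5 * K.sum(K.exp(z_log_var) + K.square(z_mean) - 1. - z_log_var, axis=1)
    ham = ham_loss(y_true, y_pred)
    # K.print_tensor passes its input through unchanged and logs it when the
    # loss is evaluated, so the relative scales of the three terms are visible
    recon = K.print_tensor(recon, message='recon: ')
    kl = K.print_tensor(kl, message='kl: ')
    ham = K.print_tensor(ham, message='ham: ')
    return recon + kl + hamming_weight * ham

Note also that summing signed pairwise differences (K.sum(pairwise_diff_pred, axis=-1) without K.abs) lets positive and negative differences cancel, so the Hamming term may sit near zero regardless of its weight; that would also explain why scaling alone changed nothing.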