How to calculate the gradient in Python/NumPy
I have to implement stochastic gradient descent in NumPy, so I have to define the gradient of this function E:
(The functions f and g used by E are also defined in the image.)
I have no idea how to do this. I tried SymPy and numdifftools, but both libraries give me errors.
How could I write the gradient of the function E?
Thank you
numpy scipy gradient sympy
asked Nov 15 '18 at 12:06 by Ari, edited Nov 16 '18 at 16:39 by RUL
1 Answer
You mean this?

import numpy as np

# g: the activation function (hyperbolic tangent)
def g(x):
    return np.tanh(x / 2)

# f: network output for one input vector x.
# w is the (N, n) weight matrix, v the output weights, b the biases;
# they are passed in explicitly instead of being read from globals.
def f(x, N, n, v, w, b, g):
    sumf = 0
    for j in range(N):               # j runs over the N hidden units (0-based)
        sumi = 0
        for i in range(n):           # i runs over the n input components
            sumi += w[j, i] * x[i]
        sumf += v[j] * g(sumi - b[j])   # bias applied once per hidden unit
    return sumf

answered Nov 15 '18 at 12:10 by Eran Moshe (edited Nov 15 '18 at 12:36)
Hi, this is my activation function inside f. But I have to compute the gradient of the error function E (the first one in the picture). – Ari, Nov 15 '18 at 12:22

You want to implement the first function, E(w, π)? – Eran Moshe, Nov 15 '18 at 12:26

I want to compute the gradient of the function E(w, π): I have already defined this function in Python, but now I need to optimize it with respect to ω using a gradient algorithm. – Ari, Nov 15 '18 at 12:28

You want to write the function F? – Eran Moshe, Nov 15 '18 at 12:31

I've already written the function, but I need the gradient of the error function E in order to optimize it. I need to use the gradient descent algorithm, and before I can use it I need the gradient of E. – Ari, Nov 15 '18 at 13:38
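The comment thread above makes clear that the missing piece is the gradient of E itself. Since the exact form of E is only given in the image, one route that avoids SymPy and numdifftools entirely is a numerical gradient: approximate each partial derivative of E by a central difference. This is only a minimal sketch; it assumes E has been wrapped as a Python function of a single flat parameter vector theta (how w, v and b are packed into that vector is hypothetical here and up to you).

import numpy as np

def numerical_gradient(E, theta, eps=1e-6):
    # Central-difference approximation of dE/dtheta at the point theta.
    # E     : callable mapping a 1-D parameter vector to a scalar error
    # theta : 1-D array holding all parameters (e.g. w, v, b flattened together)
    # eps   : finite-difference step size
    theta = np.asarray(theta, dtype=float)
    grad = np.zeros_like(theta)
    for k in range(theta.size):
        step = np.zeros_like(theta)
        step[k] = eps
        grad[k] = (E(theta + step) - E(theta - step)) / (2.0 * eps)
    return grad

Each gradient costs two evaluations of E per parameter, so this is slow for large networks, but it needs no symbolic machinery and is a useful check against any analytic gradient derived later.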
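With a gradient function in hand (the numerical one above, or an analytic gradient of E if you derive it), the descent loop itself is short. A sketch under the same assumptions, with a hypothetical learning rate eta and iteration count n_iter:

def gradient_descent(E, theta0, eta=0.01, n_iter=1000, eps=1e-6):
    # Plain gradient descent on E starting from theta0.
    # Swap numerical_gradient for an analytic gradient of E if one is available.
    theta = np.asarray(theta0, dtype=float).copy()
    for _ in range(n_iter):
        grad = numerical_gradient(E, theta, eps)  # dE/dtheta
        theta -= eta * grad                       # move downhill
    return theta

For the stochastic variant, evaluate E (and hence the gradient) on a single randomly drawn sample or a small mini-batch at each iteration instead of on the full training set.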