How to calculate the gradient in Python with NumPy

I have to implement stochastic gradient descent in NumPy, so I need to define the gradient of this error function E:

[image: definition of E, together with f and g]

The functions f and g are also defined in the image. I have no idea how to do this: I tried SymPy and numdifftools, but both libraries give me errors. How can I write the gradient of the function E?
Thank you
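Since the exact formula for E is only available in the image, one generic way to sidestep the SymPy/numdifftools errors is SciPy's `approx_fprime`, which estimates a gradient by finite differences. The quadratic `E` below is a placeholder assumption standing in for the real error function:

```python
import numpy as np
from scipy.optimize import approx_fprime

# Placeholder for the real E(w, p) from the image (assumption):
# a simple quadratic whose gradient is known to be 2*w.
def E(w):
    return np.sum(w ** 2)

w0 = np.array([1.0, -2.0, 0.5])
grad = approx_fprime(w0, E, 1e-8)  # forward-difference gradient estimate
print(grad)  # close to 2*w0 = [2, -4, 1]
```

Any callable taking a 1-D array and returning a scalar can be plugged in for `E`, so this works even when a symbolic derivative is hard to obtain.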

Tags: numpy, scipy, gradient, sympy






asked Nov 15 '18 at 12:06 by Ari
edited Nov 16 '18 at 16:39 by RUL
1 Answer
You mean this?

import numpy as np

# g: the activation function
def g(x):
    return np.tanh(x / 2)

# f: sum over units of v[j] * g(inner weighted sum)
# (w and b are now passed in explicitly instead of being read as globals;
# the loops start at 1 to match the 1-based sums in the image)
def f(x, N, n, v, w, b, g):
    sumf = 0
    for j in range(1, N):
        sumi = 0
        for i in range(1, n):
            sumi += w[j, i] * x[i] - b[j]
        sumf += v[j] * g(sumi)
    return sumf
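The code above only evaluates f; the question asks for the gradient of E. When the closed form is awkward, a central-difference estimator is a simple fallback. `E_placeholder` below is an assumption (the real E is in the image); `numerical_gradient` itself is generic and works for any scalar function of a 1-D array:

```python
import numpy as np

# Placeholder for the real E from the image (assumption).
def E_placeholder(w):
    return np.sum(w ** 2)

def numerical_gradient(func, w, eps=1e-6):
    """Central-difference estimate of the gradient of func at w."""
    grad = np.zeros_like(w, dtype=float)
    for k in range(w.size):
        step = np.zeros_like(w, dtype=float)
        step.flat[k] = eps
        # symmetric difference: O(eps**2) error instead of O(eps)
        grad.flat[k] = (func(w + step) - func(w - step)) / (2 * eps)
    return grad

w = np.array([1.0, -2.0, 0.5])
print(numerical_gradient(E_placeholder, w))  # close to 2*w = [2, -4, 1]
```

The central difference is preferred over a one-sided difference here because its truncation error shrinks quadratically in `eps`.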





          • Hi, this is my activation function in f, but I have to take the gradient of the error function E (the first one in the picture).

            – Ari
            Nov 15 '18 at 12:22

          • You want to implement the first function, E(w, pi)?

            – Eran Moshe
            Nov 15 '18 at 12:26

          • I want to compute the gradient of E(w, pi): I have already defined the function in Python, but now I need to optimize it with respect to omega using a gradient algorithm.

            – Ari
            Nov 15 '18 at 12:28

          • You want to write the function F?

            – Eran Moshe
            Nov 15 '18 at 12:31

          • I've already written the function; what I need is the gradient of the error function E so I can run gradient descent.

            – Ari
            Nov 15 '18 at 13:38
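The comments establish that the missing piece is a gradient of E to drive stochastic gradient descent. Here is a minimal SGD skeleton built around a numerically estimated per-sample gradient. The least-squares `E_single` is a stand-in assumption, since the true E is only given in the image:

```python
import numpy as np

# Synthetic noiseless data for illustration (assumption: not the asker's data).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

# Per-sample error: stand-in for one term of the real E (assumption).
def E_single(w, xi, yi):
    return 0.5 * (xi @ w - yi) ** 2

# Central-difference gradient of the per-sample error.
def grad_single(w, xi, yi, eps=1e-6):
    g = np.zeros_like(w)
    for k in range(w.size):
        step = np.zeros_like(w)
        step[k] = eps
        g[k] = (E_single(w + step, xi, yi) - E_single(w - step, xi, yi)) / (2 * eps)
    return g

# SGD: one update per sample, samples shuffled each epoch.
w = np.zeros(3)
lr = 0.05
for epoch in range(50):
    for i in rng.permutation(len(X)):
        w -= lr * grad_single(w, X[i], y[i])
print(w)  # approaches true_w = [1, -2, 0.5]
```

Swapping `E_single`/`grad_single` for the real per-sample error (or an analytic gradient, once derived) leaves the loop unchanged.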













answered Nov 15 '18 at 12:10 by Eran Moshe (edited Nov 15 '18 at 12:36)