PHP uses more memory than the file required



























Take the given example code:



<?php

if (! function_exists('human_filesize')) {
    function human_filesize($size, $precision = 2, $step = 1000)
    {
        $i = 0;
        $units = ['B', 'KB', 'MB', 'GB', 'TB', 'PB', 'EB', 'ZB', 'YB'];

        while (($size / $step) > 0.9) {
            $size = $size / $step;
            $i++;
        }

        return round($size, $precision) . ' ' . $units[$i];
    }
}

if (! function_exists('dd')) {
    function dd(...$vars)
    {
        foreach ($vars as $var) {
            var_dump($var);
        }

        die();
    }
}

$start = microtime(true);
$usage = memory_get_usage(true);

require "brown_corpus.php"; // It's 1.6 MB on disk

$time   = round(microtime(true) - $start, 3);
$memory = human_filesize(memory_get_usage(true) - $usage);

dd($time, $memory); // 0.063 s to run | 38.01 MB memory used


brown_corpus.php is 1.6 MB on disk, but after it's required the script reports 38.01 MB of memory in use. I've been doing some reading and I'm wondering whether this is because PHP compiles required files into opcodes for faster execution. Can someone enlighten me on the pros and cons of this? For example, if I search for keys within an array defined in that required file, is the lookup now faster because of the way PHP has compiled the file?







































      php memory memory-management






      asked Nov 24 '18 at 17:33









Luka

























1 Answer



























          The size of the file has no bearing on the amount of memory it consumes. Without seeing brown_corpus.php, it's impossible to know how it is consuming memory, but keep in mind that code is (usually) a condensed way of describing the structures that it actually creates. Consider the following:



$arr = array();
for ($i = 0; $i < 100000; $i++) {
    $arr[$i] = $i;
}


Save this into a PHP file and it occupies about 70 bytes on disk. Run it and it creates an array structure containing 100,000 elements, each holding an 8-byte integer. Boom, 800 kilobytes used just for the values.
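You can verify this on your own machine with the same before/after `memory_get_usage()` pattern the question uses. A minimal sketch (the exact numbers are an assumption and will vary with PHP version and build; on a 64-bit PHP 7+ the real figure is well above the 800 KB payload):

```php
<?php
// Measure what the 100,000-element array from the example actually costs.

$before = memory_get_usage();

$arr = [];
for ($i = 0; $i < 100000; $i++) {
    $arr[$i] = $i;
}

$after = memory_get_usage();

// The raw payload is 100,000 * 8 bytes = 800 KB, but the hashtable
// backing a PHP array adds considerable overhead on top of that.
printf("payload: %d bytes\n", 100000 * 8);
printf("actual:  %d bytes\n", $after - $before);
```

Running this shows the gap between the bytes your data "should" take and the bytes the engine's structures actually consume, which is the same gap the question is observing with brown_corpus.php.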



          In practice it's far worse than this because of how PHP is configured on your system, to say nothing of various sorts of overheads PHP imposes (the manner in which it stores arrays, for example, boggles the mind -- see https://nikic.github.io/2011/12/12/How-big-are-PHP-arrays-really-Hint-BIG.html).






























          • Those articles seem to really explain this well. brown_corpus contains a returned array of lots of key / value pairs. So that would explain how more bytes are being consumed when required into memory. I wonder if a DB look up might be better, but the script will surely take longer to run, yet be less memory intensive. Giving bounty shortly...

            – Luka
            Nov 28 '18 at 13:51













          • Yes, and different types use widely ranging amounts of memory. You can do memory_get_usage() before and after any variable assignments to get a rough idea how much is being used for that type or structure, but the values will vary from system to system. If you're looking to compare different approaches timing-wise, I'd recommend using microtime() to measure the time taken by each. See the first example on php.net/manual/en/function.microtime.php.

            – Phl3tch
            Nov 28 '18 at 21:23
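The before/after measurement the comment describes can be sketched as follows; the workload here (building and summing 100,000 integers) is purely illustrative, not the asker's actual code:

```php
<?php
// Wrap any candidate approach in the same timing/memory bracket
// to compare alternatives on equal footing.

$t0 = microtime(true);
$m0 = memory_get_usage();

$data  = range(1, 100000);   // illustrative workload: allocate 100k integers
$total = array_sum($data);

$elapsed = microtime(true) - $t0;
$used    = memory_get_usage() - $m0;

printf("time:   %.4f s\n", $elapsed);
printf("memory: %d bytes\n", $used);
```

Swapping the workload lines for, say, a database lookup versus a `require` of the corpus file would give directly comparable time and memory figures for the trade-off Luka is weighing.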












          answered Nov 28 '18 at 1:00









Phl3tch
