
Why does an infinitely recursive function in PHP cause a segfault?



A hypothetical question for you all to chew on...





I recently answered another question on SO where a PHP script was segfaulting, and it reminded me of something I have always wondered, so let's see if anyone can shed any light on it.





Consider the following:







<?php

function segfault($i = 1) {
    echo "$i\n";
    segfault($i + 1);
}

segfault();

?>







Obviously, this (useless) function recurses infinitely, and will eventually run out of memory because each call to the function is made before the previous one has finished. Sort of like a fork bomb without the forking.





But... eventually, on POSIX platforms, the script will die with SIGSEGV (it also dies on Windows, but more gracefully - so far as my extremely limited low-level debugging skills can tell). The number of recursions varies depending on the system configuration (memory allocated to PHP, 32-bit/64-bit, etc.) and the OS, but my real question is: why does it die with a segfault?





  • Is this simply how PHP handles "out-of-memory" errors? Surely there must be a more graceful way of handling this?



  • Is this a bug in the Zend engine?



  • Is there any way this can be controlled or handled more gracefully from within a PHP script?



  • Is there any setting that generally controls the maximum number of recursive calls that can be made in a function?
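As a point of comparison for the last two questions: stock PHP has no built-in recursion limit, but a depth guard can be hand-rolled inside the script itself. This is a minimal sketch; the function name `bounded` and the limit of 1000 are arbitrary choices for illustration, not anything PHP provides.

```php
<?php
// Hypothetical guard: thread an explicit depth counter through the
// recursion and bail out with a catchable exception before the C stack
// is exhausted. The limit of 1000 is an arbitrary illustrative value.
function bounded($i = 1, $depth = 0) {
    if ($depth >= 1000) {
        throw new RuntimeException("recursion limit of 1000 reached at i=$i");
    }
    bounded($i + 1, $depth + 1);
}

try {
    bounded();
} catch (RuntimeException $e) {
    // The script keeps running instead of the whole process dying.
    echo "Caught: " . $e->getMessage() . "\n";
}
```

The obvious cost is that every recursive function has to carry the counter explicitly, which is exactly the kind of bookkeeping an engine-level limit would make unnecessary.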




Comments

  1. If you use XDebug, there is a maximum function nesting depth which is controlled by an ini setting:

    $foo = function() use (&$foo) {
        $foo();
    };
    $foo();


    Produces the following error:


    Fatal error: Maximum function nesting level of '100' reached, aborting!


    This IMHO is a far better alternative than a segfault, since it only kills the current script, not the whole process.
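For reference, the limit in that error message comes from Xdebug's `xdebug.max_nesting_level` ini setting (100 is its default, which matches the message above). A minimal php.ini sketch for raising it, assuming the Xdebug extension is loaded:

```ini
; php.ini - requires the Xdebug extension; the value 256 is an
; arbitrary example, not a recommended limit
xdebug.max_nesting_level = 256
```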

    There was a thread on the internals list a few years ago (2006) about exactly this. The comments from one of the core developers were:


    So far nobody had proposed a solution for endless loop problem that
    would satisfy these conditions:


    1. No false positives (i.e. good code always works)
    2. No slowdown for execution
    3. Works with any stack size


    Thus, this problem remains unsolved.


    Now, #1 is quite literally impossible to solve due to the halting problem. #2 is trivial if you keep a counter of stack depth (since you're just checking the incremented stack level on stack push).

    Finally, #3 is a much harder problem to solve. Considering that some operating systems will allocate stack space in a non-contiguous manner, it's not going to be possible to implement with 100% accuracy, since it's impossible to portably get the stack size or usage (for a specific platform it may be possible or even easy, but not in general).

    Instead, PHP should take the hint from XDebug and other languages (Python, etc) and make a configurable nesting level (Python's is set to 1000 by default)....

    Either that, or trap memory allocation errors on the stack to check for the segfault before it happens and convert that into a RecursionLimitException so that you may be able to recover....

  2. I could be totally wrong about this since my testing was fairly brief. It seems that PHP will only segfault if it runs out of memory (and presumably tries to access an invalid address). If the memory limit is set and low enough, you will get an out-of-memory error first. Otherwise, the code segfaults and is handled by the OS.

    Can't say whether this is a bug or not, but the script should probably not be allowed to get out of control like this.

    See the script below. Behavior is practically identical regardless of options. Without a memory limit, it also slows my computer down severely before it's killed.

    <?php
    $opts = getopt('ilrv');
    $type = null;

    // iterative
    if (isset($opts['i'])) {
        $type = 'i';
    }
    // recursive
    else if (isset($opts['r'])) {
        $type = 'r';
    }
    // the two modes are mutually exclusive
    if (isset($opts['i']) && isset($opts['r'])) {
        die("Options -i and -r are mutually exclusive\n");
    }

    if (isset($opts['l'])) {
        ini_set('memory_limit', '64M');
    }

    define('VERBOSE', isset($opts['v']));

    function print_memory_usage() {
        if (VERBOSE) {
            echo memory_get_usage() . "\n";
        }
    }

    switch ($type) {
        case 'r':
            function segf() {
                print_memory_usage();
                segf();
            }
            segf();
            break;
        case 'i':
            $a = array();
            for ($x = 0; $x >= 0; $x++) {
                print_memory_usage();
                $a[] = $x;
            }
            break;
        default:
            die("Usage: " . __FILE__ . " <-i or -r> [-l] [-v]\n");
            break;
    }
    ?>

  3. I know nothing about PHP's implementation, but it's not uncommon for a language runtime to leave pages unallocated at the "top" of the stack so that a segfault occurs if the stack overflows. Usually this is handled inside the runtime, and either the stack is extended or a more elegant error is reported, but there could be implementations (and situations in others) where the segfault is simply allowed to escape.

