Comment 9 for bug 1677578

Christian Ehrhardt  (paelzer) wrote :

Also, thanks Vasya for your active participation!

# mem check with
smem | grep www
# driving some requests via
for j in $(seq 1 10); do for i in $(seq 1 100); do (wget http://10.0.4.156/index.php &); done; sleep 11s; done
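
To correlate the request bursts with memory, a small sketch like this can sample the summed RSS of the worker processes between bursts. The process name `php-cgi` is an assumption; substitute whatever lighttpd actually spawns on your box (the demo call at the end samples the current shell so the snippet runs anywhere):

```shell
# Sum the resident set size (RSS, in KB) of all processes whose
# command name matches $1; prints 0 when none are running.
sample_rss() {
    ps -o rss= -C "$1" | awk '{sum += $1} END {print sum + 0}'
}

# Demo against our own shell so the sketch runs anywhere:
sample_rss "$(ps -o comm= -p $$)"
```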

# getting the php imagick dependencies
$ sudo apt-get install php-imagick
$ sudo service lighttpd restart
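
After the restart it is worth confirming that the extension actually loaded. A rough check, assuming the CLI `php` sees a config similar to the web SAPI (not guaranteed, since php.ini can differ per SAPI):

```shell
if command -v php >/dev/null 2>&1; then
    # list loaded modules; the match is case-insensitive since the
    # module registers itself as "imagick"
    php -m | grep -i imagick || echo "imagick not loaded"
else
    echo "php CLI not installed"
fi
```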

And I modified the test code according to our discussion to:
<?php
   header("Content-Type: text/plain");
   ini_set('max_execution_time', 10);
   $image = new \Imagick();
   $image->newImage(8192, 8192, new ImagickPixel('red'), 'jpg');
   echo "Leaking " . memory_get_usage() . "\n";
   # busy-wait until killed, and consume execution time (so no sleep)
   $st_tm = time();
   $diff = 0;
   while (1) {
       if ((time() - $st_tm) > $diff) {
           $diff = time() - $st_tm;
           echo "Waiting to Die " . date('h:i:s') . "\n";
           flush();
       }
   }
?>

But that only leaks about 350 KB per request and never grows beyond that.
While doing so, I think I have found that the processes stay at whatever size they reach.

In my former examples I leaked ~50-60 MB per request, and that was the process size.
Here the size stayed at a few hundred KB, which matches the leak.
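
A quick way to check this plateau hypothesis is to snapshot the per-worker sizes a few times around a burst and see whether the RSS column stops growing. Again, `php-cgi` is my assumption for the FastCGI worker name:

```shell
# Print PID, RSS (KB) and command for each worker a few times in a row;
# if the size is a plateau rather than a cumulative leak, the RSS
# column should stabilize after the first burst of requests.
for i in 1 2 3; do
    ps -o pid=,rss=,comm= -C php-cgi 2>/dev/null || echo "no php-cgi workers"
    sleep 1
done
```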

OTOH, no matter how many requests I threw against it, it never behaved like a leak that slowly adds up.
So maybe it is not a "leak" which accumulates over time, but instead just excessive memory needs?
I'd expect some other maximum barrier to kick in, but let's check this by verifying it next.
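
The candidate barriers can be listed up front. This sketch only reads limits; `memory_limit` is PHP's documented allocator cap, while `ulimit -v` shows any per-process virtual memory limit. FPM/spawn-fcgi child settings depend on how lighttpd spawns PHP here, so they are left out:

```shell
# Per-process virtual memory cap imposed by the shell/init, if any:
ulimit -v
# PHP's own allocator cap (CLI view; the web SAPI may differ):
if command -v php >/dev/null 2>&1; then
    php -r 'echo ini_get("memory_limit"), "\n";'
else
    echo "php CLI not installed"
fi
```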