Comment 13 for bug 1677578

Christian Ehrhardt (paelzer) wrote:

As I said, I was trying to get closer to your case.

# get some pictures
$ apt-get install ubuntu-wallpapers-zesty ubuntu-wallpapers-karmic ubuntu-wallpapers-lucid ubuntu-wallpapers-maverick ubuntu-wallpapers-natty ubuntu-wallpapers-oneiric ubuntu-wallpapers-precise ubuntu-wallpapers-quantal ubuntu-wallpapers-raring ubuntu-wallpapers-saucy ubuntu-wallpapers-trusty ubuntu-wallpapers-utopic ubuntu-wallpapers-vivid ubuntu-wallpapers-wily
$ mkdir /var/www/html/thumbs
$ chgrp www-data /var/www/html/thumbs
$ chmod g+w /var/www/html/thumbs
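
In case permissions are an issue on your side, here is a minimal sketch to check from PHP that the directory prepared above is usable (this assumes the default /var/www/html docroot; adjust the path to yours):
<?php
   # sketch only (assumes the default /var/www/html docroot used above):
   # verify the thumbs directory really is writable for the PHP process
   $thumbs = '/var/www/html/thumbs';
   if (!is_dir($thumbs) || !is_writable($thumbs)) {
     die($thumbs . " is missing or not writable\n");
   }
   echo $thumbs . " is writable\n";
?>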

Processing those took ~16 seconds in my case.
So I duplicated the pictures three times:

$ cd /usr/share/backgrounds
$ for i in *; do cp "$i" "dup_1_$i"; cp "$i" "dup_2_$i"; cp "$i" "dup_3_$i"; done

The code now is:
<?php
   header("Content-Type: text/plain");
   ini_set('max_execution_time', 30);

   $photos = [];
   if ($album_root = opendir("/usr/share/backgrounds")) {
     while (false !== ($entry = readdir($album_root))) {
       if (preg_match('/\.(jpe?g|gif|png)$/i', $entry)) {
         $photos[] = $entry;
       }
     }
     closedir($album_root);
     sort($photos);
     foreach($photos as $photo) {
       echo "Processing " . $photo . " at " . date('h:i:s') . "\n";
       $thumb = 'thumbs/'.$photo;
       if (!file_exists($thumb)) {
         $image = new \Imagick(realpath('/usr/share/backgrounds/'.$photo));
         $image->thumbnailImage(64, 48, true, false);
         $image->writeImage($thumb);
         # free the image buffer explicitly and drop the reference
         $image->clear();
         $image = null;
       }
     }
   }

   echo "Leaking " . memory_get_usage() . "\n";
   # busy wait until killed, and consume execution time (so no sleep)
   $st_tm = time();
   $diff = 0;
   while (1) {
     if ((time() - $st_tm) > $diff) {
       $diff = (time() - $st_tm);
       echo "Waiting to Die " . date('h:i:s') . "\n";
       flush();
     }
   }
?>
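
On the busy wait at the end: on Linux, max_execution_time only counts time spent executing the script itself, so a sleep() based wait would not be aborted by it. A rough sketch of the difference:
<?php
   # sketch of why the script above busy-waits instead of calling sleep():
   # on Linux max_execution_time only charges the script's own execution
   # time, so time spent inside sleep() does not count against the limit.
   ini_set('max_execution_time', 5);
   # sleep(30);       # would NOT be aborted after 5 seconds
   while (true) {}    # WILL be aborted once 5 seconds of execution time are used
?>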

With that it exceeds the 30-second limit just as it does in your case (it needs ~40 seconds).
While it runs I see its memory consumption fluctuating between 80 and 120 MB,
but it never exceeds that, as if garbage collection works as expected.
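
If you want to double-check that on your end, here is a minimal sketch (same paths as above, assumed to run from the same docroot, not the exact script I used) that prints the current and peak PHP memory after every processed image:
<?php
   # sketch only: per-image memory log to check that the buffers get freed
   header("Content-Type: text/plain");
   $dir = '/usr/share/backgrounds';
   foreach (preg_grep('/\.(jpe?g|gif|png)$/i', scandir($dir)) as $photo) {
     $image = new \Imagick($dir . '/' . $photo);
     $image->thumbnailImage(64, 48, true, false);
     $image->writeImage('thumbs/' . $photo);
     $image->clear();
     unset($image);
     # note: Imagick's pixel cache lives largely outside the PHP heap,
     # so also watch the process RSS in top while this runs
     printf("%s cur=%d peak=%d\n", $photo, memory_get_usage(), memory_get_peak_usage());
     flush();
   }
?>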

Many concurrent requests made my system stutter due to CPU consumption, but nothing exceeded any memory limit.
So, while doing the same workload you have, the script gets killed by max_execution_time, but it never exceeded the 128M memory_limit I had set.
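
For completeness, the 128M above is simply the memory_limit I had configured; a quick way to confirm what limit and peak usage a request actually saw (sketch only, add it to any test script):
<?php
   # sketch: report the effective memory_limit and the peak usage of this request
   echo "memory_limit = " . ini_get('memory_limit') . "\n";
   echo "peak usage   = " . memory_get_peak_usage(true) . " bytes\n";
?>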