As I said, I was trying to get closer to your case.

# get some pictures
$ apt-get install ubuntu-wallpapers-zesty ubuntu-wallpapers-karmic ubuntu-wallpapers-lucid ubuntu-wallpapers-maverick ubuntu-wallpapers-natty ubuntu-wallpapers-oneiric ubuntu-wallpapers-precise ubuntu-wallpapers-quantal ubuntu-wallpapers-raring ubuntu-wallpapers-saucy ubuntu-wallpapers-trusty ubuntu-wallpapers-utopic ubuntu-wallpapers-vivid ubuntu-wallpapers-wily
$ mkdir /var/www/html/thumbs
$ chgrp www-data /var/www/html/thumbs
$ chmod g+w /var/www/html/thumbs

Processing those took ~16 seconds in my case, so I duplicated the pictures 3 times:
$ cd /usr/share/backgrounds
$ for i in *; do cp $i dup_1_$i; cp $i dup_2_$i; cp $i dup_3_$i; done
The code now is:
<?php
header("Content-Type: text/plain");
ini_set('max_execution_time', 30);

$photos = [];
if ($album_root = opendir("/usr/share/backgrounds")) {
    while (false !== ($entry = readdir($album_root))) {
        if (preg_match('/\.(jpe?g|gif|png)$/i', $entry)) {
            $photos[] = $entry;
        }
    }
    closedir($album_root);
    sort($photos);
    foreach ($photos as $photo) {
        echo "Processing " . $photo . " at " . date('h:i:s') . "\n";
        $thumb = 'thumbs/' . $photo;
        if (!file_exists($thumb)) {
            $image = new \Imagick(realpath('/usr/share/backgrounds/' . $photo));
            $image->thumbnailImage(64, 48, true, false);
            $image->writeImage($thumb);
            $image->clear(); # clear() is a method call; "$image->clear;" would be a no-op
            $image = null;
        }
    }
}
echo "Leaking " . memory_get_usage() . "\n";

# busy-wait until killed; it has to burn CPU time, since time spent in
# sleep() would not count against max_execution_time
$st_tm = time();
$diff = 0;
while (1) {
    if ((time() - $st_tm) > $diff) {
        $diff = (time() - $st_tm);
        echo "Waiting to Die " . date('h:i:s') . "\n";
        flush();
    }
}
?>
With that it exceeds the 30 seconds just as it does in your case (it needs ~40).
While running I see memory consumption fluctuating between 80 and 120 MB,
but never exceeding that, as if garbage collection works as expected.
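That flat profile makes sense: the script clears and dereferences each Imagick object inside the loop, so only one image is held at a time. A minimal standalone sketch of just that pattern (my own illustration, not part of the test above; the glob() call is an assumption for brevity):

<?php
# Sketch: free each Imagick object per iteration and watch usage stay flat.
$files = glob('/usr/share/backgrounds/*.{jpg,jpeg,png,gif}', GLOB_BRACE);
foreach ($files as $file) {
    $image = new \Imagick($file);
    $image->thumbnailImage(64, 48, true, false);
    $image->clear();  # release the pixel buffers right away
    $image = null;    # drop the last reference so the object can be freed
    echo memory_get_usage() . " (peak " . memory_get_peak_usage() . ")\n";
}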
Many concurrent requests got my system stuttering from CPU consumption, but nothing exceeded any memory limit.
So, while it was killed by max_execution_time doing the same workload as yours, it never exceeded the 128M memory_limit I had set.
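If you want to double-check the limits a request actually runs under, something like this (my own snippet, not part of the test) prints them:

<?php
header("Content-Type: text/plain");
# Print the effective limits; in my setup these were 128M and 30 seconds.
echo "memory_limit:       " . ini_get('memory_limit') . "\n";
echo "max_execution_time: " . ini_get('max_execution_time') . "\n";
echo "peak memory so far: " . memory_get_peak_usage(true) . " bytes\n";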