Comment 17 for bug 1677578

Christian Ehrhardt (paelzer) wrote:

The only thing left that came to mind was a single super-huge image.
char="x"; sz=8192; convert -size ${sz}x${sz} xc:white ${char}.png; ll -h ${char}.png
char="y"; sz=16384; convert -size ${sz}x${sz} xc:white ${char}.png; ll -h ${char}.png

It might be interesting that a newer ImageMagick version (8:6.9.7.4+dfsg-2ubuntu3) refused to create an image of that size on the command line.
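In case someone wants to check which resource limits that ImageMagick build enforces, something like this should show them (the policy file path is the usual Ubuntu location and an assumption on my side, not verified for every release):
identify -list resource               # limits the library currently applies (width, height, memory, disk, ...)
cat /etc/ImageMagick-6/policy.xml     # system-wide policy the limits usually come from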

And I think that did trigger what we are looking for, at least to some extent.
I got a /usr/bin/php-cgi process that consumed 4G of memory and 800M of swap while the 128M limit was active.
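To double-check what the CGI SAPI actually applies, a quick grep against its phpinfo output is enough (the output is HTML-ish, but the values show up fine):
php-cgi -i | grep -E 'memory_limit|max_execution_time'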

Some more tests showed that the 8192x8192 image is handled normally.
Only the 16384x16384 one bends the limits.
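For reproducing that observation, a rough sketch of what to watch (the URL is only a placeholder for whatever test page loads the 16384x16384 image, so adapt it):
curl -s http://localhost/imagick-test.php >/dev/null &   # placeholder test page
watch -n1 'ps -o pid,rss,vsz,cmd -C php-cgi'             # RSS of the workers while it runs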

Within the usual process structure it was one of the worker processes that stayed active that way:
/usr/sbin/lighttpd -D -f /etc/lighttpd/lighttpd.conf
  \_ /usr/bin/php-cgi
      \_ /usr/bin/php-cgi <=== X
      \_ /usr/bin/php-cgi
      \_ /usr/bin/php-cgi
      \_ /usr/bin/php-cgi
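To pick out which of the workers is the one holding the memory, plain ps is enough (nothing specific to this setup):
ps --forest -o pid,ppid,rss,cmd -C lighttpd,php-cgi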

Hitting max_execution_time only means the request stops being processed; it does not kill and re-fork the php-cgi process.
The real bug here, IMHO, is that the combination we have found allows PHP to exceed its memory limit.