Comment 9 for bug 602767

Alex Ruddick (alexrudd0) wrote :

Oops - either LP didn't send me a bug mail or I missed it.

>> I think there might be a better workaround.

1) When memory is low, don't spawn a parallel thumbnail process for every executable in the folder -- do them one at a time instead. This probably accounts for most of the cases of deadlock.
2) Have the thumbnailer script determine how many other instances of itself are running, and if their total memory use is too high, wait until one finishes. Ideally this would happen in some sort of intelligent order, but that probably requires code in Nautilus.<<
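The second suggestion in the quote above could be sketched roughly as follows. This is only an illustration, not the actual patch: the function names are hypothetical, and it assumes a Linux-style /proc filesystem where VmRSS in /proc/<pid>/status gives a process's resident memory in kB.

```python
def rss_kb(pid):
    """Resident set size of a process in kB, read from /proc/<pid>/status.

    Returns 0 if the process has exited or /proc is unavailable.
    """
    try:
        with open("/proc/%d/status" % pid) as f:
            for line in f:
                if line.startswith("VmRSS:"):
                    return int(line.split()[1])
    except OSError:
        pass
    return 0

def should_spawn(thumbnailer_pids, limit_kb):
    """Allow another thumbnailer only if the running ones fit in the budget.

    `thumbnailer_pids` would come from scanning the process list for other
    instances of the thumbnailer script (hypothetical; not shown here).
    """
    return sum(rss_kb(p) for p in thumbnailer_pids) < limit_kb
```

The caller would sleep and re-check when `should_spawn` returns False, which serializes thumbnailing under memory pressure instead of deadlocking.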

This would probably be a welcome improvement, but I think it can be considered separately. My problem is with large files: whether processed one at a time or in parallel, they still hose the system.

>> 3) If even one at a time is too many (e.g. a multi-gigabyte file), then blacklist it. Rather than 100 MB, I would set this at something dynamic, like half the system RAM.<<

I didn't update my description of the patch when I posted a new one. It now reads the Nautilus thumbnail limit from GConf and respects that.
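For anyone following along, reading that limit outside the patch could look something like this sketch. The GConf key path shown is an assumption (Nautilus historically kept its preferences under /apps/nautilus/preferences), and the fallback default is arbitrary:

```python
import subprocess

# Assumed key path; Nautilus's maximum file size to thumbnail, in bytes.
GCONF_KEY = "/apps/nautilus/preferences/thumbnail_limit"

def thumbnail_limit(default=10 * 1024 * 1024):
    """Ask GConf for the thumbnail size limit, falling back to a default
    when gconftool-2 is missing or the key is unset."""
    try:
        out = subprocess.run(
            ["gconftool-2", "--get", GCONF_KEY],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        return int(out)
    except (OSError, subprocess.CalledProcessError, ValueError):
        return default

def should_thumbnail(file_size, limit):
    """Skip thumbnailing any file larger than the configured limit."""
    return file_size <= limit
```

A dynamic cap like "half the system RAM" would just replace the `default` argument.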

Should I try to upstream it? Change the patch? I'd like to get this into Natty so I don't have to keep a local diff lying around.