Comment 3 for bug 1051935

Barry Warsaw (barry) wrote : Re: [Bug 1051935] Re: Fails with SystemError when too many files are open

On Oct 03, 2012, at 11:56 PM, Jason Conti wrote:

>import apt
>
>while True:
>    apt_cache = apt.Cache()
>
>It will eventually consume all of the file descriptors and crash with
>the above SystemError. I was not really successful in reproducing with
>straight apt_pkg, though apt_pkg.PackageRecords(self._cache) seems to be
>the line opening the files. It seems like the issue is that the objects
>are not being garbage collected fast enough. If,
>
>gc.collect()
>
>is run each loop, it seems to oscillate between 59 and 114 fds, instead of
>increasing rapidly until python runs out. Even better is:
>
>del apt_cache._records
>
>which oscillates between 3 and 59 fds. So a fix might be to add an
>explicit close() method to apt.Cache which deletes at least the _records
>object, and perhaps other objects.
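For reference, descriptor counts like those quoted above can be measured on Linux by listing /proc/self/fd (a sketch; the original report does not say how the counts were obtained):

```python
# Linux-only sketch: count this process's open file descriptors by
# listing /proc/self/fd; each entry there is one open descriptor.
import os

def open_fd_count():
    return len(os.listdir("/proc/self/fd"))
```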

It's generally a bad idea to rely on garbage collection to free up external
resources like file descriptors. It's much better to add an explicit API for
releasing those resources. Your example above would be better written like
so (if the API were available in pyapt):

    while True:
        apt_cache = apt.Cache()
        apt_cache.close()

Ideally, that would run forever.
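A sketch of what that hypothetical close() might look like, using a stand-in class (the attribute name _records follows the report above; the real python-apt internals may differ):

```python
import os

class Cache:
    """Stand-in for apt.Cache; _records mimics the object that
    holds file descriptors open, per the report above."""

    def __init__(self):
        self._records = open(os.devnull)

    def close(self):
        # Release the descriptors explicitly rather than waiting
        # for the garbage collector; safe to call more than once.
        if self._records is not None:
            self._records.close()
            self._records = None

# Wrapping the call in try/finally guarantees the release even
# if the code using the cache raises:
cache = Cache()
try:
    pass  # do something with cache
finally:
    cache.close()
```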

Another perhaps useful idea, or at least another way of thinking about it,
would be to implement the context manager protocol, so you could do something
like this:

    while True:
        with apt.Cache() as cache:
            # do something with cache
            pass

and this too would guarantee that those file descriptors are freed.
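A minimal sketch of that context-manager support, again on a stand-in class (it assumes a close() method like the one proposed above; apt.Cache itself does not provide this):

```python
import os

class Cache:
    def __init__(self):
        # Stand-in for whatever holds descriptors open in apt.Cache.
        self._records = open(os.devnull)

    def close(self):
        if self._records is not None:
            self._records.close()
            self._records = None

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # Runs on normal exit and on exceptions alike.
        self.close()
        return False  # never suppress the exception

with Cache() as cache:
    pass  # do something with cache
# on exit from the with block, the descriptors are closed
```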