Reading large files from GS import context hogs RAM
Parallel to the export problem reported in:
importing large files into the site can also cause excessive
memory use: the entire content of the file must be held in
memory as a single string.
Some contexts could support a more efficient pattern: returning a
file handle (or file-like object) that can be used for chunked
reads. The attached patch allows such contexts to implement a new
interface, 'IChunkedImport', which offers a new API,
'openDataFile'. It also implements that interface for
DirectoryImport.
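A minimal sketch of the pattern described above. The class and method
names other than 'openDataFile' are assumptions for illustration, not
the actual patch: a directory-backed import context grows an
'openDataFile' method returning a file handle, and a consumer streams
from it in chunks, falling back to the old all-in-memory
'readDataFile' API when the context does not offer the new one.

```python
import os
import shutil


class DirectoryImportContext:
    """Sketch of a directory-backed import context (names assumed)."""

    def __init__(self, profile_path):
        self._profile_path = profile_path

    def readDataFile(self, filename):
        # Existing pattern: the whole file ends up in memory as one string.
        path = os.path.join(self._profile_path, filename)
        if not os.path.exists(path):
            return None
        with open(path, 'rb') as f:
            return f.read()

    def openDataFile(self, filename):
        # Proposed pattern: hand back an open file object so callers
        # can do chunked reads instead of loading everything at once.
        path = os.path.join(self._profile_path, filename)
        if not os.path.exists(path):
            return None
        return open(path, 'rb')


def import_data_file(context, filename, dest, chunk_size=1 << 16):
    """Consumer that prefers chunked reads when the context supports them."""
    open_method = getattr(context, 'openDataFile', None)
    stream = open_method(filename) if open_method is not None else None
    if stream is None:
        # Fall back to the all-in-memory API for older contexts.
        dest.write(context.readDataFile(filename))
        return
    try:
        # Copies in fixed-size chunks; memory use stays bounded.
        shutil.copyfileobj(stream, dest, chunk_size)
    finally:
        stream.close()
```

With this shape, a caller never needs to know which kind of context it
was given; the `getattr` probe plays the role an interface check like
`IChunkedImport.providedBy(context)` would play in the real patch.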