Comment 9 for bug 125067

On Thu, 2007-08-09 at 19:04 +0000, James Y Knight wrote:
> Well, limiting a pack file to some reasonable number like 100MB seems
> pretty sane. But other than that, it seems to me that making the
> downloader able to resume the download if interrupted is clearly the
> right solution.

Versioning of .iso's or similar large documents will grow packs to
arbitrary sizes unless we add bittorrent-like splitting and recombining
of data into the mix. Further, there are round-trip costs involved in
accessing separate packs, so we really want to keep the total number of
packs manageable. I'm currently working on an upper pack count equal to
the sum of the digits in the revision count, which gives logarithmic
backoff without rewriting on every commit.
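As a rough sketch of the policy described above (hypothetical code, not
the actual bzrlib implementation), the bound would look like:

```python
def max_pack_count(revision_count):
    """Upper bound on the number of pack files: the sum of the
    decimal digits of the total revision count.  A hypothetical
    sketch of the policy described above, not bzrlib's real code."""
    return sum(int(d) for d in str(max(revision_count, 1)))

# The bound grows roughly logarithmically with history size,
# and dips at each power of ten (forcing a repack):
#   9 revisions   -> up to 9 packs
#   10 revisions  -> up to 1 pack
#   99 revisions  -> up to 18 packs
#   100 revisions -> up to 1 pack
```

The point is that most commits just add a small pack, and a larger
combining repack is only triggered occasionally, so no single commit
has to rewrite all the data.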

Individual packs are currently inserted atomically, and the temporary
file is discarded. If you wanted to design a way to reuse such things,
remember that we are not blind-copying the remote packs, because in the
general case we don't want all the data they contain (though detecting
when we do want it all and blind-copying would be a nice optimisation).
So it's not a matter of looking for the same pack name in a special
area or something like that.

-Rob

--
GPG key available at: <http://www.robertcollins.net/keys.txt>.