fast-export out of memory error
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
Bazaar Fast Import | Confirmed | Undecided | Unassigned |
Bug Description
I'm trying to export my branch; however, it fails with 'out of memory'.
The bzr.log file contains...
```
Thu 2011-07-28 15:53:33 +1000
0.046 bazaar version: 2.4b5
0.046 bzr arguments: [u'fast-export', u'C:/bzrtest2/
0.062 looking for plugins in C:/Users/
0.062 looking for plugins in C:/Program Files (x86)/Bazaar/
0.109 encoding stdout as sys.stdin encoding 'cp850'
[ 5940] 2011-07-28 15:53:33.618 INFO: 15:53:33 Calculating the revisions to include ...
[ 5940] 2011-07-28 15:53:33.634 INFO: 15:53:33 Starting export of 827 revisions ...
1.076 Adding the key (<bzrlib.
...SNIP lots of similar lines...
82.926 Adding the key (<bzrlib.
84.268 Transferred: 0kB (0.0kB/s r:0kB w:0kB)
84.268 Traceback (most recent call last):
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "bzrlib\
  File "C:/Program Files (x86)/Bazaar/
  File "C:/Program Files (x86)/Bazaar/
  File "C:/Program Files (x86)/Bazaar/
  File "C:/Program Files (x86)/Bazaar/
  File "fastimport\
  File "fastimport\
MemoryError
84.268 return code 3
```
What can I do to work around this?
I'm running Windows 7 with 12GB of RAM. Perhaps I could run a 64-bit build, increase the cache limits, or split the export up? I've already tried repacking texts and pulling a fresh branch. This is fairly urgent, if anyone can help.
affects: python-fastimport → bzr-fastimport
On Thu, 2011-07-28 at 05:58 +0000, Greg wrote:
> Public bug reported:
>
> I'm trying to export my branch, however it fails with 'out of memory'.
> The bzr.log file contains...
[...]
> What can I do to work around this? I'm running Windows 7 with 12GB RAM.
> Perhaps I can run 64bit, increase cache limits? Split it up? I've
> tried repacking texts and pulling a fresh branch. This is fairly
> urgent if anyone can help.
How big are the files in the repository?
Bazaar doesn't deal with big files very well at the moment, and may keep
several copies of a file in memory while processing it. This is fine for
smaller files, but it quickly becomes an issue for anything that is a
couple of hundred megabytes in size.
Cheers,
Jelmer
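To answer Jelmer's question, one way to check for oversized files is a quick scan of the working tree. The sketch below is not part of Bazaar itself; the 200 MB threshold is an illustrative assumption based on the "couple of hundred megabytes" figure above, not a documented limit:

```python
import os

def find_large_files(root, threshold=200 * 1024 ** 2):
    """Return (path, size) pairs for files at or above `threshold` bytes,
    largest first, skipping Bazaar's own .bzr control directory."""
    large = []
    for dirpath, dirnames, filenames in os.walk(root):
        # Don't descend into the repository's internal storage.
        dirnames[:] = [d for d in dirnames if d != '.bzr']
        for name in filenames:
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            if size >= threshold:
                large.append((path, size))
    return sorted(large, key=lambda item: item[1], reverse=True)

if __name__ == '__main__':
    for path, size in find_large_files('.'):
        print('%10d  %s' % (size, path))
```

Note this only inspects the current working tree; a file that was large in an earlier revision but has since been deleted would not show up here, even though fast-export still has to process it.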