snapshots very slow when there is a long directory listing
Bug #1185376 reported by James Tunnicliffe
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
linaro-license-protection | Fix Released | High | James Tunnicliffe |
Bug Description
http://
Related branches
lp:~dooferlad/linaro-license-protection/bigdir
- Milo Casagrande (community): Approve
Diff: 42 lines (+16/-3), 2 files modified:
- HACKING (+2/-0)
- license_protected_downloads/buildinfo.py (+14/-3)
Changed in linaro-license-protection:
status: New → Confirmed
importance: Undecided → High
Changed in linaro-license-protection:
assignee: nobody → James Tunnicliffe (dooferlad)
status: Confirmed → In Progress
Changed in linaro-license-protection:
status: In Progress → Fix Released
I can see the problem locally when there is a large quantity of data in a directory. Having 1000 small files doesn't slow it down much, but 50 large ones does. It can be recreated by running the following in a directory under sampleroot and then starting the dev server:
#!/usr/bin/python
# Creates 50 files of ~200 MB each (a 20 KB blob written 10000 times),
# roughly 10 GB in total.
blob = "0" * 1024 * 20
for index in range(0, 50):
    with open("file_%d.txt" % index, "w") as f:
        for chunk in range(10000):
            f.write(blob)
We just need to profile to find out what the issue is. Since the quantity of data seems to be the key factor, I suspect that on each listing every file in the directory is being read completely.
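A quick way to confirm that would be to profile the listing code path with cProfile. Below is a minimal sketch; list_files_in is a hypothetical stand-in for whatever helper actually builds the listing, not a name from the codebase:

#!/usr/bin/python
# Hedged sketch: profile a directory listing to see where the time goes.
# list_files_in is hypothetical; substitute the project's real helper.
import cProfile
import os
import pstats

def list_files_in(path):
    # A cheap listing should only stat entries, never read their contents.
    return [(name, os.stat(os.path.join(path, name)).st_size)
            for name in os.listdir(path)]

cProfile.run("list_files_in('sampleroot')", "listing.prof")
pstats.Stats("listing.prof").sort_stats("cumulative").print_stats(20)

If the real helper shows most of its cumulative time in read() or hashing calls, that would confirm file contents are being read on every listing.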