Comment 3 for bug 1971932

Paride Legovini (paride) wrote :

Well, "large" can span orders of magnitude, depending on who you ask. :-)

Can you please try to identify some steps to reproduce that include generating the big file? For example:

  # 100MB file, random data (difficult to compress)
  head -c 100M /dev/urandom > foo

  # 100MB file, all 0s (easy to compress)
  head -c 100M /dev/zero > foo

Then you can try to rsync it via ssh to a different directory on localhost (I assume you're using ssh and not the rsync daemon protocol). Thanks!