Comment 108 for bug 453579

Starcraftmazter (starcraftmazter) wrote : Re: corruption of large files reported with linux 2.6.31-14.46 on ext4

@aldebx

Of course I realise this; perhaps I need to elaborate on my idea. Since the error apparently occurs when large files are edited, a test could be devised whereby changes are made to a large file and saved, then undone and saved again, and the checksums from before and after compared, to see whether there really is a problem with writing large files.

Furthermore, since the problem allegedly happens around the 512 MB mark, my idea is to write a program that takes X blocks before and after that point and swaps them. X must be even to ensure every block is swapped with another; I am thinking of swapping the 1000 blocks before the point with the 1000 blocks after it. Using fsync and running the program twice should ensure that both changes are written and that the second run undoes the first. So if two hashes of the file are taken, one before and one after the experiment, they will be identical if no problem occurred, or different if there is in fact a problem.
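Something like the following Python sketch is what I have in mind. The 4096-byte block size, the 1000-block count and the file path are just placeholder assumptions, not anything taken from the bug reports, so adjust as needed:

#!/usr/bin/env python3
# Sketch of the swap-and-restore test described above (assumptions:
# 4096-byte blocks, 1000 blocks on each side of the 512 MB mark).
import hashlib
import os
import sys

BLOCK_SIZE = 4096          # assumed filesystem block size
MARK = 512 * 1024 * 1024   # the 512 MB point mentioned above
NUM_BLOCKS = 1000          # blocks on each side of the mark to swap

def file_sha256(path):
    # Hash the whole file in 1 MB chunks.
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b''):
            h.update(chunk)
    return h.hexdigest()

def swap_blocks(path):
    # Swap the NUM_BLOCKS blocks just before the mark with the
    # NUM_BLOCKS blocks just after it, then fsync so the changes
    # actually hit the disk.
    with open(path, 'r+b') as f:
        for i in range(NUM_BLOCKS):
            before_off = MARK - (i + 1) * BLOCK_SIZE
            after_off = MARK + i * BLOCK_SIZE
            f.seek(before_off); a = f.read(BLOCK_SIZE)
            f.seek(after_off);  b = f.read(BLOCK_SIZE)
            f.seek(before_off); f.write(b)
            f.seek(after_off);  f.write(a)
        f.flush()
        os.fsync(f.fileno())

if __name__ == '__main__':
    path = sys.argv[1]     # a file larger than 512 MB + 1000 blocks
    h_before = file_sha256(path)
    swap_blocks(path)      # first run: scramble the blocks
    swap_blocks(path)      # second run: undo the first swap
    h_after = file_sha256(path)
    print('OK' if h_before == h_after else 'MISMATCH: possible corruption')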

So my question is: would this be a good test to run? I will probably have time to do it tomorrow.