Matthew Garrett asked me to test the patch linked below, which appears to solve the problem. I can't trigger the bug manually, so I had to test by whether the system crashes within a reasonable time or not.
Without the patch, I typically crash in about an hour**, give or take 45 minutes; from a few (6) approximate deviations I get a standard deviation of 27.386 minutes*.
I've been up for 17h35m now, which puts me about 36 standard deviations above the mean (995 minutes past the 60-minute mean, divided by 27.386). Five standard deviations already corresponds to roughly 99.99997% confidence that the result is non-chance; at 36, the sun should burn out before a crash this late happens by chance.
In short, I'm pretty certain the patch below fixes the problem:
http://webcvs.freedesktop.org/dri/drm/shared-core/via_mm.c?r1=1.21&r2=1.22&makepatch=1&diff_format=u
In case anyone doesn't get it, sizeof(int) != sizeof(long) on 64-bit systems.
*My deviations from the 60-minute mean are {25,30,15,45,10,25}, in minutes. It doesn't matter whether they're negative or positive; the standard deviation is (sum(deviation^2)/n)^(1/2), so the sign is lost in the squaring.
**My mean and deviations are rough approximations. I have never made it to 2 hours, but I have made an hour, and I have also had failures within 10 minutes. Even under these rough assumptions, the increased system stability is attributable to the patch rather than to chance at better than 99.999% confidence.