The debug log shows the fade type being set to FADE_TYPE_GAMMA_RAMP in check_gamma_extension, due to the presence of the XF86VM extension. The xtrace log shows XF86VidModeGetGammaRampSize in gamma_fade_setup returning a size of zero, at which point the fade type is changed to FADE_TYPE_GAMMA_NUMBER. XF86VidModeGetGamma then successfully returns the current gamma value.
However, when trying to set the gamma in xf86_whack_gamma, XF86VidModeSetGamma fails with a BadValue error on the first iteration (as shown in xtrace), even though all the values are sane. KVM appears not to handle this request properly, and since there is no way to detect this beforehand, the error should be trapped correctly.
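A minimal sketch of how such trapping could look, using a temporary Xlib error handler around the XF86VidModeSetGamma request. The names ignore_badvalue_handler, got_gamma_error, and safe_set_gamma are illustrative, not taken from the actual source; the real code would presumably also chain unrelated errors to the previous handler.

```c
/* Hypothetical sketch: trap the BadValue error raised by
 * XF86VidModeSetGamma on KVM so it does not kill the client.
 * Helper names here are illustrative, not from the real code. */
#include <X11/Xlib.h>
#include <X11/extensions/xf86vmode.h>

static Bool got_gamma_error = False;

static int
ignore_badvalue_handler (Display *dpy, XErrorEvent *ev)
{
  if (ev->error_code == BadValue)
    got_gamma_error = True;   /* remember the failure, don't abort */
  /* unrelated errors could be forwarded to the old handler here */
  return 0;
}

static Bool
safe_set_gamma (Display *dpy, int screen, XF86VidModeGamma *gamma)
{
  XErrorHandler old = XSetErrorHandler (ignore_badvalue_handler);
  got_gamma_error = False;
  XF86VidModeSetGamma (dpy, screen, gamma);
  XSync (dpy, False);         /* flush so the error, if any, arrives now */
  XSetErrorHandler (old);
  return !got_gamma_error;
}
```

If safe_set_gamma returns False on the first iteration, the caller could demote the fade type (e.g. give up on FADE_TYPE_GAMMA_NUMBER and skip the gamma fade entirely) instead of repeating a request the server will never honor. Note the XSync: X errors are asynchronous, so without it the BadValue could be reported after the handler has already been restored.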