Tried the kernel, but it didn't seem to fix my Mac Mini.
uname -a
Linux macmini 2.6.28-11-generic #43~lp349314apw5 SMP Tue Apr 21 16:44:04 UTC 2009 x86_64 GNU/Linux
Still says:
(**) intel(0): Tiling enabled
(EE) intel(0): Failed to set tiling on front buffer: rejected by kernel
(EE) intel(0): Failed to set tiling on back buffer: rejected by kernel
(EE) intel(0): Failed to set tiling on depth buffer: rejected by kernel
in my Xorg.0.log.
glxgears gives me:
get fences failed: -1
param: 6, val: 0
and because of this, XBMC runs very slowly now.
Am I missing a setting somewhere?
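In case it helps anyone answering: the only tiling-related setting I'm aware of is the driver option documented in the intel(4) man page. Whether it applies to this bug is an assumption on my part, but a minimal xorg.conf fragment to force tiling off would look like this (the Identifier is just a placeholder):

```
Section "Device"
    Identifier "Configured Video Device"
    Driver     "intel"
    # Disable tiling as a workaround; the driver should fall back
    # to untiled buffers, possibly at some performance cost.
    Option     "Tiling" "false"
EndSection
```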