Comment 36 for bug 988079

Sam Spilsbury (smspillaz) wrote : Re: [nvidia] Dismal compiz performance on HP Z600 with 30" landscape monitor

> If it were actually running at 170 FPS then it wouldn't be exhibiting the jerky animation it does on screen. Perhaps compiz is competing for the GPU's fill rate with other OpenGL applications on the system, but if this were true, then the app shouldn't be able to provide frames to compiz at a rate faster than the screen refresh rate.

On this point, I can assure you that there is no such competition for fill rate: we don't do full-screen redraws every frame, and even if we did, that would impact application performance substantially. As Daniel already pointed out, respecting the swap interval is the responsibility of both the driver and the application. One thing to note here is that compiz uses glXSwapIntervalEXT only on full-screen redraws and falls back to GLX_SGI_video_sync on partial redraws; "Force fullscreen redraws" makes it take the former path all the time. A sketch of the two paths follows below.
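To make the distinction concrete, here is a minimal sketch of the two paths, assuming a GLX context is already current. present_frame and full_screen_redraw are illustrative names, not compiz internals, and the partial-redraw copy itself is elided:

    #include <GL/glx.h>
    #include <GL/glxext.h>

    /* Illustrative sketch only: present one compositor frame, throttled
     * to the vertical retrace. */
    static void
    present_frame (Display *dpy, GLXDrawable drawable, Bool full_screen_redraw)
    {
        if (full_screen_redraw)
        {
            /* Full-screen path: GLX_EXT_swap_control lets the driver block
             * the buffer swap until the next vertical retrace. */
            PFNGLXSWAPINTERVALEXTPROC swapInterval =
                (PFNGLXSWAPINTERVALEXTPROC)
                glXGetProcAddress ((const GLubyte *) "glXSwapIntervalEXT");

            swapInterval (dpy, drawable, 1);
            glXSwapBuffers (dpy, drawable);
        }
        else
        {
            /* Partial-redraw path: GLX_SGI_video_sync only provides a way
             * to wait on the retrace counter; the damaged sub-rectangles
             * are then copied to the front buffer separately. */
            PFNGLXGETVIDEOSYNCSGIPROC getSync =
                (PFNGLXGETVIDEOSYNCSGIPROC)
                glXGetProcAddress ((const GLubyte *) "glXGetVideoSyncSGI");
            PFNGLXWAITVIDEOSYNCSGIPROC waitSync =
                (PFNGLXWAITVIDEOSYNCSGIPROC)
                glXGetProcAddress ((const GLubyte *) "glXWaitVideoSyncSGI");
            unsigned int count;

            getSync (&count);
            waitSync (2, (int) ((count + 1) % 2), &count);
            /* ... copy only the damaged regions here ... */
        }
    }

Note that on the second path the driver never sees a buffer swap for the whole drawable, which is why the swap interval alone can't throttle it.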

Now, what could be causing the visible frame dropping in GLMark2 is higher-than-normal CPU usage when handling the damage events coming from the directly rendered application. Daniel has already looked into that and come up with some ideas for it, but needs some more time to ensure it works reliably; this is bug 1007299. I believe the server-side CPU usage is worth looking into as well. The sketch below illustrates the general shape of the problem.
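To illustrate the general idea (this is my own sketch, not Daniel's actual fix; coalesce_damage is a hypothetical name): a directly rendered client can emit damage events far faster than the refresh rate, so batching everything pending into one region per output frame bounds the per-event CPU cost:

    #include <X11/Xlib.h>
    #include <X11/extensions/Xdamage.h>
    #include <X11/extensions/Xfixes.h>

    /* Drain every pending damage event in one pass and union the damaged
     * areas, so the compositor repaints once per frame instead of waking
     * up once per event. damage_event_base comes from
     * XDamageQueryExtension (). */
    static XserverRegion
    coalesce_damage (Display *dpy, int damage_event_base)
    {
        XserverRegion accumulated = XFixesCreateRegion (dpy, NULL, 0);

        while (XPending (dpy))
        {
            XEvent ev;

            XNextEvent (dpy, &ev);
            if (ev.type == damage_event_base + XDamageNotify)
            {
                XDamageNotifyEvent *de = (XDamageNotifyEvent *) &ev;
                XserverRegion part = XFixesCreateRegion (dpy, &de->area, 1);

                XFixesUnionRegion (dpy, accumulated, accumulated, part);
                XFixesDestroyRegion (dpy, part);
            }
        }

        return accumulated; /* repaint this region once, then destroy it */
    }

The point is just the batching: wake once, accumulate everything pending, repaint once.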

It seems to me that Unity and/or Nux is doing something that causes GLMark2 to take a slow path in the driver, or a software path. As such, I think it's a good idea to engage NVIDIA on this subject. Can you attach an nvidia bug report taken while Unity and GLMark2 are running?