Severe performance regression with xserver 1.15

Bug #1293314 reported by luke
Affects               Status        Importance  Assigned to  Milestone
libsdl1.2 (Ubuntu)    Fix Released  Undecided   Unassigned
xorg-server (Ubuntu)  Fix Released  Undecided   Unassigned

Bug Description

With X server 1.15, OpenGL performance in all games is reduced by almost half, at least with the Radeon/r600g driver; I am not sure which package actually has the issue. As long as hyper-z is explicitly enabled, Mesa 10.2 (from the oibaf PPA) does not change this regression. With hyper-z disabled (the new default), there is another 10-20% reduction in performance on top of the approximately 40% loss from the new X server.

Critter still ran faster than the screen refresh rate, so vblank sync was not being forced on by any effect of the DRI3 transition on an unchanged ~/.drirc. I am still wondering, however, whether the DRI3 transition is the source of this severe performance loss.
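One way to test that suspicion is to force the stack back to DRI2 and compare framerates directly. A minimal sketch, assuming a Mesa build that honours LIBGL_DRI3_DISABLE and a radeon DDX recent enough to offer the "DRI" option (both depend on the exact versions installed):

# Client side: make Mesa fall back to DRI2 for a single run
LIBGL_DRI3_DISABLE=1 critter

# Server side: pin the radeon DDX to DRI2 in the Device section
# of xorg.conf, then restart X:
#     Option "DRI" "2"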

Package tested:

xserver-xorg-core_2%3a1.15.0-1ubuntu6_amd64.deb

This is the version of xserver-xorg-core I tested back to back against the last 1.14 version I have, with the same Mesa and kernel versions. Framerates in both Critter and Scorched3d were cut roughly in half; 0ad is CPU bound and minimally affected. I don't have any other GPU-intensive games, nor the bandwidth to download them, so I cannot benchmark anything else.

CPUs tested: AMD FX-8120 and Phenom II X4
GPUs tested: Radeon HD 6750 and HD 5770
Proprietary drivers have NOT been tested.
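For anyone reproducing these numbers, Mesa's Gallium HUD gives a driver-level fps readout that is independent of each game's own counter. A sketch, assuming a Gallium driver such as r600g (GALLIUM_HUD is a Mesa feature and not part of the configuration below):

# Overlay fps (and CPU load) on any OpenGL application
GALLIUM_HUD=fps critter
GALLIUM_HUD=fps,cpu scorched3d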

Here are the configuration files used with the AMD based desktops:

I use this xorg.conf file:

    Identifier "Layout0"
    Screen 0 "Screen0"
    InputDevice "Keyboard0" "CoreKeyboard"
    InputDevice "Mouse0" "CorePointer"
EndSection
Section "Files"
EndSection

Section "InputDevice"

    # generated from default
    Identifier "Mouse0"
    Driver "mouse"
    Option "Protocol" "auto"
    Option "AccelerationScheme" "none"
    Option "Device" "/dev/psaux"
    Option "Emulate3Buttons" "no"
    Option "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"

    # generated from default
    Identifier "Keyboard0"
    Driver "kbd"
EndSection

Section "Monitor"
    Identifier "Monitor0"
    VendorName "Unknown"
    ModelName "Unknown"
    HorizSync 28.0 - 33.0
    VertRefresh 43.0 - 72.0
EndSection

Section "Device"
    Identifier "Device0"
    Driver "radeon"
    Option "SwapbuffersWait" "off" #not the default
    Option "ColorTiling" "on"
    Option "ColorTiling2d" "on" #not the default
EndSection

Section "Screen"
    Identifier "Screen0"
    Device "Device0"
    Monitor "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1920x1080"
    EndSubSection

EndSection

And this ~/.drirc:

<driconf>
    <device screen="0" driver="dri2">
        <application name="Default">
            <option name="fthrottle_mode" value="2" />
            <option name="pp_celshade" value="0" />
            <option name="pp_jimenezmlaa" value="3" />
            <option name="pp_jimenezmlaa_color" value="0" />
            <option name="vblank_mode" value="0" />
            <option name="pp_nored" value="0" />
            <option name="pp_nogreen" value="0" />
            <option name="allow_large_textures" value="1" />
            <option name="pp_noblue" value="0" />
        </application>
        <application name="cinnamon" executable="cinnamon">
            <option name="vblank_mode" value="1" />
        <application name="mplayer" executable="mplayer">
            <option name="vblank_mode" value="1" />
        <application name="gnome-mplayer" executable="gnome-mplayer">
            <option name="vblank_mode" value="1" />
        <application name="mpv" executable="mpv">
            <option name="vblank_mode" value="1" />
        <application name="totem" executable="totem">
            <option name="vblank_mode" value="1" />
        <application name="criticalmass" executable="critter">
            <option name="vblank_mode" value="0" />
        </application>
        <application name="glxgears" executable="glxgears">
            <option name="vblank_mode" value="0" />
        </application>
        <application name="scorched3d" executable="scorched3d">
            <option name="vblank_mode" value="0" />
        </application>
    </device>
    <device screen="0" driver="r600">
        <application name="Default">
            <option name="fthrottle_mode" value="2" />
            <option name="pp_celshade" value="0" />
            <option name="pp_jimenezmlaa" value="0" />
            <option name="pp_jimenezmlaa_color" value="0" />
            <option name="vblank_mode" value="1" />
            <option name="force_glsl_extensions_warn" value="false" />
            <option name="pp_nored" value="0" />
            <option name="pp_nogreen" value="0" />
            <option name="allow_large_textures" value="1" />
            <option name="pp_noblue" value="0" />
        </application>
    </device>
</driconf>

And this ~/.profile is used to enable hyper-Z and ensure the sb backend is used in Mesa:

export R600_DEBUG=sb
export R600_HYPERZ=1
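Two quick checks to confirm these settings are actually picked up (a sketch; glxinfo ships in mesa-utils, and vblank_mode here is Mesa's per-run environment override rather than anything set in the files above):

# Confirm the Gallium r600 driver is the active renderer
glxinfo | grep "OpenGL renderer"

# Force vsync off for one run without editing ~/.drirc
vblank_mode=0 glxgears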

Revision history for this message
luke (lukefromdc) wrote :

Updates from 3/17 Xorg benchmarking:

On 3-17-2014, I updated X to the latest Trusty packages (xserver-xorg-core_2%3a1.15.0-1ubuntu7_amd64.deb) and retested.

The Critter benchmark may not be of any great importance, as it is a 2D game in OpenGL that runs very fast, but the regression is on the order of a 49% drop in framerate. This is absolutely repeatable (on two different machines) by leaving only one opponent on the screen and intentionally permitting all shields to be destroyed.

The Scorched3d benchmark gave inconsistent results. With the previous X server I was getting 50-70fps; with the new version I sometimes got 25-35fps, but sometimes got right back into the 50-70fps range, though the highest speeds did not appear as often as with the older version of X. Scorched3d can be a bit difficult to benchmark, as which map appears cannot be controlled.

In February I got nearly unplayable results in Scorched3d (11-25fps), though some of that was a since-resolved hardware issue and some was the hyper-Z issue with the first versions of Mesa 10.2 installed at that time. There was no change at all in Critter on Radeon; I don't know whether these results will translate into regressions on OpenGL loads I do not have.

On my Intel Atom netbook, by comparison, Critter is barely playable due to dropped frames. When the new X server came out, it was worse; I remember just over 60fps, but with worse frame dropping than ever. It is now about 110fps on the netbook, with fewer dropped frames.

My conclusion is that some progress might be being made somewhere, but I don't know which changes in which package are helping, if any.

Revision history for this message
luke (lukefromdc) wrote :

Further tests with today's Mesa 10.2 (3-17-2014)

It is getting hard to reliably benchmark Scorched3d due to varying loads, but it seemed to run a little faster than in the earlier test today. Results are still a bit inconsistent, with some screens showing 25-30fps, but more of the screens are now at 50-70fps, and I saw 70fps a bit more often.

Critter was interesting: there is little change in MAXIMUM framerate, but the minimum framerate is now at 75% of the maximum, with few drops below 300fps from 360fps (300 being 83% of 360). It used to reach a maximum of 690fps with big drops under load to about (sometimes below) 400fps: all faster than now, but much less steady, with the minimum at 55% or less of the maximum. Again, this is not much of a benchmark due to the high framerates of a 2D game, but it is still interesting.

It now seems to me that driver work in Mesa for DRI3 may be killing this bug one leg at a time.

Revision history for this message
luke (lukefromdc) wrote :

This bug was finally resolved with updates to SDL, which is used by all the games I saw it in. When I first changed repos to move to Utopic Unicorn, a new SDL package returned both Critter and Scorched3d to full performance. Then another update reverted some change, and performance fell right back to where it had been. Finally a third update fixed it again, hopefully for good.

libsdl1.2debian_1.2.15-8ubuntu2_amd64.deb brought the original fix

libsdl1.2debian_1.2.15-9ubuntu1_amd64.deb rolled it back
libsdl1.2debian_1.2.15-9ubuntu2_amd64.deb fixed it for good, and had this changelog entry:

libsdl1.2 (1.2.15-9ubuntu2) utopic; urgency=medium

  * Restore accidentally-clobbered changes from 1.2.15-8ubuntu2.

  [ Timo Jyrinki ]
  * debian/patches/sdl-check-for-SDL_VIDEO_X11_BACKINGSTORE.patch:
    - Restore old backingstore behavior to prevent tearing
      (LP: #1280665)

I had to revert HUNDREDS of packages one at a time to find this! Any kind of anti-tearing behavior can be resource intensive; I've never had tearing problems in games with r600, but I do need to use vsync with video players.
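For anyone repeating that kind of bisection, downgrading one package at a time to a known version with apt looks roughly like this (a sketch using the SDL versions named above, not a record of the exact commands used here):

# Install a specific older version of the suspect package, then retest
sudo apt-get install libsdl1.2debian=1.2.15-8ubuntu2

# Optionally hold it so a routine upgrade does not pull the regression back in
sudo apt-mark hold libsdl1.2debian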

Revision history for this message
Oibaf (oibaf) wrote :

So, can this bug be closed now?

Revision history for this message
luke (lukefromdc) wrote :

Yes, it can be. As of now, performance has totally and completely recovered. Most of it was the SDL changes that were so helpful with DRI3 X servers, and a few other optimizations seem to have come down the pike. Yesterday I saw framerates in both games as high as anything I have ever seen.

Revision history for this message
Oibaf (oibaf) wrote :

Thanks, closing.

Changed in xorg-server (Ubuntu):
status: New → Fix Released
Changed in libsdl1.2 (Ubuntu):
status: New → Fix Released