Comment 0 for bug 544496

Dawitbro (dawitbro) wrote:

Overview:

I recently built two packages from the xf86-video-ati git tree to
update my system:

    xf86-video-ati-6.12.99+git20100205.4f9d171 [Feb. 5]
    xf86-video-ati-6.12.99+git20100215.47136fa [Feb. 15]

Using xorg-server builds 1.7.4.902 and 1.7.5, I have found a serious
performance regression: in the game 'torcs', which shows a frame rate
indicator in the upper right corner, my average frame rate drops from
~50 fps to ~20 fps. This may not be a rigorous benchmark, but gameplay
clearly becomes sluggish and the drop in frame rate is obvious.

  Steps to reproduce:

1. Install 'torcs', build and install radeon from xf86-video-ati at
commit 4f9d171, and play the game observing the reported frame rate.

2. Build and install radeon from xf86-video-ati at commit 47136fa, and
play the game again, observing the reported frame rate. (A rough sketch of
my build and test commands follows these steps.)
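
For reference, here is roughly how I build, install, and test the driver
from a git checkout of xf86-video-ati; the prefix and exact configure
flags are from my own setup and may need adjusting:

    git checkout 4f9d171          # or 47136fa for the Feb. 15 snapshot
    ./autogen.sh --prefix=/usr
    make
    sudo make install
    # restart X, then run the game and watch the frame rate indicator
    torcs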

  Actual results:

On my particular system, performance drops by one-half to two-thirds going
from the Feb. 5 version to the Feb. 15 version. The behavior is 100%
reproducible.

  Expected results:

Performance should have been the same -- or even better, given the two
optimization commits by Pauli Nieminen (78e7047 and 3ec25e5).
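
For reference, the six commits between the two snapshots can be listed
from the xf86-video-ati tree (assuming both commits are present in the
local clone):

    git log --oneline 4f9d171..47136fa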

  System info:

GPU:
    GIGABYTE GV-R485MC-1GI Radeon HD 4850 (RV770)
    1GB VRAM, 256-bit GDDR3
    PCI Express 2.0 x16

Kernel + architecture: [uname -r -m]
    2.6.33-rc8-git.100213.desktop.kms x86_64

Linux distribution:
    Debian unstable

Machine: self-built
    MSI 790FX-GD70 motherboard
        socket AM3
        AMD 790FX and SB750 Chipset
    OCZ OCZ3P1600EB4GK 2x2GB DDR3 1600
    AMD Phenom II X4 955

Software versions:
    xf86-video-ati: [see above]

    mesa: mesa-7.7+100211-git04d3571 [from git]

    libdrm: libdrm-2.4.17-1-dw+2.6.33-rc8 [from Debian git repo]

    xorg-server: xorg-server-1.7.4.902-0upstream [from tarball]
                 xorg-server-1.7.5-0upstream [from tarball]
                 xserver-xorg-core-2:1.7.5-1 [from Debian unstable]

    torcs: 1.3.1-2 [from Debian unstable]

  Additional Information:

I actually updated the "radeon" driver and the X server to the following versions at the same time:

    xf86-video-ati-6.12.99+git20100215.47136fa
    xorg-server-1.7.5-0upstream

I wasn't sure which package was causing the problem (or whether it was
both). I tried each combination of old/new DDX driver with old/new X
server, and found that the X server version had little or no effect on
performance with the Feb. 5 DDX, but both X servers (1.7.4.902 and 1.7.5)
performed miserably with the Feb. 15 DDX.

I don't actually play games much, except to test the performance of new
hardware and software. Since November 2009, with each update of kernels,
mesa packages, X server packages, and DDX packages, I have tested with a
long (and growing) list of 3D games, looking for crashes, bugs, and/or
performance problems; this is part of my own evaluation of Linux support
for the HD 4850 (RV770), which I purchased last fall. In this case, it
happens to be 'torcs' that best reveals the difference between versions
of the "radeon" driver.

I would like to bisect this with git, since there are only 6 new commits
between these two versions of xf86-video-ati. Unfortunately I have to work
for the next few days. I may get a chance to try a bisect on Friday, though
maybe someone else will acknowledge the problem and provide a fix before then.
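
For anyone who wants to try it sooner, here is a rough outline of the
bisect in the xf86-video-ati tree, rebuilding and re-testing with torcs
at each step:

    git bisect start
    git bisect bad 47136fa        # Feb. 15 snapshot, slow
    git bisect good 4f9d171       # Feb. 5 snapshot, fast
    # rebuild/install the driver, restart X, run torcs, then mark the
    # result with 'git bisect good' or 'git bisect bad' and repeat
    git bisect reset              # when finished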

I wasn't sure what to set for "Severity" and "Priority," so I left them at
the Bugzilla defaults.