xresprobe drops highest available resolution on certain lcd's
Affects: xresprobe (Ubuntu)
Status: Fix Released
Importance: High
Assigned to: Bryce Harrington
Bug Description
Currently there is a problem with xresprobe and the Samsung 243T LCD
screen: it will not report 1920x1200 as a usable resolution. I figured
out why. I have a GeForce 6600GT Dual DVI hooked up to it via DVI, and
it works fine when I add 1920x1200 to the xorg.conf file manually. I
would really like the next version of Ubuntu to just work out of the
box with this monitor.
When ddcprobe looks up the EDID info on this monitor, the monitor
does not set the "digital" flag, so the output shows "input: analog
signal.". Because ddcprobe.sh expects to see "input: digital", it
classifies this monitor as "crt" instead of digital. I believe most
desktop LCD screens may not set this flag, since they can take both
analog and DVI inputs. This would be OK, but on line 56 of
ddcprobe.sh, the highest resolution the screen supports is removed
for crt:
OUTTIMINGS="$(echo "$TIMINGS" | tail -n "$(($NTIMINGS-1))")"
WHY???
This should be changed so that all resolutions that are supported can be used:
OUTTIMINGS="$(echo "$TIMINGS" | tail -n "$(($NTIMINGS))")"
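To illustrate the effect of that one-character difference, here is a minimal standalone sketch (the timing list is invented for the example; variable names follow ddcprobe.sh, but this is not the script's actual surrounding code). With a list sorted highest-first, `tail -n $(($NTIMINGS-1))` silently drops the top entry:

```shell
#!/bin/sh
# Hypothetical timing list, highest resolution first,
# standing in for what ddcprobe collects from the EDID.
TIMINGS="1920x1200@59
1600x1200@60
1280x1024@60"
NTIMINGS=3

# Current crt branch: tail keeps NTIMINGS-1 lines,
# so the first (highest) timing is discarded.
CRT_OUT="$(echo "$TIMINGS" | tail -n "$(($NTIMINGS-1))")"

# Proposed fix: keep every supported timing.
ALL_OUT="$(echo "$TIMINGS" | tail -n "$(($NTIMINGS))")"

echo "crt branch keeps:"
echo "$CRT_OUT"
echo "fixed branch keeps:"
echo "$ALL_OUT"
```

Running this shows 1920x1200@59 missing from the crt branch's output, which matches the xresprobe result below where the res list starts at 1600x1200.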
That is one way to solve the problem. Another way would be to build a
quirk list: if you see "id: 00f7", then set SCREENTYPE="lcd". But I
think the first solution is a lot better.
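For completeness, the quirk-list alternative might be sketched like this (hedged: the `ID` variable and the case structure are assumptions for illustration, not ddcprobe.sh's actual code; only the 00f7 product id comes from the output below):

```shell
#!/bin/sh
# Hypothetical quirk list: EDID product ids of monitors known to be
# digital LCDs even though their EDID says "input: analog signal.".
ID="00f7"          # would be parsed from the ddcprobe "id:" line
SCREENTYPE="crt"   # what the analog-input check would have chosen

case "$ID" in
    00f7)          # Samsung SyncMaster 243T (eisa SAM00f7)
        SCREENTYPE="lcd"
        ;;
esac

echo "$SCREENTYPE"
```

The obvious drawback, and why the first fix is preferable, is that every affected monitor would need its own entry here.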
I also have a forum post on this issue:
http://
Below are my current outputs from ddcprobe and xresprobe:
ddcprobe:
vbe: VESA 3.0 detected.
oem: NVIDIA
vendor: NVIDIA Corporation
product: nv43 Board - p218h2 Chip Rev
memory: 131072kb
mode: 640x400x256
mode: 640x480x256
mode: 800x600x16
mode: 800x600x256
mode: 1024x768x16
mode: 1024x768x256
mode: 1280x1024x16
mode: 1280x1024x256
mode: 320x200x64k
mode: 320x200x16m
mode: 640x480x64k
mode: 640x480x16m
mode: 800x600x64k
mode: 800x600x16m
mode: 1024x768x64k
mode: 1024x768x16m
mode: 1280x1024x64k
mode: 1280x1024x16m
edid: 1 3
id: 00f7
eisa: SAM00f7
serial: 4e423234
manufacture: 17 2005
input: analog signal.
screensize: 52 32
gamma: 2.600000
dpms: RGB, active off, no suspend, no standby
timing: 720x400@70 Hz (VGA 640x400, IBM)
timing: 720x400@88 Hz (XGA2)
timing: 640x480@60 Hz (VGA)
timing: 640x480@67 Hz (Mac II, Apple)
timing: 640x480@72 Hz (VESA)
timing: 640x480@75 Hz (VESA)
timing: 800x600@60 Hz (VESA)
timing: 800x600@72 Hz (VESA)
timing: 800x600@75 Hz (VESA)
timing: 832x624@75 Hz (Mac II)
timing: 1024x768@87 Hz Interlaced (8514A)
timing: 1024x768@70 Hz (VESA)
timing: 1024x768@75 Hz (VESA)
timing: 1280x1024@75 (VESA)
ctiming: 1600x1200@60
ctiming: 1280x1024@60
ctiming: 1152x864@75
dtiming: 1920x1200@59
monitorrange: 30-80, 55-75
monitorname: SyncMaster
monitorserial: H4KY400649
xresprobe:
root@desktop:
id: SyncMaster
res: 1600x1200 1280x1024 1152x864 1024x768 832x624 800x600 720x400 640x480
freq: 30-80 55-75
disptype: crt
Changed in xresprobe:
assignee: daniels → nobody
Changed in xresprobe:
status: Unconfirmed → Confirmed
I believe my theory that desktop LCD monitors do not set the digital
bit in the EDID is correct. The HP L2335 also seems to have this
problem. A post in the Newegg reviews shows the following:
"Once or twice when Linux boots up the monitor has auto-synced to 1600 x 1200
instead of 1920 x 1200 (which is what is being driven).". Link is here:
http://www.newegg.com/Product/Product.asp?Item=N82E16824176018