Comment 10 for bug 589485

Nick Bowler (nbowler) wrote:

(In reply to comment #6)
> This is intentional and follows the practice seen in many other desktop
> environments where the logical DPI of the screen is used as an application and
> font scaling factor.

This is definitely a good idea, and it's what many X applications already did. For example, I can tell Firefox to lay out text in a column that is 16cm wide, or I can tell urxvt to use a 9pt font (9pt is 1/8 of an inch). However, these units are only meaningful if the DPI is correct. If X is configured to assume 96dpi, then an application which wants to display something 16cm wide will translate that to 605 pixels. But when it is then output to the attached 129dpi display device, the result is only 12cm wide.
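
To spell the arithmetic out, here is a quick sanity check of the 16cm example; it assumes only a POSIX awk:

awk 'BEGIN {
  px = 16 / 2.54 * 96                    # pixels an app draws for "16cm" at an assumed 96dpi
  printf "%.0f px = %.1f cm at 129dpi\n", px, px / 129 * 2.54
}'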

> If you don't like it, you're welcome to configure the X server to taste; we
> haven't removed any of those knobs. And, you can even use xrandr to change
> things on the fly.

I don't mind having to configure the X server to get the desired behaviour. However, after perusing the man pages, I can't find the knob which restores the earlier behaviour of computing the DPI value automatically based on the geometry of the attached monitor. The best solution I've come up with so far (save for patching the server) is adding the following to my startup scripts:

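# set the screen's physical size to the connected output's real dimensions, as reported by xrandr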
xrandr --fbmm `xrandr | sed -n '/ connected / {s/.* \([0-9]\+\)mm x \([0-9]\+\)mm/\1x\2/p;q}'`
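
Whether that override actually took effect is easy to check afterwards (assuming xdpyinfo is installed); the reported dimensions and resolution should then reflect the monitor's real size rather than the made-up 96x96 value:

xdpyinfo | grep -B1 resolution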

So yes, the actual size information is available, and yes, the resolution can be tuned on the fly, but why is pulling a meaningless DPI value out of thin air the default?