Comment 55 for bug 589485

Sergio Callegari (callegar) wrote :

This looks like one good reason to go with NVIDIA whenever you have a decent screen :-)

Could someone be so kind as to explain on what basis it is being stated that the current behavior leaves the same flexibility as before?

Former behavior:
- do nothing and you get the dpi from the screen; write 96 dpi in a config file and you get 96 dpi (alternatively, any distro could easily do the latter for you by default, and you could remove it if you wanted to).
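For reference, the "write 96 dpi in a config file" part was trivial; one common trick (a sketch, the section Identifier is generic) was to set DisplaySize so that the server's computed DPI comes out at 96 for your resolution:

```
Section "Monitor"
    Identifier "Monitor0"
    # Physical size in mm chosen so the server computes ~96 dpi
    # for a 1920x1080 mode: 1920 * 25.4 / 96 = 508, 1080 * 25.4 / 96 = 285.75
    DisplaySize 508 286
EndSection
```

Deleting that section got you back to the screen's real dpi, so both behaviors were one edit away.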

Current behavior:
- do nothing and you get 96 dpi. And then? What should I write in the config file to get the screen's real dpi? I guess I need a script in my xsession that reads the real dpi from somewhere and tells xrandr to apply it; otherwise every single machine of mine needs a different config. And if I have a laptop that I sometimes use with an external screen at home and another external screen at work, I can never solve the issue with a single hardwired dpi.
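For what it's worth, the per-machine hook this seems to require looks something like the following (a sketch; eDP-1 and 142 are placeholders for one particular screen, and whether --dpi accepts an output name depends on your xrandr version):

```shell
# Hypothetical ~/.xsessionrc fragment -- under the old behavior none of
# this was needed.
# Hard-coding a value ties the session config to one specific screen:
xrandr --dpi 142
# Newer xrandr can derive the value from a named output's EDID instead,
# but the output name (eDP-1 here) still differs per machine/dock:
# xrandr --dpi eDP-1
```

Either way the fragment is machine-specific, which is exactly the flexibility that was lost.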

In other words, with the previous behavior anyone and any distro could get the current behavior at almost no cost, with a solution that could easily be reverted by anybody.

With the current behavior, getting the former behavior requires complex scripting.

I fail to see this as an improvement at all.

Not to mention that:

1) xdpyinfo now reports false data by default (which should at least be indicated in the manual).

2) To know the real dpi you need to parse the xrandr output, which is likely to be locale dependent, and to do a floating point calculation (really what you want to be doing in sh scripts).
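To make the point concrete, here is a minimal sketch of that parsing (assumptions: the xrandr "connected" line format shown below, and the hypothetical helper name dpi_from_line):

```shell
#!/bin/sh
# Sketch: extract the pixel width and physical width (mm) from an xrandr
# "connected" line and compute horizontal DPI = pixels * 25.4 / mm.
# The line format below is an assumption based on common xrandr output.
dpi_from_line() {
    px=$(printf '%s\n' "$1" | sed -n 's/.*connected[^0-9]*\([0-9][0-9]*\)x.*/\1/p')
    mm=$(printf '%s\n' "$1" | sed -n 's/.* \([0-9][0-9]*\)mm x.*/\1/p')
    # Force the C locale so awk's floating-point arithmetic and output
    # are not locale-dependent (decimal comma vs. decimal point).
    LC_ALL=C awk -v px="$px" -v mm="$mm" 'BEGIN { printf "%.0f\n", px * 25.4 / mm }'
}

# Example with a typical 15.6" 1920x1080 panel line:
dpi_from_line "eDP-1 connected primary 1920x1080+0+0 (normal) 344mm x 194mm"
# prints 142 -- which you would then feed back via: xrandr --dpi 142
```

All of this just to recover a number the server already knows from EDID.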

... and all this to merely solve a problem in web pages that is already solved by most browsers.