Ignores physical display size and calculates based on 96DPI

Bug #589485 reported by Chris Halse Rogers on 2010-06-04
This bug affects 55 people
Affects: X.Org X server (Status: Confirmed, Importance: Wishlist)
Affects: xorg-server (Ubuntu) (Importance: Low, Assigned to: Unassigned)

Bug Description

The X server, starting with 1.7, ignores the physical size reported by the EDID or in xorg.conf and calculates it based on screen resolution and a DPI of 96.

This is rather annoying for users of high DPI screens.

GNOME and KDE used to set 96 DPI by default in their settings (and may still). We should check whether they still do, and if so let them handle this; I don't think X should be handling this.

Created an attachment (id=29210)
X log from 1.6.3 xserver (working setup)

Created an attachment (id=29211)
X log from 1.7.0 rc0 (setup not working properly)

Created an attachment (id=29212)
xorg.conf used

fff00df94d7ebd18a8e24537ec96073717375a3f

can you try reverting that from 1.7 and seeing if it helps?

Reverting that patch (+ a small build fix) makes my setup work again:

$ xdpyinfo|grep dimens
  dimensions: 1440x900 pixels (303x190 millimeters)

Now, am I supposed to configure this manually via xorg.conf, as the commit log says?

The 'screen size' as reported in the core protocol now respects the DPI value given by the user or config file and ignores the actual monitor DPI of any connected monitor.

The real monitor size is reported through the RandR extension if you really need to know the physical size of the display.

This is intentional and follows the practice seen in many other desktop environments where the logical DPI of the screen is used as an application and font scaling factor.

If you don't like it, you're welcome to configure the X server to taste; we haven't removed any of those knobs. And, you can even use xrandr to change things on the fly.

"now respects the DPI value given by the user or config file" - ok but I don't have DPI set in config and don't have DPI set in command line.

I would expect some sane value matching my LVDS to be used by default in such a case.

Also, xrandr --dpi 120/LVDS _almost_ works: it's off by 1mm. I'll try to set the size via the config then.

> The 'screen size' as reported in the core protocol now respects the DPI value
> given by the user or config file and ignores the actual monitor DPI of any
> connected monitor.

Hmmh, but what's the point? Over the past few years, my xorg.conf has completely disappeared, either because the config is done somewhere else, or because all the settings are correctly autodetected by X. Why do I have to revert that and specify the display size (which the user usually doesn't really know anyway, while the server does)?

> The real monitor size is reported through the RandR extension if you really
> need to know the physical size of the display.

Yeah, so the user needs to use X to get his display size, then edit xorg.conf to tell X the display size. I find that a bit painful and inconsistent.

> This is intentional and follows the practice seen in many other desktop
> environments where the logical DPI of the screen is used as an application and
> font scaling factor.

Hmhm, I don't get it. I'm not really comfortable with all that, but if my screen is a high dpi one and my OS supports it, why would I need to do as other crappy OSes do and stick with a low resolution, 96 dpi screen?

> If you don't like it, you're welcome to configure the X server to taste; we
> haven't removed any of those knobs. And, you can even use xrandr to change
> things on the fly.

That's good, but I really don't know why it's necessary. Wasn't autodetection working correctly for the majority of users? Couldn't other people use the DisplaySize stuff?

Cheers,

*** Bug 25897 has been marked as a duplicate of this bug. ***

(In reply to comment #6)
> This is intentional and follows the practice seen in many other desktop
> environments where the logical DPI of the screen is used as an application and
> font scaling factor.

This is definitely a good idea, and it's what many X applications already did. For example, I can tell Firefox to lay out text in a column that is 16cm wide, or I can tell urxvt to use a 9pt font (9pt is 1/8 of an inch). However, these units are only meaningful if the DPI is correct. If X is configured to assume 96dpi, then an application which wants to display something 16cm wide will translate that to 605 pixels. But when it is then output to the attached 129dpi display device, the result is only 12cm wide.
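For illustration, the arithmetic above can be reproduced with bc, using the 16 cm, 96 dpi and 129 dpi figures from this paragraph (bc is only a calculator here, not anything the applications actually run):

$ echo '16 / 2.54 * 96' | bc -l    # ~605 pixels when the server advertises 96 dpi
$ echo '605 / 129 * 2.54' | bc -l  # ~11.9 cm once drawn on the actual 129 dpi panel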

> If you don't like it, you're welcome to configure the X server to taste; we
> haven't removed any of those knobs. And, you can even use xrandr to change
> things on the fly.

I don't mind having to configure the X server to get the desired behaviour. However, after perusing the man pages, I can't find the knob which restores the earlier behaviour of computing the DPI value automatically based on the geometry of the attached monitor. The best solution I've come up with so far (save for patching the server) is adding

xrandr --fbmm `xrandr | sed -n '/ connected / {s/.* \([0-9]\+\)mm x \([0-9]\+\)mm/\1x\2/p;q}'`

to my startup scripts. So yes, the actual size information is available, and yes, the resolution can be tuned on the fly, but why is pulling a meaningless DPI value out of thin air the default?
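(As an illustrative aside: whether the --fbmm override took effect can be checked with

$ xdpyinfo | grep -E 'dimensions|resolution'

which should then report millimetre dimensions matching the attached monitor rather than values back-computed from 96 dpi.)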

(In reply to comment #8)
> > The 'screen size' as reported in the core protocol now respects the DPI value
> > given by the user or config file and ignores the actual monitor DPI of any
> > connected monitor.
>
> Hmmh, but what's the point? Over the past few years, my xorg.conf has
> completely disappeared, either because the config is done somewhere else, or
> because all the settings are correctly autodetected by X. Why do I have to
> revert that and specify the display size (which the user usually doesn't
> really know anyway, while the server does)?
>
> > The real monitor size is reported through the RandR extension if you really
> > need to know the physical size of the display.
>
> Yeah, so the user needs to use X to get his display size, then edit xorg.conf
> to tell X the display size. I find that a bit painful and inconsistent.
>
> > This is intentional and follows the practice seen in many other desktop
> > environments where the logical DPI of the screen is used as an application and
> > font scaling factor.
>
> Hmhm, I don't get it. I'm not really comfortable with all that, but if my
> screen is a high dpi one and my OS supports it, why would I need to do as
> other crappy OSes do and stick with a low resolution, 96 dpi screen?
>
> > If you don't like it, you're welcome to configure the X server to taste; we
> > haven't removed any of those knobs. And, you can even use xrandr to change
> > things on the fly.
>
> That's good, but I really don't know why it's necessary. Wasn't autodetection
> working correctly for the majority of users? Couldn't other people use the
> DisplaySize stuff?
>
> Cheers,

+1

My current screen is a 13.3" 1440x900 LCD, resulting in about 127 DPI. I was like "WTF has happened???" when I upgraded to 1.7 and saw everything at 96 DPI.

Why ignore valid information we already have? When someone wants to override it, they perfectly well can. I don't see why everyone who wants the correct behavior has to configure it by telling xorg "you have the information, use it"; it should be "the information you have is wrong, please use X instead", as the old behavior was...

Oh, and btw, which "other desktop environments"? DPI is dots per inch; there can't be any other interpretation than "how many pixels/dots are used for one inch on screen".

Thanks
Evgeni

*** Bug 26194 has been marked as a duplicate of this bug. ***

(In reply to comment #6)
> The 'screen size' as reported in the core protocol now respects the DPI value
> given by the user or config file and ignores the actual monitor DPI of any
> connected monitor.

I've read the rest of the comments here and I'm still not sure I understand. What DPI value given by the user or config file? I haven't specified DPI *anywhere* (my xorg.conf file is empty except for 4 lines to set the video driver to 'nouveau' or 'nvidia'). And yet, xdpyinfo reports 96dpi, when the real resolution is 112dpi.

This sounds incredibly inconsistent and not very useful. You're saying I'm forced to either override Xfce's detected DPI (which will only "unbreak" GTK apps, and nothing else), or add entries to my xorg.conf file to set the display size? That makes no logical sense to me. *Please* tell me I'm misunderstanding something here.

(I suspect I may be: when I use nouveau, I get an incorrect DPI from xdpyinfo, even if the logging in Xorg.0.log prints out the correct display size and DPI. When I use nvidia, my DPI is correct everywhere.)

(In reply to comment #13)
> What DPI value given by the user or config file? I haven't specified DPI
> *anywhere*
Me too.

I do not see any voting possibility here so can only add:

+1/me too.

Same here: upgrading from 1.6 to 1.7, all my gtk apps started rendering
much smaller. It hurts the eyes...

Created an attachment (id=33081)
Add "DontLie" server flag, to encourage honesty.

In case anyone's interested, here's the patch I've been applying which adds a so-called "knob" to restore the original behaviour. Maybe the flag can be extended to avoid other lies in the future.

xfree86: Add DontLie server flag.

Since commit fff00df94d7ebd18a8e24537ec96073717375a3f, RandR 1.2 drivers
lie about the resolution of the attached screen by default. When the
reported resolution is wrong, fonts and other UI elements that use
physical units are not sized correctly.

This patch adds a new server flag, DontLie, which encourages the server
to be honest by default.

Signed-off-by: Nick Bowler <email address hidden>

I'm sorry, but this really looks like an uncalled for change to me.

>This is intentional and follows the practice seen in many other desktop
>environments where the logical DPI of the screen is used as an application and
>font scaling factor.

http://en.wikipedia.org/wiki/Font_size
A point is a unit of measure, 0.353 mm.
If I set a character to be 10 points high, it has to be 3.5mm high on a screen.
I understand this might not be the most natural behaviour on projectors, but the vast majority of people use screens; there's no need to cause them trouble.
(I don't even know if projectors report DPI in their EDID anyway.)
Unless explicitly overridden, X should respect the EDID.
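For reference, the 0.353 mm figure follows from the definition 1 pt = 1/72 inch, so the numbers above can be checked with bc:

$ echo '25.4 / 72' | bc -l       # ~0.3528 mm per point
$ echo '10 * 25.4 / 72' | bc -l  # ~3.53 mm for the 10 pt example above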

Fiddling gratuitously with the DPI makes default configurations almost unusable on high resolution screens (fonts are rendered too small).
It screws up the 1:1 display of documents and images; and I suppose it messes with input devices like tablets and touchscreens.

Even the default of 96dpi doesn't make sense; this resolution is getting less and less common every day.
To further the annoyance, I haven't found a way to override the 96dpi default in xorg.conf:
DisplaySize gets ignored, and so does Option "DPI".
Right now I'm stuck with xrandr --dpi in .xsessionrc - not what I'd call user friendly.
Please reconsider this change in behaviour.
What bug was it supposed to fix?

Regards,

Luca

(In reply to comment #18)
> http://en.wikipedia.org/wiki/Font_size
> A point is a unit of measure, 0.353 mm.
> If I set a character to be 10 points high, it has to be 3.5mm high on
> a screen.

Just to clarify, setting the font size to 10pt defines the size of an
"em"; the exact meaning of which depends on the selected font (it might
not exactly correspond to the height of a character on screen).

> I understand this could be not the most natural behaviour on
> projectors, but the vast majority of people use screens, there's no
> need to cause them troubles.

The main problem here is that our method of specifying font sizes is not
well suited for devices such as projectors or TVs because it does not
take the viewing distance into account. However, lying about the DPI
doesn't actually improve the situation.

> Fiddling gratuitously with the DPI makes default configurations almost
> unusable on high resolution screens (fonts are rendered too small).

And when it doesn't make fonts unreadable, it makes them ugly.

> Even the default of 96dpi doesn't make sense, this resolution is
> getting less and less common every day.

I have owned exactly one display device in my lifetime with this
resolution: a 17" LCD monitor with 1280x1024 pixels. Most of my CRTs
have higher resolution, and most of my other "external" LCDs have lower.
My laptops have significantly higher resolution than all my other
devices. So from my personal experience, 96 is almost always the wrong
choice. The number seems to have come out of nowhere and makes little
sense as a default.

> Please reconsider this change in behaviour.
> What bug was it supposed to fix?

The commit message says

  Reporting the EDID values in the core means applications get
  inconsistent font sizes in the default configuration.

This makes no sense, since font sizes are consistent only when the DPI
correctly reflects reality! This change *causes* font sizes to be
inconsistent.

(In reply to comment #19)
> 96 is almost always the wrong
> choice. The number seems to have come out of nowhere and makes little
> sense as a default.

It was chosen in order to make display of web pages using Xorg more consistent with the way they get displayed on Windows, which by default assumes 96. http://blogs.msdn.com/fontblog/archive/2005/11/08/490490.aspx explains the origin of 96.

(In reply to comment #19)
> (In reply to comment #18)
> Reporting the EDID values in the core means applications get
> inconsistent font sizes in the default configuration.

> This makes no sense, since font sizes are consistent only when the DPI
> correctly reflects reality! This change *causes* font sizes to be
> inconsistent.

Actually, "correct" DPI only theoretically causes consistency. As a practical matter, the differing number if device pixels required to generate a glyph of some particular physical size typically results in an apparent difference when compared to the same physical size at a different DPI. This is because with the most commonly used fonts each unique pixel size is a physically unique design, not a simple magnification or demagnification of a single design. This is most commonly noticeable in sizes in the vicinity of 16px to 20px. At some point in this range, stem weight changes from 1.00px to 2.00px. Without any applied font smoothing, this is always quite clear. When various smoothing effects are applied, the difference is generally less obvious, but does produce an apparent inconsistency.

(In reply to comment #21)
> Actually, "correct" DPI only theoretically causes consistency. As a
> practical matter, the differing number of device pixels required to
> generate a glyph of some particular physical size typically results in
> an apparent difference when compared to the same physical size at a
> different DPI.

With correct DPI, 9pt fonts are readable on all my display devices. I
can take a ruler and measure the glyph size, and it is effectively the
same. With an incorrect 96 DPI, 9pt fonts are too small on my laptop to
be legible without a magnifying glass. On a display with non-square
pixels (common on CRTs), the problems are even more pronounced.

While there can obviously be rasterisation differences (the higher
resolution display will look better), this is not merely a theoretical
issue.

(In reply to comment #20)
> It was chosen in order to make display of web pages using Xorg more consistent
> with the way they get displayed on Windows, which by default assumes 96.
> http://blogs.msdn.com/fontblog/archive/2005/11/08/490490.aspx explains the
> origin of 96.

Firefox has the ability to render based on a fixed DPI (which indeed
solves the problem with broken websites not rendering correctly). I'm
not sure why we need the feature in X.org as well. Of course, enabling
that feature makes it impossible to read any text on web pages, but I
suppose that's only a _minor_ inconvenience...

Is the goal of X.org to be bug for bug compatible with Microsoft
Windows?

(In reply to comment #22)

> While there can obviously be rasterisation differences (the higher
> resolution display will look better), this is not merely a theoretical
> issue.

I guess I could have done a better job of making my point. Basically what I meant was that, because of rasterization differences, there is less practical consistency than ideal, and thus practice in fact falls short of theory. Still, the results aren't all that different from ideal, and using the actual DPI in practice is much better than an inane assumption of 96 without regard to the actual value.

> Firefox has the ability to render based on a fixed DPI (which indeed
> solves the problem with broken websites not rendering correctly).

By default, FF uses the greater of the actual DPI as reported to it by the environment, or 96. As a practical matter this infrequently makes much difference on web sites, as sites styled in DPI-dependent physical measurements such as mm, in or pt aren't particularly common, and FF's defaults are set in px, which is not affected by DPI.

> I'm not sure why we need the feature in X.org as well.

IMO Xorg has no business making that inane assumption. Nevertheless, there is some rationale to do it that results in some distros doing it, e.g. http://wiki.mandriva.com/en/2009.0_Notes#Font_size_and_physical_DPI


(In reply to comment #23)
> IMO Xorg has no business making that inane assumption. Nevertheless, there is
> some rationale to do it that results in some distros doing it, e.g.
> http://wiki.mandriva.com/en/2009.0_Notes#Font_size_and_physical_DPI

While I cannot complain about the decisions of a distribution that I
don't use (maybe Mandriva users appreciate this), I must address some of
the points on that wiki page:

| ... bizarre results when the DPI detection system fails

Indeed, such as when it decides for some reason to set the DPI to 96
instead of 130+ like it should.

| no desktop environment's interface is yet fully resolution independent

Agreed: Things like the cover art images in my music player are rendered
at a fixed pixel size and are therefore somewhat small. Of course,
lying about the DPI doesn't actually fix this, but it does make it
impossible to correct the problem at the source (e.g. make my music
player display a 3x3 centimetre image instead of a 120x120 pixel image).

| characters could be much larger than the interface elements they are
| supposed to match

I have never, ever seen this. It certainly isn't a problem in any of
the GTK+, Qt, Open Motif or other applications that I use daily
(including applications that don't use any toolkit at all).

| Similar problems often occur on websites

I concede this point. Of course, as hinted above, this could be solved
by setting layout.css.dpi to 96 and disabling any minimum font size in
the default firefox configuration (I don't know if other browsers have
similar parameters). On a high resolution display, the price of this
configuration is that you only get to gawk at pretty layouts instead of
actually reading any information.

Personally, I find the "back" feature of my web browser to be a suitable
means of navigating websites which do not work on my computer.

| ... as many users are accustomed to in Microsoft Windows and Apple OS X

A serious question: How are high resolution displays usable at all on
these operating systems, considering how bad things look on a 135 DPI
display when Xorg assumes it's 96 DPI? What do they do differently?

| ... can still adjust the DPI value in the KDE or GNOME Control Center,
| or simply increase the default font sizes.

There are two obvious problems with this solution:

1) With hundreds of applications, with different mechanisms for setting
their font sizes, one literally needs to edit dozens of config files to
increase the default font sizes for all programs. In an ideal world,
one would say "I like 9pt fonts for most text" and every program would
use that, but this is sadly not the case today.

2) Even if you fix all the default font sizes, or adjust the DPI value
in the KDE or GNOME Control Center, or with xrandr --fbmm for those who
don't use KDE or GNOME, such that everything's perfect: you have to do
it all over again when you change display devices. It also makes it
impossible to share config files between computers (e.g. an NFS-mounted
home directory).

I don't actually care if the *default* behaviour for Xorg is to use 96
DPI unconditionally. My gripe is that there is no (documented) way to
restore the autodetection (without patching the server): a config option
such as the one introduced by my patch solves this issue 100% for me.


(In reply to comment #24)
> I don't actually care if the *default* behaviour for Xorg is to use 96
> DPI unconditionally. My gripe is that there is no (documented) way to
> restore the autodetection (without patching the server): a config option
> such as the one introduced by my patch solves this issue 100% for me.

May I second that. I am afraid this discussion has taken the wrong track (while being quite interesting and exceptionally useful for me personally). It is not about whether 96 DPI is better than any other value. It is about leaving no way to configure the previous behaviour. That is the real bug.

(In reply to comment #20)
> It was chosen in order to make display of web pages using Xorg more consistent
> with the way they get displayed on Windows, which by default assumes 96.

Is it an official position of X.org developers? Is it documented anywhere?

On Tue, Feb 16, 2010 at 10:58:07PM -0800, <email address hidden> wrote:
> --- Comment #26 from Andrey Rahmatullin <email address hidden> 2010-02-16 22:58:04 PST ---
> (In reply to comment #20)
> > It was chosen in order to make display of web pages using Xorg more consistent
> > with the way they get displayed on Windows, which by default assumes 96.
>
> Is it an official position of X.org developers? Is it documented anywhere?

Sure, consider it an official position. I don't think it's
unreasonable. Especially if you assume that lower-DPI displays are
likely to be higher-resolution and thus physically huge, meaning that
people sit further away from them, and that displays with a meaningfully
higher DPI are almost always found in phones (as well as some laptops)
these days, meaning that people hold them a great deal closer to their
face.

I do agree that being able to configure the reported DPI (or just Option
"DontForce96DPI") would be entirely useful, but I can't see us changing
anything in the near future, particularly if it breaks web page display.
Saying 'well, don't go to that website then' isn't helpful to anyone at
all, and makes us look like we value strict technical correctness ('but
don't you know what the true definition of a point is?!?') over an
actual working system. While we do value strict technical correctness,
we don't value it to the point of crippling everything else.

(In reply to comment #27)

> I do agree that being able to configure the reported DPI (or just Option
> "DontForce96DPI") would be entirely useful,

So why not just add it (Option "DontForce96DPI") and finish this long thread for good?

(In reply to comment #27)
> Sure, consider it an official position. I don't think it's
> unreasonable. Especially if you assume that lower-DPI displays are
> likely to be higher-resolution and thus physically huge, meaning that
> people sit further away from them, and that displays with a meaningfully
> higher DPI are almost always found in phones (as well as some laptops)
> these days, meaning that people hold them a great deal closer to their
> face.

As was mentioned earlier in this thread, web browsers such as Firefox do
the Right Thing(tm) by default on lower resolution displays, and don't
need any help from Xorg. Firefox can be configured to do the same thing
for high resolution displays, still without any help from Xorg. I agree
that extremely low resolution displays are very likely to be TVs or
projectors.

> I do agree that being able to configure the reported DPI (or just Option
> "DontForce96DPI") would be entirely useful, but I can't see us changing
> anything in the near future, particularly if it breaks web page display.
> Saying 'well, don't go to that website then' isn't helpful to anyone at
> all, and makes us look like we value strict technical correctness ('but
> don't you know what the true definition of a point is?!?') over an
> actual working system. While we do value strict technical correctness,
> we don't value it to the point of crippling everything else.

When the DPI is falsely set to 96 on a high resolution laptop display,
the result is *NOT* an "actual working system".

> --- Comment #28 from Andrey Borzenkov <email address hidden> 2010-02-17 04:31:18 PST ---
> So why not just add it (Option "DontForce96DPI") and finish this long thread
> for good?
>
we're waiting for your patch.

(In reply to comment #30)

> we're waiting for your patch.
>

Patch was already posted in comment #17.

(In reply to comment #29)
> (In reply to comment #27)
> While we do value strict technical correctness,
> > we don't value it to the point of crippling everything else.

> When the DPI is falsely set to 96 on a high resolution laptop display,
> the result is *NOT* an "actual working system".

Arguably it is "working", but "working" isn't _usable_ for many of us with less than perfect vision. cf. bug 26608

>Sure, consider it an official position. I don't think it's
>unreasonable. Especially if you assume that lower-DPI displays are
>likely to be higher-resolution and thus physically huge, meaning that
>people sit further away from them, and that displays with a meaningfully
>higher DPI are almost always found in phones (as well as some laptops)
>these days, meaning that people hold them a great deal closer to their
>face.

The problem with fixed dpi is not really with very large screens - you're right, people use them from farther away; nor with phones - I sincerely doubt they use the default X configuration, so they don't really care about this.
It is laptops, which I would say the majority of new Linux users have - how many students have you seen at a university who mainly use a fixed PC?
With a laptop, you can't really choose the distance at which you sit from the screen; you're bound by the keyboard.
Still, laptops and netbooks have greatly varying screen resolutions, from 96dpi to 150 and more.
One size fits all is not going to work here.

>Saying 'well, don't go to that website then' isn't helpful to anyone at
>all, and makes us look like we value strict technical correctness ('but
>don't you know what the true definition of a point is?!?') over an
>actual working system. While we do value strict technical correctness,
>we don't value it to the point of crippling everything else.

When I recalled the definition of a point, I did so for a reason.
Programs count on it to work correctly, to not show fonts too small to be easily read, and to make "100% zoom" even make sense.

At resolution over 130dpi (not rare at all today on portable systems) default fonts get hard to read if you stick to 96 "logical" dpi.

Now, this is not what I'd call an actual working system.

Moreover, if the problem resides in websites, then it must be addressed in browsers. I read that Firefox already does this; I don't know if it really does (on Debian it uses the system dpi), but then you should file a bug if you think it shouldn't. Breaking the browser's more logical behaviour to fix broken web design is a solution; breaking the whole of X is not...

The only real argument against using the real screen size is for huge screens, like TVs (for projectors the problem doesn't arise: they don't have an intrinsic screen size).
I don't even know how many TVs report their real physical size in their EDID; I tried only one and it didn't. (As for large computer screens, the most widespread size I see around these days is about 24" 16:9, 1920x1080 pixels. Which makes for a lucky ~96dpi, so they don't get hurt whatever the choice.)
Anyway, if you care about the many owners of huge LCDs, you could set a maximum size over which to switch to 96dpi.
I ask you again to reconsider your decision.

I have a laptop with a 14" LCD at 1400x1050 (286mm x 214mm), so the screen is very sharp and my correct DPI is about 124x124. Now, with forced 96x96 DPI I have to use very small fonts (DejaVu Sans 7pt) to get reasonable output on this screen.

I think there should be freedom in choosing one's own DPI.

Thanks.

(II) intel(0): Output LVDS1 using initial mode 1400x1050
(II) intel(0): Using default gamma of (1.0, 1.0, 1.0) unless otherwise stated.
(**) intel(0): Display dimensions: (286, 214) mm
(**) intel(0): DPI set to (363, 486)
...
(WW) intel(0): Option "DPI" is not used

This is really fun on an XO-1, with a native resolution of 200 (resp. 267) DPI...

(In reply to comment #34)
> I have a laptop with a 14" LCD at 1400x1050 (286mm x 214mm), so the
> screen is very sharp and my correct DPI is about 124x124. Now, with
> forced 96x96 DPI I have to use very small fonts (DejaVu Sans 7pt) to
> get reasonable output on this screen.

If the X server resolution is set smaller than the actual, then fonts
will be smaller than expected, not larger.

> (**) intel(0): DPI set to (363, 486)

This looks to me like a different problem: your log shows the resolution
being set to a phenomenally huge (and not even square) value.

(In reply to comment #36)

> > (**) intel(0): DPI set to (363, 486)
>
> This looks to me like a different problem: your log shows the resolution
> being set to a phenomenally huge (and not even square) value.
>
I think that line is nonsense. But how can I force the correct resolution without the DPI option in xorg.conf?

(In reply to comment #37)
> But how can I force the correct resolution without the DPI option in
> xorg.conf?

For non-xrandr-1.2 drivers, one can use the DisplaySize [width] [height] option
in xorg.conf. For xrandr-1.2 drivers, one can use xrandr --fbmm
[width]x[height] after starting the server, but before starting any other
clients. In either case, [width] and [height] are given in millimetres.
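To make that concrete, a minimal sketch (the identifier is arbitrary, and the 286x214 mm figures are borrowed from comment #34):

Section "Monitor"
    Identifier "Monitor0"
    DisplaySize 286 214    # physical width and height, in millimetres
EndSection

or, for a RandR 1.2 driver, from an early startup script:

xrandr --fbmm 286x214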

(In reply to comment #6)
> The 'screen size' as reported in the core protocol now respects the DPI value
> given by the user or config file and ignores the actual monitor DPI of any
> connected monitor.
>
> The real monitor size is reported through the RandR extension if you really
> need to know the physical size of the display.
>
> This is intentional and follows the practice seen in many other desktop
> environments where the logical DPI of the screen is used as an application and
> font scaling factor.

Yes, and the default font size should be set to match the screen resolution, as is the practice in many other desktop environments.

Note that even on Windows, which completely ignores the actual screen DPI, system integrators do preset the font scaling factor to approximately match the actual DPI.

>
> If you don't like it, you're welcome to configure the X server to taste; we
> haven't removed any of those knobs. And, you can even use xrandr to change

You just set them intentionally to the wrong value. Thank you very much.

*** Bug 27660 has been marked as a duplicate of this bug. ***

My workaround right now is to set

xrandr --dpi 130

in a file I put in my /etc/X11/xinit/xinitrc.d.

I'd prefer not to do that.

Thanks.


Changed in xorg-server (Ubuntu):
status: New → Incomplete
importance: Undecided → Low

(In reply to comment #38)
> (In reply to comment #37)
> > But how can I force the correct resolution without the DPI option in
> > xorg.conf?
>
> For non-xrandr-1.2 drivers, one can use the DisplaySize [width] [height] option
> in xorg.conf. For xrandr-1.2 drivers, one can use xrandr --fbmm
> [width]x[height] after starting the server, but before starting any other
> clients. In either case, [width] and [height] are given in millimetres.

This does not work for me. DPI is stuck at 96x96 (I would like to use 135). I've tried to set it
1) via DisplaySize in xorg.conf
2) via xrandr --dpi 135
3) via xrandr --fbmm.
xdpyinfo always reports a 96x96 DPI value.
My video driver is intel, X.Org X Server 1.7.6.

Anything else I could try?

(In reply to comment #42)
> ...for me. DPI is stuck at 96x96 (I would like to use 135).
> I've tried to set it
> 1) via DisplaySize in xorg.conf
> 2) via xrandr --dpi 135
> 3) via xrandr --fbmm.
> xdpyinfo always reports a 96x96 DPI value.
> My video driver is intel, X.Org X Server 1.7.6.

> Anything else I could try?

It's possible your distro sets Xft.dpi at 96, which typically will override Xorg's setting in at least some apps. If 'xrdb -query | grep dpi' produces 96, you need to find out where that's getting set and disable it, or change it to 135.

It's also possible your DTE (Gnome?, KDE?, other?) is forcing 96. You'll have to look into its settings to see.

xdpyinfo does not always report the DPI used by all apps. There are different ways an app can detect DPI. http://fm.no-ip.com/Auth/dpi-screen-window.html will report the DPI Firefox is using, which may or may not match what xdpyinfo reports, and likely won't if Xft.dpi is set and your screen is higher than typical resolution.

A completely helpful response may depend on your video chip model, distro/version, & video driver/version in addition to your Xorg server version. The most appropriate place and/or time to run xrandr commands can vary due to distro-specific nuances in X implementation. To make --fbmm work in openSUSE 11.3, I had to put it in /etc/X11/xinit/xinitrc in the section labeled "#Add your own lines here...".
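A hypothetical sequence for the Xft.dpi route mentioned above (the 135 value is the one under discussion here):

$ xrdb -query | grep -i dpi         # check whether a value is already forced
$ echo 'Xft.dpi: 135' >> ~/.Xresources
$ xrdb -merge ~/.Xresources         # affects newly started clients only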

(In reply to comment #43)
> (In reply to comment #42)
> > ...for me. DPI is stuck at 96x96 (I would like to use 135).
> > I've tried to set it
> > 1) via DisplaySize in xorg.conf
> > 2) via xrandr --dpi 135
> > 3) via xrandr --fbmm.
> > xdpyinfo always reports a 96x96 DPI value.
> > My video driver is intel, X.Org X Server 1.7.6.
>
> > Anything else I could try?
>
> It's possible your distro sets Xft.dpi at 96, which typically will override
> Xorg's setting in at least some apps. If 'xrdb -query | grep dpi' produces 96,
> you need to find out where that's getting set and disable it, or change it to
> 135.
>

Thanks for your fast reply!

> It's also possible your DTE (Gnome?, KDE?, other?) is forcing 96. You'll have
> to look into its settings to see.

First of all, distribution is Arch Linux.
I'm using e16 (version 1.0.2). As far as I know there is no way to set the DPI in E (but I will check).
Display manager is xdm. I've checked everything under /etc/X11/xdm/, no configuration file mentions any DPI setting.

>
> xdpyinfo does not always report the DPI used by all apps. There are different
> ways an app can detect DPI. http://fm.no-ip.com/Auth/dpi-screen-window.html
> will report the DPI Firefox is using, which may or may not match what xdpyinfo
> reports, and likely won't if Xft.dpi is set and your screen is higher than
> typical resolution.

The website from your link reports also 96 DPI.

>
> A completely helpful response may depend on your video chip model,
> distro/version, & video driver/version in addition to your Xorg server version.
> The most appropriate place and/or time to run xrandr commands can vary due to
> distro-specific nuances in X implementation. To make --fbmm work in openSUSE
> 11.3, I had to put it in /etc/X11/xinit/xinitrc in the section labeled "#Add
> your own lines here...".

OK.
distro: Arch Linux, version 2010.05
Xorg version: 1.7.6
video device: Xorg.0.log reports it as

(--) PCI:*(0:0:2:0) 8086:2a42:144d:c063 Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller rev 7, Mem @ 0xfa000000/4194304, 0xd0000000/268435456, I/O @ 0x00001800/8
(--) PCI: (0:0:2:1) 8086:2a43:144d:c063 Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller rev 7, Mem @ 0xfa400000/1048576

The vendor claims it to be an "Intel GMA 4500MHD".
video driver: xf86-video-intel 2.10.0

(In reply to comment #44)
> (In reply to comment #43)
> > It's possible your distro sets Xft.dpi at 96, which typically will override
> > Xorg's setting in at least some apps. If 'xrdb -query | grep dpi' produces 96,
> > you need to find out where that's getting set and disable it, or change it to
> > 135.

(In reply to comment #45)
> (In reply to comment #44)
> > (In reply to comment #43)
> > > It's possible your distro sets Xft.dpi at 96, which typically will override
> > > Xorg's setting in at least some apps. If 'xrdb -query | grep dpi' produces 96,
> > > you need to find out where that's getting set and disable it, or change it to
> > > 135.

Uh, sorry, I had forgotten to post the result...
xrdb -query gives no output.

I've just set "Xft.dpi: 135" in /etc/X11/xdm/Xresources, with the result that the fonts and the mouse pointer in xdm are bigger now, so it seems that xdm now has the right DPI setting.
As soon as I log in to e16 the mouse pointer gets smaller and xdpyinfo still reports the wrong DPI setting.
Setting Xft.dpi in ~/.Xresources or /etc/X11/Xresources according to http://www.mozilla.org/unix/dpi.html has no effect, xrdb -query still gives no output.

(In reply to comment #46)
> xrdb -query gives no output.

That probably means something is unsetting Xft.dpi (making null) before the e16 desktop populates.

(In reply to comment #42)
> Anything else I could try?

Grep through /etc for a startup script containing something akin to '-quiet -nolisten tcp vt5 -dpi 96 dpms'. If you find something, change '-dpi 96' to '-dpi 135', or delete it.
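A possible starting point for that search (the paths are illustrative and differ between distros and display managers):

$ grep -rn -- '-dpi' /etc/X11 /etc/*dm* 2>/dev/null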

v1.7.6 is really too old to be discussing here. Bugzilla is not supposed to be a help forum. You have a distro and/or window manager specific problem and should try their help forums and/or bug trackers:
http://www.archlinux.org/mailman/listinfo
http://bbs.archlinux.org/
http://wiki.archlinux.org/
http://www.enlightenment.org/p.php?p=support&l=en
irc://freenode/#e

This change was introduced for all RandR 1.2 drivers by http://cgit.freedesktop.org/xorg/xserver/commit/?id=fff00df . (Notably, the nvidia binary driver is not affected.)

GNOME does also force X back to 96 DPI when gnome-settings-daemon starts up (bug 157398, bug 246718). It’s even more annoying about it, too: although you can configure the GNOME DPI to any value with gnome-appearance-properties, the X screen itself is forced to 96 DPI regardless of the GNOME value.

So now, in order to use my 130 DPI screen without squinting at tiny fonts, I need to create a /etc/X11/xorg.conf that configures DisplaySize, _and_ configure the following script to start on login with gnome-session-properties:

#!/bin/sh
xrandr --fbmm $(xrandr -q | sed -n 's/^.* connected .* \([0-9]*\)mm x \([0-9]*\)mm.*$/\1x\2/p')
gconftool-2 --unset /desktop/gnome/font_rendering/dpi

Changed in xorg-server (Ubuntu):
status: Incomplete → Confirmed

FWIW, here is another +1 for the option to keep the correct DPI. I was just trying out nouveau and first thought it was a bug in the driver. The Debian package maintainer pointed me to this bug.

I'm back to using nvidia for now, no need to go through extra hoops to use and test nouveau, when it sets the wrong DPI and I do need nvidia from time to time anyway (for the VDPAU support, my CPU is not powerful enough for 1080p movies).

Many thanks to all Xorg maintainers for their great work on such an important piece of software. I do hope they reconsider this decision, or at least include the patch proposed above.

Regards,
Andrei

+1 here to restore original behavior. Or at least provide option to disable this MS way of handling screen resolutions

(In reply to comment #49)
> +1 here to restore original behavior. Or at least provide option to disable
> this MS way of handling screen resolutions

In fact, this can no longer be validly called an "MS" way -- if your Windows 7 install disk has drivers for your video card, it will use the correct DPI (rounded down to increments of 25%) automatically, by default! We used to have the same here with Xorg.

We need an xorg.conf flag -- for example, NVIDIA has "UseEDIDDpi".

Also, to note: I don't have any specific definition of "correct" behavior in multi-monitor mixed-DPI environments. I'm guessing Windows 7 may set it to the higher of the two -- err on the side of "too big", not "aaugh, I can't read anything!"

I wasn't aware of the Win7 behaviour. So this is a new "x11 way" =)

As for multiple different monitors (like laptop + monitor), I don't know what is better. Mac OS just uses the DPI from the primary one (if it uses it at all) and it sucks... because just moving a window from one display to another changes its physical size....

(In reply to comment #51)
> As for multiple different monitors (like laptop + monitor), I don't know
> what is better. Mac OS just uses the DPI from the primary one (if it uses it
> at all) and it sucks... because just moving a window from one display to
> another changes its physical size....

In a non-Zaphod multi-head configuration, I don't think there's any "right"
answer, especially when you consider that outputs can overlap. So you just
need to pick something arbitrarily in this case. No matter what choice you
make there will be very weird consequences, such as what you describe above.

Changed in xorg-server:
importance: Unknown → Medium
status: Unknown → Confirmed
Changed in xorg-server:
importance: Medium → Unknown
Sergio Callegari (callegar) wrote :

This looks like one good reason to go with NVIDIA whenever you have a decent screen :-)

Could someone be so kind as to explain on what basis it is being stated that the current behavior leaves the same flexibility as before?

Former behavior:
- do nothing to get the dpi from the screen; write 96 dpi in a config file to get 96 dpi (or alternatively any distro could easily do the latter for you by default, and you could remove it if you wanted to).

Current behavior:
- do nothing and you get 96 dpi. And then? What should I write in the config file to get the screen dpi? I guess I need a script in the xsession to read the real dpi from somewhere and tell xrandr to apply it. Otherwise every single machine of mine needs a different config, and if I have a laptop that I sometimes use with an external screen at home and another external screen at work, I can never solve the issue with a single hardwired dpi.

In other words, with the previous behavior anyone and any distro could get the current behavior at almost no cost, and with a solution that could easily be reverted by anybody.

With the current behavior, getting the former behavior requires complex scripting.

I fail to see this as an improvement at all.

Not to mention that:

1) xdpyinfo now reports false data by default (which should at least be indicated in the manual).

2) To know the real dpi you need to parse the xrandr output, which is likely to be locale dependent, and to do a floating point calculation (just what you want to do in sh scripts); a sketch of such a script follows below.

... and all this to merely solve a problem in web pages that is already solved by most browsers.
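As a rough sketch of what such a script ends up doing, assuming the current xrandr output format and a C locale (awk stands in for the floating point arithmetic):

xrandr | awk '/ connected / && /mm/ {
    for (i = 1; i <= NF; i++)                  # locate the WxH+X+Y geometry field
        if ($i ~ /^[0-9]+x[0-9]+\+/) split($i, px, /[x+]/)
    w = $(NF-2); h = $NF                       # trailing "<W>mm x <H>mm" pair
    sub(/mm/, "", w); sub(/mm/, "", h)
    if (w > 0 && h > 0)
        printf "%s: %.0fx%.0f dpi\n", $1, px[1] * 25.4 / w, px[2] * 25.4 / h
}'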

Changed in xorg-server:
importance: Unknown → Medium
tags: added: regression-release

In #23705 it was stated that this is not a bug.

Please consider this as a feature request for a new option to use the automatically detected screen DPI.

Patch: https://bugs.freedesktop.org/attachment.cgi?id=33081

Changed in xorg-server:
status: Confirmed → Invalid
Changed in xorg-server:
importance: Medium → Unknown
status: Invalid → Unknown
Changed in xorg-server:
importance: Unknown → Wishlist
status: Unknown → Confirmed

Created attachment 51560
Patch rebased against current git.

Since the original patch no longer applies to current master, and I've had an
updated patch lying around for some time, I'm posting it here so it doesn't get
lost. It seems to still work but it's only been lightly tested.

I no longer care about this issue, since the available solutions are sufficient
for me. I'll leave the rest to the people who do care.

Cheers.

I renamed the option to match comment #27 https://bugs.freedesktop.org/show_bug.cgi?id=23705#c27 in #27505: DontForce96DPI. There are Fedora 15 i386 and x86_64 rpms at: http://koji.fedoraproject.org/koji/taskinfo?taskID=3372856 and of course the source rpm.

I tested it and actually didn't see what I expected, so I'm not sure what's going wrong:

$ rpm -q xorg-x11-server-Xorg
xorg-x11-server-Xorg-1.10.4-1.3.fc15.x86_64
$ grep -C2 Dont /etc/X11/xorg.conf
Section "ServerFlags"
        Option "AIGLX" "on"
        Option "DontForce96DPI" "on"
EndSection

$ xrandr|grep LV
LVDS connected 1366x768+0+0 (normal left inverted right x axis y axis) 293mm x 164mm
$ xdpyinfo|grep dimens
  dimensions: 1366x768 pixels (361x203 millimeters)

I rebased Nick's original patch, but I don't think I ended up with anything different from his most recently posted, rebased patch.

Created attachment 51580
Patch rebased against Fedora 15's xorg-x11-server-Xorg

The manpage change is in a separate patch because of how Fedora's rpm applies the patches, but that change is irrelevant to the detection/setting of DPI.

I'm not a coder, so this could be totally off the wall, but I wouldn't expect 'Option "DontForce96DPI" "on"' to be a server flag, but rather a Monitor option. Then again, 'Option "TargetRefreshRate" "60"' as a Monitor option hasn't been working for me at least since 1.9.3, maybe ever, so maybe the problem you see is not related to the patch itself.

Created attachment 51582
patch rebased against the Ubuntu 1.10.4 package + an option to set the default DPI value

This patch works for me. When I set DefaultDPI to auto I get 86 DPI on my 15" screen. Somewhat odd is that setting DefaultDPI to 123 gives 124 DPI on my 17" screen.

(In reply to comment #4)
> I'm not a coder, so this could be totally off the wall, but I wouldn't expect
> 'Option "DontForce96DPI" "on"' to be a server flag, but rather a
> Monitor option. Then again, 'Option "TargetRefreshRate" "60"' as a Monitor
> option hasn't been working for me at least since 1.9.3, maybe ever, so maybe
> the problem you see is not related to the patch itself.

Would it be good to mirror what nvidia's already using? I know their driver called it "UseEDIDDPI", but I don't recall what Section it went in.

(In reply to comment #5)
> Created an attachment (id=51582) [details]
> patch rebased against the Ubuntu 1.10.4 package + an option to set the
> default DPI value
>
> This patch works for me. When I set DefaultDPI to auto I get 86 DPI on my 15"
> screen. Somewhat odd is that setting DefaultDPI to 123 gives 124 DPI on my 17"
> screen.

Reviewed-by: Jeremy Huddleston <email address hidden>

Please send the patch to the xorg-devel mailing list for more eyes.

Removing blocker status. While this will likely land in 1.12, it won't block it.

Is there any progress on this patch? I will probably give my laptop to the dev who is not allowing this patch upstream; he should burn his eyes with it.

Any updates on this? The original bug report was filed almost 3 years ago.

Current behavior is really annoying on laptops + external displays....

Graeme Hewson (ghewson) wrote :

I wonder how much configuration iOS users need to do to get a Retina display set up? The answer "none" comes to mind.

May the so-called developers with the "I know better" attitude be treated by physicians with the same attitude!

They broke the thing for BUG compatibility with an obsolete piece of crap, for a single use case already handled in the corresponding application (a web browser); they told us to go sink in the mailing lists, and now they are too busy to even accept the knob.

Folks, hello! Anyone there? Or is everyone reading mail on a low DPI 64" plasma?

Sergio Callegari (callegar) wrote :

With the new Retina displays and the trend they impose, we are shortly going to have display dpi values from 90 (old large monitors) to over 300 (newer laptops and tablets). Assuming 96 dpi for everything will surely lead to interesting times.

Is anyone working on this?

Patch has been posted here and on the xorg-devel list.

Feel free to rebase/resend/nag somebody with commit access until it's applied.

(In reply to comment #14)
> See also https://help.ubuntu.com/community/AsusZenbook#LCD
Heh, the UX31A I'm typing this on has 166 dpi.

Those who forced the 96dpi kludge into xorg should be forced to walk in my shoes till the end of their lives, with no chance to change them.

See http://pastebin.com/vtzyBK6e for #xorg-devel discussion about this.

(In reply to comment #16)
> See http://pastebin.com/vtzyBK6e for #xorg-devel discussion about this.
Some comments:

> <ohsix> not a lot of people are bothered, since getting the per display dpi
> right is a hard problem, even if you can set it for one single monitor in
> particular, 'fixed' is handling some difference in dpi across displays,
> which doesn't happen in the toolkits or anything

Wrong. *I* am bothered, and *I* operate a few dual-monitor setups, including ones with different display DPI. That ohsix Windows migrant would have a hard time telling me that forcing DPI to a semi-arbitrary value to follow the obsolete Windows suit is right (and that it is worth breaking what had worked since the last century).

> <ohsix> maybe you misunderstood me, i was telling you what's expected to do it

By whom? Those who smoked windows crack and a gazillion of tray notifiers?

Thanks, but no thanks. I've seen enough weird video hardware (e.g. Acer V550 monitors reported funny EDID values), but those are rather *exceptions* to be handled, and one can even automate that: if a display reports a DPI lower than e.g. 30 or higher than e.g. 300 (as of today), that might be treated as a reason to fall back to the default (96 is ok here), since those who operate special cases *can* be expected to know their way around hi-res displays or display walls.

> <ohsix> any cobbled together thing where nobody really cares
> is going to miss details like that

This bastard should not continue to erode free software. *He* doesn't care.

Seems that Red Hat has hired too many dumb morons who took their windows habits and attitude there, see also http://people.freedesktop.org/~cbrill/dri-log/?channel=dri-devel&date=2012-12-13 /fedora -- and recall the F12 PackageKit saga of Richard Hughes "fame": https://bugzilla.redhat.com/show_bug.cgi?id=534047#c9

PS: just in case: I've been using and developing free software since 1998 and have done numerous migrations for people and companies. I know that care *is* crucial. Good luck to the Xorg team with preserving that.


More from that IRC conversation:
<alesguzik> I'm not asking about making it default, but when screen size can be detected and resolution is known, what is the problem with dpi?
<alesguzik> It worked at some point in the past
<ohsix> it never worked

Ohsix's definition of "never" must be different from the dictionary's. As alesguzik said, automatically matching DPI to display density did work in the past:

$ cat /etc/SuSE-release
openSUSE 10.2 (i586)
VERSION = 10.2
$ head -n15 /var/log/Xorg.0.log | tail -n6
X Window System Version 7.1.99.902 (7.2.0 RC 2)
Release Date: 13 November 2006
X Protocol Version 11, Revision 0, Release 7.1.99.902
Build Operating System: openSUSE SUSE LINUX
Current Operating System: Linux m7ncd 2.6.18.8-0.10-default #1 SMP Wed Jun 4 15:46:34 UTC 2008 i686
Build Date: 02 June 2008
$ grep Output /var/log/Xorg.0.log | egrep -v 'disconnected|no monitor'
$ grep -v ^\# /etc/X11/xorg.conf.d/50-monitor.conf | grep DisplaySize
grep: /etc/X11/xorg.conf.d/50-monitor.conf: No such file or directory
$ grep -v ^\# /etc/X11/xorg.conf | grep DisplaySize
$ grep -v ^\# /etc/X11/xorg.conf.d/50-monitor.conf | grep PreferredMode
grep: /etc/X11/xorg.conf.d/50-monitor.conf: No such file or directory
$ grep -v ^\# /etc/X11/xorg.conf | grep PreferredMode
$ xrdb -query | grep dpi
$ xdpyinfo | egrep 'dime|ution'
  dimensions: 1600x1200 pixels (402x302 millimeters)
  resolution: 101x101 dots per inch
$ xrandr | head -n5
 SZ: Pixels Physical Refresh
*0 1600 x 1200 ( 402mm x 302mm ) *85 75 70 65 60
 1 1400 x 1050 ( 402mm x 302mm ) 75 60
 2 1280 x 960 ( 402mm x 302mm ) 85 60
 3 1152 x 864 ( 402mm x 302mm ) 75

more Xorg.0.log excerpts:
(II) RADEON(0): EDID data from the display on port 1 ----------------------
(II) RADEON(0): Manufacturer: NEC Model: 61da Serial#: 5356
(II) RADEON(0): Year: 2002 Week: 39
(--) RADEON(0): Virtual size is 1600x1200 (pitch 1664)
(**) RADEON(0): *Default mode "1600x1200": 229.5 MHz, 106.2 kHz, 85.0 Hz
(--) RADEON(0): Display dimensions: (400, 300) mm
(--) RADEON(0): DPI set to (101, 101)
$ xrandr -v
Server reports RandR version 1.1

$ cat /etc/SuSE-release
openSUSE 10.2 (i586)
VERSION = 10.2
$ head -n15 /var/log/Xorg.0.log | tail -n6
X Window System Version 7.1.99.902 (7.2.0 RC 2)
Release Date: 13 November 2006
X Protocol Version 11, Revision 0, Release 7.1.99.902
Build Operating System: openSUSE SUSE LINUX
Current Operating System: Linux m7ncd 2.6.18.8-0.10-default #1 SMP Wed Jun 4 15:46:34 UTC 2008 i686
Build Date: 02 June 2008
$ grep Output /var/log/Xorg.0.log | egrep -v 'disconnected|no monitor'
$ grep -v ^\# /etc/X11/xorg.conf | grep DisplaySize
$ grep -v ^\# /etc/X11/xorg.conf | grep PreferredMode
$ xrdb -query | grep dpi
$ xdpyinfo | egrep 'dime|ution'
  dimensions: 1920x1440 pixels (403x302 millimeters)
  resolution: 121x121 dots per inch
$ xrandr | head -n5
 SZ: Pixels Physical Refresh
*0 1920 x 1440 ( 403mm x 302mm ) *75 60
 1 1856 x 1392 ( 403mm x 302mm ) 75 60
 2 1792 x 1344 ( 403mm x 302mm ) 75 60
 3 1600 x 1200 ( 403mm x 302mm ) 85 75 70 65 60
$ xrandr -v
Server reports RandR version ...


Sergio Callegari (callegar) wrote :

I may not agree with the tone, which appears a bit aggressive, but the observations by Tfa7 appear to be correct.

Furthermore, with reference to the justification that there are monitors that report wrong/crazy EDID values: I really do not think that the majority of users, who have bought and paid for properly working monitors, should suffer from others' broken hardware by having their system ignore the correct EDID info that their monitors report.

Given the discussion of the xorg developers, I really think that this should be fixed downstream or (temporarily) in a PPA.

And it should be done *quickly*, since things like newer Apple machines with retina displays, Google's Chromebook Pixel, Toshiba's Kirabook, newer Sharp displays at 11.6", 14", and 15.6" (235-262 PPI), Sharp's 13.3" display (221 PPI), Samsung's newer 13.3" displays (276 PPI) and tons of newer high-end displays will all be badly broken by the hardwired 96dpi setup.

Please raise the priority accordingly, given this new wave of hardware.
And if possible, have someone remove the wishlist status on the freedesktop bug tracker, since this looks more like a regression.

Please remove the "enhancement" status. This used to work and now it does not anymore, thus it is a regression.

Also note that the experience on all newer high end hardware by Apple, Sharp, Samsung, Google (for instance see http://www.macrumors.com/2013/05/20/samsung-and-sharp-introduce-new-ultra-high-resolution-notebook-displays/) is broken by the hardwired 96dpi.

It dawned on me that a "proposed replacement" might lack such an, um, "feature" as the hardwired 96 dpi. Now if that gets called progress, I'll invest some time into finding those who arranged it and ruining what remains of their reputation.

(creating and improving is vastly more important but the feeling of impunity results in *evil* things, unfortunately)

Folks, hey, let's just get this crap fixed; are you still listening to the community? The XFree86 project used to skip that at times; please don't.

Bachi (m-bachmann) wrote :

I would assume that by now a great many people have hi-res displays and are affected. This bug should receive the highest priority, certainly over feature enhancements, as it ruins the experience.

I too think this should be fixed. The best argument I see for it is that at least Gentoo tells you to rely on autodetection of the settings. There was a time when you could generate an xorg.conf and then tweak it, but we've progressed past that because the config always broke. Now, requiring someone to fall back to xorg.conf is just silly.

madbiologist (me-again) wrote :

I'm not sure if this is being worked on for the Unity desktop in Ubuntu, but you might like to try the Ubuntu GNOME 14.04 "Trusty Tahr" beta 1 which is based on GNOME 3.10 - see https://wiki.ubuntu.com/TrustyTahr/Beta1/UbuntuGNOME for info and download links. As per https://help.gnome.org/misc/release-notes/3.10/more-core-ux.html.en GNOME 3.10 includes High-Resolution Display Support. Further improvements are coming to the GNOME Shell in GNOME 3.12 which is scheduled for release in March or early April - see http://www.phoronix.com/scan.php?page=news_item&px=MTYwMjE

Tom Fields (udzelem) wrote :

This should definitely be fixed for Ubuntu 14.04 (Trusty).

This is NOT a wishlist item, this is a BUG.
Using a default DPI of 96 on a 200 DPI monitor is /NOT/ helpful at all!

madbiologist (me-again) wrote :

A couple of last minute HiDPI fixes have just been added to GNOME shell 3.12.

More relevantly, Unity 7 on Ubuntu 14.04 "Trusty Tahr" will have support for HiDPI displays.

Sergio Callegari (callegar) wrote :

Thanks for the info! However, let me point out that this bug is not against Unity or GNOME, but against xorg. For many reasons one may have X starting without Unity or GNOME (because of using another desktop, or a transient misconfiguration). In that case, X should be able to provide readable characters even if one is on a hi-dpi monitor; X setting the DPI to 96 prevents this. So please, even if GNOME and Unity fix the HiDPI issue, let X still have a configuration option allowing one to switch between:
1) Ignore the EDID and fix the dpi at 96 (current behavior, good for older hardware)
2) Respect the EDID info about dpi (good for newer hardware, but risky for projectors and TVs)
3) Respect the EDID info, with some safety checks, e.g. force a default if the EDID data looks too small or too large to be credible (a sketch follows below)
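As an illustrative sketch of option 3 in script form (the thresholds echo the 30/300 bounds suggested earlier in the freedesktop thread; the dpi value would come from the EDID geometry):

dpi=124    # e.g. 1400 px * 25.4 / 286 mm, rounded
if [ "$dpi" -lt 30 ] || [ "$dpi" -gt 300 ]; then
    dpi=96    # implausible EDID data: fall back to the default
fi
xrandr --dpi "$dpi"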

madbiologist (me-again) wrote :

Fair point. Unfortunately, from what I can see the xorg developers are not interested in fixing this and believe that the current behaviour is both correct and acceptable. Some of them seem to be suggesting that it is the job of the window manager/desktop environment to pass the desired settings to xrandr after reading the monitor configuration from it. When and if they change their mind we might be able to close this bug. In the meantime, I thought I might pass on some more info...

Ubuntu developers are planning HiDPI support for the Ubuntu Linux boot experience - when dealing with GRUB2's boot-loader menu and the Plymouth boot splash screen. They are also investigating HiDPI support for Linux virtual consoles. Specifically, "Developers at this week's vUDS session will be working on a 2x version of the GRUB2 font for basic HiDPI support by just appearing twice as big, working on GRUB2 support for returning a display's physical dimensions, work out support for a scaling factor within Ubuntu's Plymouth theme, and support for a scaling factor within the console-setup font configuration."

There is also bug #1292218 which is described as "The lockscreen has correct Panel HiDPI support, but the prompt doesn't scale right now."

Some of this work might be ready for Ubuntu 14.04 "Trusty Tahr" but it is unlikely that all of the HiDPI improvements will be ready for that release.

There are recent updates at https://bugs.launchpad.net/ubuntu/+source/xorg-server/+bug/589485 but this bug hasn't been touched for over a year. Could someone provide an update on the status?

Michael Shigorin (shigorin) wrote :

> Could someone provide an update on the status?
Upstream insists on keeping xorg broken along winxp lines; I wouldn't be surprised if they "fix" it in wayland some day, years after winvista and ages after it worked in X. I'd call this "$name_that_sexual_minority default" based on the methods used to push it :-/

ping

Sergio Callegari (callegar) wrote :

A couple more notes.

IMHO, the biggest issues with the current situation are the following:

1) The physical display size is available via xrandr. However, xrandr delivers it in a format that is rather awkward to parse.
2) The hardware only provides a physical display size, not the expected viewing distance. However, only with *both* ingredients can one reliably compute a good 'virtual' dpi value to pass to applications in order to get correctly sized fonts, icons and graphical elements.
3) When the desktop environment (DE) alone decides the dpi value rather than xorg, there is always the risk of getting an unusual working environment when the user (i) tests a different DE, or (ii) has issues with the DE at DE startup. For these reasons, it would be good to have xorg provide a sane environment even before the DE starts.
4) For some reason some java apps seem to ignore the dpi value set by the DE. I'm told that matlab is among them.

So, please:
1) in the short run, re-introduce the xorg patch with the switch that lets one control whether the system should pick the EDID dpi or not, as the NVIDIA driver does.
2) in the long run, it may make sense to have some heuristics in place capable of guessing whether a monitor is a desktop monitor (normal viewing distance), a TV (larger than normal viewing distance), a projector (larger than normal viewing distance /and/ a screen size that may vary with distance) or a pad (smaller than normal viewing distance and a small screen).

Sergio Callegari (callegar) wrote :

I would also add that, in the case of multiple screens with different characteristics (e.g. laptop screen + desktop monitor), it would be great to have a different dpi value per screen, with apps notified of the dpi change when they are moved from one screen to another.

toothbrush (s-baul-s) wrote :

Argh. I am yet another hapless user clamouring for help.

My hardware: Apple MacBook 13" Retina, DPI of 227.
X.Org's infinite wisdom: DPI of 96.

My eyes are now broken, and I find myself tweaking individual apps in the hope that they'll become usable. Inevitably, proportions become crooked.

Please please please give us the ability to set DPI to native!
