dunst does not use the correct DPI for rendering text #240

Closed
stapelberg opened this issue Oct 18, 2015 · 47 comments

@stapelberg
Contributor

When using a hi-dpi display (also often called “retina display”) and the default “monospace 8” as font, the fonts rendered in dunst are significantly smaller than in other GTK applications and e.g. i3.

I think dunst should use code similar to https://github.com/i3/i3/blob/fec61791e1b26ecb7fedcd86c0c4b68634dd4169/libi3/font.c#L37-L55 instead of calling pango_cairo_create_layout directly in x.c.

To reproduce, use xrandr --dpi 192 and set Xft.dpi: 192 in ~/.Xresources

@winpat

winpat commented Jan 17, 2016

+1. Is there currently a workaround for this?

@mxmv

mxmv commented Feb 16, 2016

+1, more people will hit this as high-DPI monitors on laptops and desktops become more popular.

@blueyed

blueyed commented Feb 18, 2016

JFI: the obvious workaround is to increase the font size, e.g. font = Ubuntu Mono 18 in dunstrc.

@mathstuf

That doesn't increase the icon sizes though. It also makes cross-machine configuration annoying.

@shearn89

shearn89 commented Jun 1, 2016

Yep, currently running into this which is forcing me to uninstall dunst and use notify-osd instead :(

@mathstuf

mathstuf commented Jun 1, 2016

I have a branch which does work. Just need to do some more work to store the DPI and use it in the places where 256 is hardcoded right now. Also, the DPI should be updated and everything redrawn when the monitor is changed in case monitors have different DPIs.

@Dinduks

Dinduks commented Jul 18, 2016

@mathstuf Hi, any update on this? How can we help?

@mathstuf

The code needs to get the real value instead of 256 at the bottom of the diff. I hadn't found the function when I was looking and haven't gone back since.

@Telkhine

I've also hit this snag. Glad to see some work is going into it.

@mathstuf

FYI, I still haven't worked on it in a while :(. Feel free to update my branch to get the real DPI from X.

@Telkhine

I'll pull and see what I can do.

@cliscum

cliscum commented Jan 2, 2017

Here is a patch that uses an Xresources setting, dunst.dpi.

@Dinduks

Dinduks commented Jan 2, 2017

Nice job @cliscum. Maybe we can use @mathstuf's work to get the DPI automatically from X?

@nmschulte

nmschulte commented Jan 11, 2017

Nice job @cliscum. Maybe we can use @mathstuf's work to get the DPI automatically from X?

I'm not certain, but won't the client need both values (or just the Xresources value)? IIUC, xrandr --dpi (xorg.conf/DisplaySize) changes what X believes about the physical display, and the Xresources Xft.dpi setting is what clients should use when painting. Thus, if the user modifies Xft.dpi and the client obeys it, X will do the right thing and paint with the appropriate (or overridden, if the auto-detection is wrong) DPI. It's not clear whether X applies the scaling itself, or whether X clients need to ask it to (either semantically, or by doing the calculation themselves), though.

@stapelberg
Contributor Author

X does no scaling, the clients need to do it themselves. And yes, one should use the Xresources value only, which is what all the big toolkits and applications do.

@mathstuf

Hmm. All I do is set the DPI with xrandr; no X resource setting at all and non-GTK2 apps using toolkits look OK with just that.

@stapelberg
Contributor Author

Some toolkits will set the X resource for you based on the X11 dpi.

@nmschulte

nmschulte commented Jan 11, 2017

It seems the user has two levels of control here (assuming the user can configure the X server; I think this is generally not the case, but typically most systems are single-user, self-administered, and thus ...):

  1. via the X server's configuration, .../xorg.conf, xrandr (of what it thinks the display is in reality)
  2. via the X client's configuration, .../.Xresources, xrdb (of what DPI the user wants for the UI) [*note]

I think you should prefer to make your X server configuration reflect reality and let the X clients do the right thing, given the user's preferences/configuration. The X server includes scaling mechanisms other than this DPI setting, which you might prefer (xrandr --scale?); AFAIU, the DPI setting is mainly useful when X can't properly determine it due to bad/old hardware.

I recently started using a laptop w/ a 15.6" 4K display; I've configured X to properly set the physical dimensions of the display (§ Monitor | DisplaySize ww hh), and I use Xft.dpi to scale the UI as I like (~288 X DPI / native PPI, 144 Xft.dpi).

It's worth noting that X's xorg.conf/DisplaySize configuration and xrandr's --dpi control the same thing; changing the DPI updates the determined display size, and vice versa.

* note: How is it that Xft (X FreeType) winds up controlling the DPI (and via Xresources config, to boot)? How "standard" is this actually? What about fontconfig?


Clients that don't respect Xft.dpi cannot be controlled; the only way to get them to render the same as the apps that do respect Xft.dpi is to maintain the common DPI assumption (96 DPI, i.e. keep Xft.dpi at 96). Then one may use X's DisplaySize/DPI to apply DPI scaling. This may break down in multi-screen/monitor setups, though, so it's a big caveat and really should be properly fixed.

@tsipinakis tsipinakis added this to the 2.0 milestone Jan 25, 2017
@WhyNotHugo
Contributor

IMHO the right thing to do is respect the xrandr setting. The most important factor is that it can be set per-display (Xft.dpi cannot, so we'd always have one broken display). Users can also modify it, so it can really also be the client configuration.

@stapelberg
Contributor Author

IMHO the right thing to do is respect the xrandr setting. The most important factor is that it can be set per-display (Xft.dpi cannot, so we'd always have one broken display). Users can also modify it, so it can really also be the client configuration.

Xft.dpi can be set per X11 DISPLAY as well, e.g. using echo Xft.dpi: 192 | DISPLAY=:0 xrdb -merge.

Note that an X11 DISPLAY is not the same as a RandR output, and most setups use multiple RandR outputs on the same X11 DISPLAY these days. X11 does not support different dpi values on different RandR outputs.

I still strongly recommend going with Xft.dpi first and falling back to RandR values, if at all.

@nmschulte

nmschulte commented Feb 8, 2017

I discovered a few weeks ago that xrandr --help shows that the --dpi option takes <dpi>/<output>; I haven't had time to see what it actually does, but it implies a per-output DPI setting. There's nothing in the man page about it.

[screenshot: xrandr --help output showing --dpi <dpi>/<output>]

@stapelberg
Contributor Author

See https://sources.debian.net/src/x11-xserver-utils/7.7%2B7/xrandr/xrandr.c/#L3514 for the source. AFAICT, the option will look up the DPI of the specified output (based on its physical height) and will set it for the entire X11 DISPLAY.

This is not a per-output setting.

@WhyNotHugo
Contributor

@stapelberg Sorry, "display" was a bad choice of word on my part. I was referring to a physical display (not Xorg displays), meaning an actual output. xrandr can set this per-output, which Xft.dpi cannot:

xrandr --output eDP1 --dpi 160

@stapelberg
Contributor Author

Okay. Setting the DPI per output and relying on applications to query that setting implies:

  • applications need to know on which output(s!) they are displayed and choose the correct DPI accordingly
  • applications would need to re-initialize their entire rendering/toolkit/… when moved across outputs (does not apply to dunst)

Given dunst’s special nature (using non-moveable overlay windows), I can see how picking up the per-output DPI value seems like a cool trick. I still don’t think it’s a clean solution, and given the abundant disregard for DPI settings by popular apps and toolkits (which all just use Xft.dpi instead), likely only very few users will ever notice this feature exists.

That being said, I don’t care in the end what is implemented — I just want you to understand the trade-offs you’re making by deviating from the de-facto standard of using Xft.dpi.

@WhyNotHugo
Contributor

WhyNotHugo commented Feb 8, 2017

Xft.dpi is the standard, but it's actually broken; it assumes all outputs have the same DPI.

Like you said, dunst is in a very special situation (non-moveable windows). But it also has another special nature: it's generally used by people who know how to RTFM (compared to, say, GNOME users), so cleanly documenting which DPI value dunst uses should have this working properly for any interested users.

Also, the only way to break with the broken de-facto standard is to start using the other one that actually makes the right assumptions! 😄

That being said, I don’t care in the end what is implemented

I actually do care, because Xft.dpi means broken notifications when I use an external monitor (or vice versa).

@mathstuf

mathstuf commented Feb 8, 2017

Xrandr DPI is also per-DISPLAY, not per-output (X itself only supports a single DPI value; Xrandr isn't going to fix that). The flag being associated with an output just makes it the one used for the calculation rather than some other output or an otherwise calculated DPI. You need Wayland for per-output DPI settings. The closest you can get now is to standardize on the lowest DPI across your screens and use Xrandr to zoom/magnify the higher-DPI monitors.

@tsipinakis
Member

tsipinakis commented Mar 4, 2017

An effort was made to fix this in the fixdpi branch. Can someone with a high-DPI monitor test these changes before they are merged? Any feedback is appreciated.

Dunst should give priority to the Xft.dpi X resource; after that, there is a fallback to a naive calculation which should correctly handle multiple monitors with different DPIs.

@WhyNotHugo
Contributor

WhyNotHugo commented Mar 4, 2017

Latest stable:

[screenshot: notification on latest stable]

Branch fixdpi (with Xft.dpi: 160):

[screenshot: notification on the fixdpi branch, Xft.dpi: 160]

I'm gonna have to re-tune my font and size and stuff (it's too big and bold now), but it looks like it's working.


UPDATE: Looks like the naive calculation is not working (this is 15", @2880x1800).

Branch fixdpi (with no Xft.dpi):

[screenshot: notification on the fixdpi branch, no Xft.dpi]

@tsipinakis
Member

tsipinakis commented Mar 4, 2017

Weird. Make sure you compiled with Xrandr enabled (it should be on by default if you didn't make any changes to config.mk).

What's the output of xrandr?

@WhyNotHugo
Contributor

I installed using this:

make X11INC=/usr/include/X11 X11LIB=/usr/lib/X11
make PREFIX=/usr install
$ xrandr
Screen 0: minimum 320 x 200, current 2880 x 1800, maximum 8192 x 8192
eDP1 connected primary 2880x1800+0+0 (normal left inverted right x axis y axis) 331mm x 207mm
   2880x1800     59.99*+
   2048x1536     60.00
   1920x1440     60.00
   1856x1392     60.01
   1792x1344     60.01
   1600x1200     60.00
   1400x1050     59.98
   1280x1024     60.02
   1280x960      60.00
   1024x768      60.00
   800x600       60.32    56.25
   640x480       59.94
DP1 disconnected (normal left inverted right x axis y axis)
HDMI1 disconnected (normal left inverted right x axis y axis)
DP2 disconnected (normal left inverted right x axis y axis)
HDMI2 disconnected (normal left inverted right x axis y axis)
HDMI3 disconnected (normal left inverted right x axis y axis)

@tsipinakis
Member

I pushed a commit with some debug output (I should really add some proper logging sometime).
What does dunst output after a notification is displayed?

@WhyNotHugo
Contributor

Sorry, looking at the debug output made me look closer at my setup; it turns out that I was setting Xft.dpi: 106 in the latter test (I'd removed the 160 override I used for my laptop, but that left the default behind).

Testing properly again, detection works fine:

Tried to acquire dpi using Xresources(if 0 the operation failed): 0.000000
Trying dpi autodetection for screen:
Position: X: 0 Y: 0
Resolution: W: 2880 H: 1800
Size: mmH: 207

Calculation result: 220.869565

Sorry for the mixup.

I do admit though, that font looks bolder and larger than anticipated. Is this just scaling, or is the font-size itself being increased?

@tsipinakis
Member

I do admit though, that font looks bolder and larger than anticipated. Is this just scaling, or is the font-size itself being increased?

I assume the font size is being increased but I can't be sure since all we're doing internally is passing the dpi value to cairo and it handles the font setup and rendering.

Calculation result: 220.869565

Looks like the dpi calculation thinks your dpi is 220, which contradicts the 160 you had set as the Xft.dpi value earlier, so that might be why the font looks bigger than usual.

@WhyNotHugo
Contributor

I assume the font size is being increased but I can't be sure since all we're doing internally is passing the dpi value to cairo and it handles the font setup and rendering.

TBH, the result is bold and fat. Closer inspection makes me think the font is being grown more than it should be:

For example, here's the screenshot at 106 dpi, manually scaled 150% with GIMP (this is what scaling to 160dpi should look like):

[screenshot: the 106 dpi screenshot manually scaled 150% in GIMP]

Here's a screenshot taken with Xft.dpi: 160:

[screenshot: actual rendering with Xft.dpi: 160]

Note that the font is actually larger than expected.

Looks like the dpi calculation thinks your dpi is 220 which contradicts the 160 you had set as the dpi value later so that might be why the font looks bigger than usual.

Oh, the screen is physically 220dpi (so this is working fine). I set it to 160dpi because that's the scaling that I prefer (but that's totally personal). At 220dpi, everything looks too big for me (and, the increased pixel density lets me read smaller text, etc).

To be extra clear on that: the calculation was fine; 160dpi is a personal adjustment I make and does not reflect the actual screen's dpi.

@WhyNotHugo
Contributor

Finally, kudos on the per-screen dpi detection, it's (sadly) not something seen very commonly.

@tsipinakis
Member

Finally got some time to work on this again after 2 weeks of inactivity.

TBH, the result is bold and fat. Closer inspection makes me think that the font is being grown more than it should

That might have been because of the decimal places; I made the calculation round down the result, which should improve if not completely fix this.

Additionally, I found that using a different dpi per screen can cause some visual inconsistencies if you have multiple screens with slightly different dpis, so I moved the per-screen dpi detection to an opt-in experimental feature in the dunstrc until we figure out what can be done about that.

@tsipinakis
Member

Changes have been merged to master; this should now be fixed.

@nmschulte

nmschulte commented Apr 2, 2017

Dunst should give priority to the Xft.dpi X resource; after that, there is a fallback to a naive calculation which should correctly handle multiple monitors with different DPIs.

@tsipinakis, can you clarify this statement, please? How is it that Dunst is able to support a per-monitor (X11 Output) DPI? You yourself noted that X11 does not support this, and that such support is a major goal of the Wayland project. As I understand, support for this under X11 would require the user to configure Dunst directly to understand which Outputs have which DPIs; X11 has no specification to store or communicate this data/configuration.

https://github.com/dunst-project/dunst/blob/master/dunstrc#L189 -- seems to indicate that this feature only works when one doesn't specify an Xft.dpi value via Xresources. But doesn't Xft.dpi get a "default" value in this case? Of, perhaps, 96, even?

I have a feeling that this experimental support has a very limited real-world use-case, due to the assumptions it has to make about how to interpret X11's per-Output DisplaySize/DPI (they are reciprocal; essentially the same thing). I don't believe an X11 client should be interpreting these values at all.

I'd like to test this functionality (primarily, the major goal of getting Dunstrc to respect Xft.dpi, but secondarily, to understand this per-X11 Output DPI mechanism), if you could give me some guidance.

@tsipinakis
Member

tsipinakis commented Apr 2, 2017

First of all, due to some visual inconsistencies that was changed from a fallback to an opt-in experimental feature; see the example dunstrc.

The per-monitor dpi section refers to per-RandR-output. We use the physical screen dimensions as reported by RandR to approximately calculate that screen's dpi. See here for the code (don't ask me how that calculation works, it's still magic to me :) ).

That is one of the reasons we changed the X11 monitor extension we use from Xinerama to RandR.

Documentation is my next big goal before we release the next version so it should also help clarify how parts of dunst work.

@tsipinakis
Member

tsipinakis commented Apr 2, 2017

To respond to your edit, this was added as an experiment since it was mentioned earlier in this thread that it would be a welcome feature and due to the way we implemented dpi handling it was a very simple change. It is still unclear how well it will work in a real-world scenario, I haven't even tested it that well myself since all my monitors have basically the same dpi, which is why it's experimental.

According to my tests, Xft.dpi does not return a value if it is not set (the function call to retrieve it returns 0). In that case we either use the calculation mentioned above, if per_monitor_dpi is true, or else we simply fall back to the default value, 96.

@nmschulte

nmschulte commented Apr 2, 2017

I tested the base functionality here -- it works well. I will test the experimental bit once I get to a second monitor this week.

Question: dunst runs as a systemd user service on my setup (Debian Sid) -- I have dunst built and installed to ~/.local. I am trying to set DefaultEnvironment="PATH=/home/nmschulte/.local/bin:$PATH" in ~/.config/systemd/user.conf, and the configuration seems to be effective, as now I have to manually start the Dunst daemon. But how can I get DefaultEnvironment to work so that it only modifies the environment for the selected user (as the documentation says ~/.config/systemd/user.conf does; see item 1 at https://wiki.archlinux.org/index.php/Systemd/User#Environment_variables), and so that it matches the user's PATH set in ~/.profile (or ~/.bash_profile/~/.bashrc if you're limiting to Bash shells)? The ArchWiki seems to indicate that systemctl --user set-environment is for all users, unlike the user.conf file.


Additionally, does Dunst support a "reload configuration" signal, like say, i3, does? i3 uses this signal to re-inspect the value of Xft.dpi / Xresources at large, and it seems sensible for Dunst too. However, it seems the daemon life-cycle is magical in some regard (due to d-bus? what actually manages the daemon's lifecycle?), and simply killing the daemon works in the cases where it immediately comes back up; it picks up the new values at startup.

@nmschulte

@tsipinakis, this changeset does not properly increase the scale of the icon for the notification.

@tsipinakis
Member

tsipinakis commented Apr 2, 2017

I don't understand what you're trying to accomplish in your first question, can you give a few more details? Why are you trying to set the PATH?

Additionally, does Dunst support a "reload configuration" signal

Not at the moment, killall dunst is as close as it gets. Feel free to file a feature request for it.

this changeset does not properly increase the scale of the icon for the notification.

If you use the max_icon_size setting, it is not scaled on purpose since the value is in pixels and it is expected to be set to a sane pixel value for the monitor dpi you're using.

@nmschulte

nmschulte commented Apr 2, 2017

Regarding my first question: I am trying to get the Dunst instance that runs in my system's setup (an out-of-the-box config from Debian Sid) to use the version I built with PREFIX=~/.local. I set this directory on my PATH via ~/.profile. I was trying to configure systemd to do this, thinking systemd is what managed the Dunst instance (as htop showed dunst owned by systemd --user). This didn't work, and in turn actually seems to have completely broken the Dunst instance.

In fact, Dunst is managed by D-Bus, which is managed by systemd. After grepping, I found that the /usr/share/dbus-1/services/org.knopwob.dunst.service file has Exec=/usr/bin/dunst, which is the magic I was looking for.

Re: max_icon_size -- understood. This has a default of 32, currently; a minimal improvement may be to default to 0. Otherwise, a ratio/scale factor seems appropriate, perhaps with a configurable (or intelligent, assuming it's warranted) basis. Or, swap the intelligence (or "primary" variable), or allow that to be a preference too. I think the experimental bit would apply straight-forwardly (read: the same way it does for font-size), if you're concerned about that.


I can't figure out what I've broken; it seems D-Bus doesn't run Dunst anymore after I compiled and installed Dunst with PREFIX=~/.local and set PATH=/home/nmschulte/.local/bin:$PATH in ~/.profile. I've since removed my ~/.config/dunst/dunstrc, thinking it might be causing Dunst to crash somehow, and subsequently the actual ~/.local/bin/dunst binary, thinking it might be crashing or interfering with the D-Bus/systemd execution somehow. But now I'm stuck, and # aptitude reinstall dunst hasn't helped. I'd appreciate help with this, but I don't want to derail the subject.
I moved ~/.local/share/dbus-1/ to ~/.local/share/dbus-1.bak/, and it seems now Dunst runs as it did once again. I wonder what is going wrong here; maybe dbus won't load the services because they have the same name or something?
Okay, I discovered the issue; the packaged D-Bus service file calls out a systemd service, dunst.service, which it did not used to. It seems I don't need this configuration, but I'm not sure why it was introduced.
dbus-daemon[1056]: Activation via systemd failed for unit 'dunst.service': Unit dunst.service not found.

@tsipinakis
Member

tsipinakis commented Apr 3, 2017

Okay, I discovered the issue; the packaged D-Bus service file calls out a systemd service, dunst.service, which it did not used to. It seems I don't need this configuration, but I'm not sure why it was introduced.

I assume you run dbus under systemd, so it's started with the --systemd-activation flag; this was added in #295, but apparently I forgot to account for cases where dbus is managed by systemd but the dunst service file is not installed.

@mathstuf

mathstuf commented Apr 3, 2017

Note that session DBus services aren't really supported by systemd anyways. See systemd/systemd#892.

@tsipinakis
Member

Right, we've hijacked this issue enough; let's move the discussion to #314.
