SUMMARY
kwin uses a size-based heuristic to map touchscreens (absolute pointing devices) to their physical screen. This heuristic fails when the size of the screen, as reported by xrandr or similar, doesn't match the size of the pointing device, as reported by libinput. When the heuristic fails, the pointing device is mapped to the first screen, regardless of the presence of a different touch input device on the first screen, and with no way to change the mapping.

STEPS TO REPRODUCE
1. To the first slot of a GPU, attach a normal monitor.
2. To the second slot of the GPU, attach a touchscreen monitor which reports inconsistent sizes between the screen and the touch device (the Asus VT229H is one such device; a Wacom Bamboo would be another).
3. Note that the input is mapped to the wrong (first) monitor.
4. Swap the monitor slots.
5. Note that the input is mapped to the correct (first) monitor.

OBSERVED RESULT
The touch input is always mapped to the first monitor, unless the first monitor is hot-plugged, at which point it remains mapped to the correct monitor until kwin restarts. Virtual display positioning does not matter.

EXPECTED RESULT
Some way to override the (failed) heuristic to manually map an absolute positioning input device to an arbitrary section of the virtual display.

SOFTWARE/OS VERSIONS
Linux: gentoo 17.0/desktop/plasma
Kernel: 5.2.1
KDE Plasma Version: 5.16.4
KDE Frameworks Version: 5.16.4
Qt Version: 5.12

ADDITIONAL INFORMATION
Also tried it on an old touchscreen laptop; both the integrated screen and the Asus map to the integrated screen. For laptop users, the workaround of changing screen order is not possible. A way to manually specify the mapping would also support non-screen absolute positioning devices, like the Wacom Bamboo, where there is no clear mapping between the input device and a screen.
Are you on X11 or Wayland?
(In reply to Vlad Zahorodnii from comment #1) > Are you on X11 or Wayland? Wayland.
So I finally had a bit more time to dig into this. With a bit of grepping, I managed to find the relevant function: $kwin/libinput/connection.cpp:applyScreenToDevice

Looks like there is no simple way to remap APDs to arbitrary virtual screen offsets. That said, adding additional heuristics for the mapping is pretty simple (for one, if there is only a single display attached, grab it with reasonable confidence; also, if there is already an APD attached, grab the next screen in line). It also should be easy to add a config file to override the auto-detection. Given the information available in the function, it looks like mapping from touch device to EDID is going to be the most reliable. I'm not certain how to identify the touch device, as USB serial numbers are unreliable, and bus location is worse. I'll poke at that a bit more and see what shakes out.

All that said, I'm happy to attach a patch once I get it working, if I can get some help making sure it won't just waste everyone's time. To that end, I have a few questions.
1. Is it better to add a new section to one of the existing $HOME/.config/k*rc files, or to make a new ktouchrc (or some other name) file?
2. Do I put the config reading/writing inline in the `applyScreenToDevice` function, or in its own function (and where do I put that)?
3. Is there a better choice than EDID to remember the display, and is there an obvious choice for the input device?
Can you confirm whether you have the same problem with kwin_x11? We need to know if it's a Wayland-specific problem or a generic kwin one. Also, are 5.19, 5.20, and 5.21 affected? Thank you for your report; this sounds like an important problem for people with touch screens and dual devices. A dev interested in fixing this bug should be offered the necessary hardware.
Yes, it's still a problem in 5.19 and 5.20. I assume in 5.21 as well (5.21 is not available in Gentoo yet), but I doubt this has been spontaneously fixed after a year and 3 months.

It is Wayland-specific. Under X11, touchscreen input is translated by the X server into generic mouse input, so kwin doesn't interface with libinput directly. Under Wayland, kwin "talks" to libinput, using libinput/connection.cpp. Line 614 is where the touchscreen is assigned to the first screen if it can't guess which screen to use. Since I'm on Gentoo, I just tossed a patch into my user patches to hardcode it to the right screen.

As I wrote in my third message (2019-09-20), I'm happy to turn my patch into a proper fix, since I have hardware which needs it and the programming skill to write it, but I want to know there's a reasonable chance of getting the fix included before I go to the work.
Dear Bug Submitter, This bug has been in NEEDSINFO status with no change for at least 15 days. Please provide the requested information as soon as possible and set the bug status as REPORTED. Due to regular bug tracker maintenance, if the bug is still in NEEDSINFO status with no change in 30 days the bug will be closed as RESOLVED > WORKSFORME due to lack of needed information. For more information about our bug triaging procedures please read the wiki located here: https://community.kde.org/Guidelines_and_HOWTOs/Bug_triaging If you have already provided the requested information, please mark the bug as REPORTED so that the KDE team knows that the bug is ready to be confirmed. Thank you for helping us make KDE software even better for everyone!
*** This bug has been marked as a duplicate of bug 398977 ***
How on Earth do you figure this is a duplicate of a crash bug? How do you figure a bug in the Wayland compositor is a duplicate of a crash under X11? I'm going to assume it was an accident that this got marked duplicate, it's certainly not resolved by fixing a kwin-x11 bug. Heck, this bug isn't even in the same *file* as any of those touched. Please explain if I'm missing something...
Just for clarification: with X11 you can use xinput to force which display the touchscreen is bound to. Unfortunately, with Wayland it sounds like it is the compositor's responsibility to track/maintain this mapping.
(In reply to logan from comment #5)
> As I wrote in my third message (2019-09-20), I'm happy to turn my patch into
> a proper fix, since I have hardware which needs it and the programming skill
> to write it, but I want to know there's a reasonable chance of getting the
> fix included before I go to the work.

It's been a while, but if you're still up for it, feel free to do it! No one will object to improving the used heuristics or making it possible to set the relevant display in the settings. If you need help, do ask for it, preferably in #kwin:kde.org on Matrix.
FWIW, I have a very similar issue with a SMART Podium 500 device, which is an external screen with a stylus. I can also get it recognized as an absolute pointer device, but then KWin maps the pointer movement to the entire virtual desktop (which comprises the external screen on the left, and my built-in laptop screen on the right). This happens with Plasma 5.24.4 running wayland. The libinput people have been very helpful debugging this. There is lots of technical information at https://gitlab.freedesktop.org/libinput/libinput/-/issues/769 Do you think this is the same issue, or should I open a separate report?
(In reply to Oliver Sander from comment #11)
> FWIW, I have a very similar issue with a SMART Podium 500 device, which is
> an external screen with a stylus. I can also get it recognized as an
> absolute pointer device, but then KWin maps the pointer movement to the
> entire virtual desktop (which comprises the external screen on the left, and
> my built-in laptop screen on the right).
>
> This happens with Plasma 5.24.4 running wayland. The libinput people have
> been very helpful debugging this. There is lots of technical information at
> https://gitlab.freedesktop.org/libinput/libinput/-/issues/769
>
> Do you think this is the same issue, or should I open a separate report?

That is ... interesting. If I'm reading https://invent.kde.org/plasma/kwin/-/blob/master/src/backends/libinput/connection.cpp#L547 correctly, it should always pick one output to apply (and sometimes the wrong one).

Is the input split across both screens, Oliver, or is it exclusively on the wrong one? If it's the former, this may hint at a somewhat unrelated issue.
Gnome seems to have a similar heuristic (which may or may not produce better results), but also has a config entry for manually mapping touchscreens to outputs: https://github.com/GNOME/gsettings-desktop-schemas/blob/master/schemas/org.gnome.desktop.peripherals.gschema.xml.in#L211 As far as I can tell there is no graphical way to apply that config though
I just checked again. The input really is split across the two screens.
I can reproduce this bug on 5.24.5 (Fedora 36 Kernel 5.18.6). In my case, I have an external touchscreen connected to a laptop, both screens are 1920x1080. Laptop: Thinkpad T470s (Mesa Intel HD 620) Touchscreen HID device: ILI Multi-Touch-V3000 (USB 222A:004D) on Inknoe Lite Touchscreen display device: reports as unknown in info center The touchscreen maps to the laptop internal display when it should map to the external display. An added complication that can mess up the heuristic is that the display output connects through HDMI while the touch device connects through a separate USB port (this is a rather old device) so going by EDID may not always work. The easiest way to cover all use cases is probably to have an option for a manual override in system settings to map a touchscreen device to a specific display output (something like the target display setting in the drawing tablets page).
(In reply to Abel Yang from comment #15)
> I can reproduce this bug on 5.24.5 (Fedora 36 Kernel 5.18.6). In my case, I
> have an external touchscreen connected to a laptop, both screens are
> 1920x1080.
>
> Laptop: Thinkpad T470s (Mesa Intel HD 620)
> Touchscreen HID device: ILI Multi-Touch-V3000 (USB 222A:004D) on Inknoe Lite
> Touchscreen display device: reports as unknown in info center
>
> The touchscreen maps to the laptop internal display when it should map to
> the external display. An added complication that can mess up the heuristic
> is that the display output connects through HDMI while the touch device
> connects through a separate USB port (this is a rather old device) so going
> by EDID may not always work.
>
> The easiest way to cover all use cases is probably to have an option for a
> manual override in system settings to map a touchscreen device to a specific
> display output (something like the target display setting in the drawing
> tablets page).

Worth noting that the resolution of the screens does not matter. It is the screens' reported physical sizes, compared to the touch device's reported size. The issue is that your external display does not report the same physical size for its touch and display components. If they are off by *any* amount, it does not pick the closest fit; it just falls back to the first display.

As a first step, if it hits the "don't know, just guess" part of the code, it could guess the screen with the closest matching size, but that does not handle the "two identical displays with two identical APDs" issue. A fingerprint of the touchscreen, along with remembering its bus address (USB or otherwise), should be able to handle remembering which device goes where across restarts. Use the fingerprint, ignoring the bus address, unless there are duplicate devices in play. But a graphical way to reattach them would make the (re)configuration much easier.
In March I went to the recommended matrix channel, and got a bit of a runaround about how best to implement this. There was some suggestion of making it use an extension system rather than a core part of kwin. There was some suggestion of making it use custom udev rules, so that anyone without write access to /etc/udev/rules.d is screwed (not to mention requiring a replug (impossible on integrated i2c screens) or restart of kwin). I'll try again next time I'm between paying work.
To me, the UI-based and guesswork-free solution suggested by Abel Yang ("something like the target display setting in the drawing tablets page") is precisely what I would have expected. (I, too, am affected by this bug, see: https://forum.kde.org/viewtopic.php?f=289&t=175856&p=457237#p457237 )
(In reply to Oliver Sander from comment #14) > I just checked again. The input really is split across the two screens. I checked again yesterday, with an updated stack from Debian testing. This time, the input was *not* split across the two screens. Rather, the input is now exclusively on the wrong screen, as for everybody else in this thread. [shrugs] Operating System: Debian GNU/Linux KDE Plasma Version: 5.25.5 KDE Frameworks Version: 5.98.0 Qt Version: 5.15.4 Kernel Version: 5.19.0-2-amd64 (64-bit) Graphics Platform: Wayland
A possibly relevant merge request was started @ https://invent.kde.org/plasma/kwin/-/merge_requests/3032
Looking at the possible merge request, it seems to make two assumptions: that you only have two possible screens, and that the screen isn't reporting any physical dimensions. That is likely to fix one use case, but not ones where there are more than two screens, or where the display reports dimensions different from those reported by the touchscreen - examples of both are listed in the comments on this bug.
That is correct. I cannot think of any use case that is made *worse* by this change, but it does not solve the underlying problem. I'm not sure if the additional code complexity is worth catching one, likely rare, case. We could write a dozen similar-sized patches and still not actually solve the issue (I do not think any heuristic can exist to handle the 2-identical-screens issue).
FWIW: I was reading through the notes on the merge request when I noticed a comment saying that the ability to set it was already in there; it just wasn't exposed via a UI. I was able to find the input device via DBUS and changed the outputDevice to my touchscreen (I have DVI-D-1, DP-1, DP-2, with the touchscreen being DP-2, but it selects DVI-D-1 as it is the primary screen and the touchscreen dimensions don't match) and now I can control the touchscreen by touching it.
(In reply to David Sutton from comment #22)
> FWIW: I was reading through the notes on the merge request when I noticed a
> comment saying that the ability to set it was already in there, it just
> wasn't exposed via a UI. I was able to find the input device via DBUS and
> changed the outputDevice to my touchscreen ( I have DVI-D-1, DP-1, DP-2,
> with touchscreen being DP-2 but it selects DVI-D-1 as is the primary screen
> and touchscreen dimensions don't match ) and now I can now control the
> touchscreen by touching it.

That should make the problem solvable in the short term; a script that reads the output of `lsusb -vvv` to figure out which display should be used would be pretty easy to put together. Do you mind sharing the dbus command you used? It would save digging through the mess that is dbus debugging when writing the script. Obviously, a KDE-native way (it looks like they are discussing creating a KCM for it) would be the proper long-term solution.
So the first thing I had to do was find out which event was related to the touch screen - in my case, event4 was the one I needed:

qdbus org.kde.KWin /org/kde/KWin/InputDevice/event4 org.freedesktop.DBus.Properties.Get org.kde.KWin.InputDevice touch

returns "true".

I went ahead and used kinfocenter (Graphics -> Wayland) to confirm which monitor I needed to assign it to - it gives the mapping from the monitor name to the device name. I then set event4 to the right monitor (outputName):

qdbus org.kde.KWin /org/kde/KWin/InputDevice/event4 org.freedesktop.DBus.Properties.Set org.kde.KWin.InputDevice outputName DP-2
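For anyone who wants to apply this to all touch devices at once, the two calls above can be wrapped in a small script. This is only a sketch, not a tested tool: it assumes KWin exposes its devices at /org/kde/KWin/InputDevice/eventN with the org.kde.KWin.InputDevice interface exactly as shown above, and the script name is made up.

```shell
#!/bin/sh
# Sketch: point every touch-capable KWin input device at a chosen output.
# Usage: map-touch.sh DP-2
# Assumes the D-Bus interface shown in this comment; untested beyond that.

# Pure text filter: keep only InputDevice event object paths from a
# qdbus object listing. Split out so it can be checked without KWin.
filter_input_devices() {
    grep '^/org/kde/KWin/InputDevice/event'
}

# For each device KWin knows about, set outputName if it reports touch=true.
map_touch_devices() {
    output="$1"
    qdbus org.kde.KWin | filter_input_devices | while read -r dev; do
        is_touch=$(qdbus org.kde.KWin "$dev" \
            org.freedesktop.DBus.Properties.Get org.kde.KWin.InputDevice touch)
        if [ "$is_touch" = "true" ]; then
            echo "mapping $dev -> $output"
            qdbus org.kde.KWin "$dev" \
                org.freedesktop.DBus.Properties.Set org.kde.KWin.InputDevice \
                outputName "$output"
        fi
    done
}

# Only act when an output name was actually passed on the command line.
if [ -n "$1" ]; then
    map_touch_devices "$1"
fi
```

Run as e.g. `./map-touch.sh DP-2` from inside the Wayland session (qdbus needs the session bus).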
That appears to work; I will verify next time it connects itself to the wrong screen. If you get it connected to the correct screen, you can get the current value via `qdbus org.kde.KWin /org/kde/KWin/InputDevice/eventX org.freedesktop.DBus.Properties.Get org.kde.KWin.InputDevice outputName`, then, when it gets connected wrong, use `...Properties.Set` to restore it to that value. This probably needs to be paired with a udev rule to autodetect the input event, given that event numbers change when devices get hotplugged.
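As a sketch of what that udev pairing might look like (hedged: the vendor/product IDs below are just the 222A:004D values Abel Yang posted, used as a placeholder; the helper script path is hypothetical; and note that RUN+= executes outside the user session, so the helper would need access to the user's session bus - a systemd user service triggered on the device may end up more practical):

```
# /etc/udev/rules.d/99-touchscreen-map.rules (sketch only)
# Replace the IDs with whatever `udevadm info` reports for your device.
ACTION=="add", SUBSYSTEM=="input", ENV{ID_VENDOR_ID}=="222a", ENV{ID_MODEL_ID}=="004d", RUN+="/usr/local/bin/map-touchscreen.sh"
```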
A possibly relevant merge request was started @ https://invent.kde.org/plasma/plasma-desktop/-/merge_requests/1227
*** Bug 443490 has been marked as a duplicate of this bug. ***
Git commit c32f3be935913f76b9c46bd93ef6af086ba4be76 by Nicolas Fella.
Committed on 22/11/2022 at 11:38.
Pushed by nicolasfella into branch 'master'.

Add touchscreen KCM

There are various properties of a touchscreen that the user may want/need to configure.

First the user may want to disable the touchscreen altogether.

Then there are some cases where the touchscreen isn't automatically assigned to the correct display and a manual override is needed.

This new KCM provides a way to do both.

It is Wayland-only.
Related: bug 455619

M  +1    -0    kcms/CMakeLists.txt
A  +44   -0    kcms/touchscreen/CMakeLists.txt
A  +2    -0    kcms/touchscreen/Messages.sh
A  +185  -0    kcms/touchscreen/devicesmodel.cpp  [License: GPL(v2.0+)]
A  +51   -0    kcms/touchscreen/devicesmodel.h  [License: GPL(v2.0+)]
A  +103  -0    kcms/touchscreen/inputdevice.cpp  [License: GPL(v2.0+)]
A  +167  -0    kcms/touchscreen/inputdevice.h  [License: GPL(v2.0+)]
A  +17   -0    kcms/touchscreen/kcm_touchscreen.json
A  +125  -0    kcms/touchscreen/kcmtouchscreen.cpp  [License: LGPL(v2.0+)]
A  +40   -0    kcms/touchscreen/kcmtouchscreen.h  [License: LGPL(v2.0+)]
A  +83   -0    kcms/touchscreen/package/contents/ui/main.qml  [License: GPL(v2.0+)]

https://invent.kde.org/plasma/plasma-desktop/commit/c32f3be935913f76b9c46bd93ef6af086ba4be76