Bug 439920 - DragArea requires press-and-hold with a touchscreen, which is unintuitive and undiscoverable
Status: CONFIRMED
Alias: None
Product: frameworks-kdeclarative
Classification: Frameworks and Libraries
Component: general
Version: 5.84.0
Platform: Other Linux
Importance: NOR normal
Target Milestone: ---
Assignee: Marco Martin
Keywords: usability, wayland-only
 
Reported: 2021-07-15 22:45 UTC by Thiago Sueto
Modified: 2022-02-17 12:53 UTC
5 users

Description Thiago Sueto 2021-07-15 22:45:28 UTC
It is possible to add widgets to the desktop when using touch by double tapping a widget, which adds the widget to the upper left corner; this behavior is the same with mouse/touchpad and double clicking.

Unlike the mouse, it's not possible to drag a widget from the widget explorer to the desktop with touch.

The only way to add widgets to the desktop with a touchscreen is to double tap, enter edit mode and move it around to where you want it.
Comment 1 Nate Graham 2021-08-03 18:33:16 UTC
Can confirm.
Comment 2 Nate Graham 2021-08-05 22:42:00 UTC
The problem appears to be that DragAndDrop.DragArea ignores touches.
Comment 3 Nate Graham 2021-08-05 23:19:54 UTC
While reading the code to figure out how to handle touch events, I discovered that they are already handled! It's just that a press-and-hold is required; a regular old drag doesn't work. This never would have occurred to me.

I imagine the reason for this is logical enough: it distinguishes between potentially ambiguous touch actions, since a drag can look like a swipe, which you would want to forward to a scroll view if the view is scrollable. Nonetheless, it makes for pretty poor UX: neither Thiago nor I even thought to try it.

I can imagine a few ways we could fix the situation:

1. Add opt-in code that interprets a horizontal touch drag as a drag and a vertical one as a scroll swipe. This would work for the Widget Explorer because it's a sidebar, so most of the time you'd be dragging a widget horizontally.

2. Add smart code to DragArea to try to disambiguate a swipe from a drag by looking at velocity, vector, and so on.

3. Add a new on-demand overlay to the Widget Explorer that shows additional buttons or explanations when an item is tapped or dragged without press-and-hold, to help the user figure out what to do.

#3 I could probably do, but 1 and 2 are beyond my current abilities and are likely better solutions. Since those have to be implemented here, I'm leaving the bug here for now.
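For illustration, the axis-dominance idea in option 1 could be sketched roughly like this. This is a minimal, hypothetical C++ sketch of the heuristic only; the function name, the threshold value, and the enum are my own assumptions, not part of the actual DragArea implementation:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical classification of a touch movement as either a
// horizontal drag (start a widget drag) or a vertical swipe
// (forward to the enclosing scroll view). The 10px threshold is
// an illustrative guess, not a value from kdeclarative.
enum class TouchGesture { Undecided, HorizontalDrag, VerticalSwipe };

TouchGesture classifyTouchMove(double dx, double dy,
                               double startThreshold = 10.0)
{
    const double adx = std::abs(dx);
    const double ady = std::abs(dy);

    // Wait until the finger has moved far enough in some direction,
    // so a jittery tap is not misread as either gesture.
    if (adx < startThreshold && ady < startThreshold) {
        return TouchGesture::Undecided;
    }

    // Dominant axis wins: mostly-horizontal motion starts a drag,
    // mostly-vertical motion is left to the scroll view.
    return adx >= ady ? TouchGesture::HorizontalDrag
                      : TouchGesture::VerticalSwipe;
}
```

Option 2 would extend the same decision point with velocity and curvature of the touch path instead of a single axis comparison.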
Comment 4 Thiago Sueto 2021-08-11 05:44:51 UTC
(In reply to Nate Graham from comment #3)
> While reading the code to figure out how to handle touch events, I
> discovered that they are already handled! It's just that a press-and-hold is
> required; a regular old drag doesn't work. This never would have occurred to
> me.

I actually did try this, but the press-and-hold behavior was broken for me. I agree it wasn't the first thing I thought of, but it didn't seem far-fetched either, given that Edit Mode requires holding as well. It kind of makes sense, actually; moving apps on Android is done the same way.

> #3 I could probably do, but 1 and 2 are beyond my current abilities, and are
> likely better solutions. Since those have to be implemented here, I'm
> leaving the bug here for now.

Yeah, option 1 seems ideal to me. The widget window currently has no use for horizontal dragging.
Comment 5 Nate Graham 2021-08-11 14:41:08 UTC
Huh, touch-and-hold-and-drag works fine for me on both Wayland and X11. I'm starting to suspect that there is something generally wrong with your touchscreen drivers or something. :)
Comment 6 Thiago Sueto 2021-08-11 16:42:09 UTC
(In reply to Nate Graham from comment #5)
> Huh, touch-and-hold-and-drag works fine for me on both Wayland and X11. I'm
> starting to suspect that there is something generally wrong with your
> touchscreen drivers or something. :)

It doesn't seem to be the case: once I tap and hold for what seems to be half a second, the widget drawer darkens accordingly, and when I drag my finger outward and release, it goes back to its original state. So the actual gestures seem to be recognized properly; it's just not doing what it's supposed to do.

Is there some way to test for such issues? I don't think that's it, but my current distro (openSUSE Krypton) is the only one where I experience a lack of screen rotation with iio-sensor-proxy[1], so perhaps something is happening there and I should test this hardware with another distro.

Although I remember this happening on Arch, Fedora and Kubuntu Impish.

[1] https://bugzilla.opensuse.org/show_bug.cgi?id=1188224

In any case, this seems like something that belongs in a different bug report or still needs confirmation from other reporters.
Comment 7 Zamundaaa 2021-08-29 15:53:32 UTC
Can confirm (on Manjaro). I remember this working in the past but maybe that was on X.