When using touch events the metaphor doesn't translate. We want to be able to pull the panel from the outside instead of pushing it from the border. Reproducible: Always
KWin handles the auto-hide behavior, so re-assigning this to KWin.
There's not much we can do here. KWin on X11 does not support touch events at all, and on Wayland we are still blocked by the Visual Design Group. I asked the VDG quite some time ago to come up with a proposal for handling global touch gestures, and they haven't produced a plan yet. See https://forum.kde.org/viewtopic.php?f=285&t=125570

As long as there is no plan for what our touch gestures are supposed to do and what they should look like, we cannot implement them. Setting to RESOLVED LATER, as there is just nothing we can do while we are blocked.
I don't think we can simply ignore the issue: it was discussed over a year ago and no conclusion was reached. In fact, reading the forum thread, it's quite clear that edge swipe is something that keeps being mentioned.
Then please nag the VDG to come up with a proposal. I cannot implement this without any instructions on how to do it. Sorry.
I'm on it.
Git commit aa6c8f81168e4f89c67b9e88065aee675e306d1a by Martin Gräßlin.
Committed on 27/03/2017 at 15:44.
Pushed by graesslin into branch 'master'.

Add support for activating screenedges through touch swipe gestures

Summary:
Each Edge creates a SwipeGesture for touch activation. The swipe needs to be a single-finger swipe starting from the edge and moving into the screen for at least 20%.

The SwipeGesture and GestureRecognizer are extended to support the use cases of the touch screen edge swipe. New features supported by the gesture system are:
* minimum and maximum position
* a minimum delta for the swipe
* progress signal based on the minimum delta
* starting a swipe with a start point

The Edge has the progress signal connected to its approach signal, so visual feedback is provided through the screen edge effect.

The screen edge system supports touch only for the edges (corners are too difficult to activate on touch screens). At the moment the following features are supported:
* screen edge show/raise of windows (e.g. auto-hidden panels)
* trigger the configured action
* trigger the configured callback function (e.g. script)

In future it might make sense to add a touch-specific configuration action to support different actions for screen edges activated by mouse and touch.

Test Plan: configured a screen edge and triggered it through touch; added an auto-hiding panel and triggered it through touch

Reviewers: #kwin, #plasma_on_wayland
Subscribers: plasma-devel
Tags: #plasma_on_wayland
Differential Revision: https://phabricator.kde.org/D5106

M  +1    -0    autotests/CMakeLists.txt
M  +234  -1    autotests/test_gestures.cpp
M  +67   -3    gestures.cpp
M  +87   -1    gestures.h
M  +41   -0    input.cpp
M  +2    -2    plugins/platforms/x11/standalone/edge.cpp
M  +2    -2    plugins/platforms/x11/standalone/edge.h
M  +67   -0    screenedge.cpp
M  +12   -7    screenedge.h

https://commits.kde.org/kwin/aa6c8f81168e4f89c67b9e88065aee675e306d1a