SUMMARY
A touchscreen can be configured to perform certain actions when swiping in from a screen edge, e.g. opening the task switcher. This is already useful, but in my opinion it could benefit a lot from a few additions:

1) Being able to trigger the gestures while in fullscreen
Currently the gestures don't work in fullscreen, which is at the same time one of the most useful situations for them, since the panel isn't available to switch between apps, open the launcher, etc.

2) Triggering keyboard shortcuts
This could be used for different cases, but one major use case for me would be a fullscreen shortcut. Some apps (e.g. Okular) are impossible to leave fullscreen in without a keyboard, which is quite annoying. This would need point 1 as well. It could also be solved by adding a "toggle fullscreen" action instead, but keyboard shortcuts would be more flexible.

3) Running a custom command
This can also be used in many different ways. Personally I would use it to launch a touchscreen-focused menu / launcher; a rough workaround sketch is included below. Keyboard shortcuts would also be enough for this specific use case.

I know that improved (non-edge) touchscreen gestures are being worked on; depending on how these can be configured, the use cases mentioned might already be covered.
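For what it's worth, something close to points 2 and 3 can probably be approximated today with a small KWin script that reserves screen edges. The sketch below is only an illustration, not a proposed implementation, and it rests on a few assumptions: that registerScreenEdge() and callDBus() behave as in the KWin scripting API I'm familiar with (a touch-specific registerTouchScreenEdge() may or may not be available depending on version), that the numeric ElectricBorder values are right, that fullScreen is writable on the active client (named workspace.activeWindow instead of workspace.activeClient on newer KWin), and that edge activation is an acceptable stand-in for a touch swipe. The KRunner D-Bus call is just one example of "run something external".

// Hypothetical KWin script sketch (untested): map screen edges to a
// "toggle fullscreen" action and to opening KRunner as a launcher.

// Assumption: these numbers match ElectricRight / ElectricLeft in KWin's
// ElectricBorder enum; check the enum for your KWin version first.
var ELECTRIC_RIGHT = 2;
var ELECTRIC_LEFT = 6;

// Roughly point 2: activating the right edge toggles fullscreen on the
// active window (assumes the fullScreen property is writable from scripts).
registerScreenEdge(ELECTRIC_RIGHT, function () {
    var client = workspace.activeClient;
    if (client) {
        client.fullScreen = !client.fullScreen;
    }
});

// Roughly point 3: activating the left edge opens KRunner via D-Bus,
// standing in for "run a custom command / launcher".
registerScreenEdge(ELECTRIC_LEFT, function () {
    callDBus("org.kde.krunner", "/App", "org.kde.krunner.App", "display");
});

Something like this would of course only be a workaround; proper touch edge configuration in the settings UI would still be the real fix.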
This sounds like it would be a usability improvement for sure