Touchscreen interactions should take priority over mouse and temporarily disable it

Bug #1525979 reported by Andrea Bernabei
This bug affects 2 people
Affects                         Status     Importance  Assigned to
Canonical System Image          Confirmed  Medium      Zoltan Balogh
Ubuntu UX                       Triaged    Medium      Andrea Bernabei
qtbase-opensource-src (Ubuntu)  Confirmed  Undecided   Unassigned

Bug Description

It is possible, at the moment (r199, krillin, rc-proposed), to use both touch and mouse at the same time.

Because of:
- QtQuick's touch-to-mouse events synthesis feature;
- the fact that most QML code relies on MouseArea to handle input (touch included);
- the fact that there is no QML component that handles both Touch and Mouse events and gives the developer a good API to handle both;
- the fact that making both touch and mouse usable at the same time easily leads to unexpected and broken UX;

I suggest we make it so that only one input device can be used at any given time by default (exceptional cases to be handled separately).
That is, mouse and touch can of course both be plugged in, and both can be used, just not at the *very same* time. For instance, as long as the user is dragging a surface using the touchscreen, they should not be able to click/move/interact using the mouse at the same time. Once the finger is released from the touchscreen, the mouse can be used again (and vice versa).

Moreover, I think it would be a good idea to give touch events priority over mouse events, i.e. mouse stops working when the user touches the screen, but not vice versa.
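The rule above can be sketched as a tiny arbitration state machine. This is only an illustration of the proposed behaviour, not Qt or Unity8 code; the `InputArbiter` class and the 0.2 s grace period are assumptions of mine, chosen to show the "touch preempts mouse, mouse resumes shortly after release" idea.

```python
# Hypothetical sketch (not Qt code): an input arbiter that gives touch
# priority over mouse. Class name and grace period are illustrative only.
import time

class InputArbiter:
    """Accept one device at a time; touch always preempts mouse."""

    GRACE_SECONDS = 0.2  # delay after the last touch release before mouse resumes

    def __init__(self):
        self.touch_active = False
        self.last_touch_release = 0.0

    def on_touch(self, pressed, now=None):
        now = time.monotonic() if now is None else now
        # Touch always wins: it is accepted even mid-mouse-interaction.
        self.touch_active = pressed
        if not pressed:
            self.last_touch_release = now
        return True  # touch events are never suppressed

    def on_mouse(self, now=None):
        """Return True if the mouse event should be delivered."""
        now = time.monotonic() if now is None else now
        # Suppress mouse while a finger is down, and for a short grace
        # period afterwards, so the two devices are never mixed mid-gesture.
        if self.touch_active:
            return False
        return (now - self.last_touch_release) >= self.GRACE_SECONDS
```

Note the asymmetry: `on_touch` never rejects anything, while `on_mouse` yields whenever touch is (or was just) active, matching the "touch disables mouse but not vice versa" priority proposed above.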

I also think the final decision should take into account the conventions the users are already accustomed to.
I played with a laptop that features touchscreen (Microsoft's Surface) and Win10, and here's what I found:

- in the default browser: interacting via touch stops and hides the mouse pointer, and gives priority to (multi-)touch gestures. The mouse pointer stays still (i.e. it doesn't follow the fingers). The only way I found to take control of the mouse again was to perform a single tap and then wait a short moment before moving the mouse.

- in other apps with no special touch handling, interacting via touch would still disable the mouse, but in this case the mouse pointer followed my finger (I guess this is a case of "if nothing consumes touch events, fall back to mouse simulation").

I believe this bug is a show stopper for the convergent experience.

It is currently possible to trigger flickering and broken UX in multiple places in Unity8. Basically anything that relies on MouseMove events is broken and causes flickering.

A few examples:
- username vertical scrolling in the login manager (just drag the username with your finger and then move the mouse).
- window positioning (same as above)
- indicators horizontal scrolling
- scrolling in ANY Flickable/ListView based views inside applications and platform menus
- side scrolling in the Dash
- etc, etc etc...

NOTE: after a discussion on IRC with Saviq, we agreed that it would be awesome if MouseArea could handle different input devices. I had already researched this before writing this post, and I didn't see any way MouseArea could do that with the current APIs. That means, imho, a long wait before we actually implement such a feature in Qt itself. Hence I proposed the solution above as a workaround, while we get all the rest of the pieces working as we expect.
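One way to picture the workaround described above is as an application-level event filter that drops genuine mouse events while a finger is down, but lets touch-synthesized mouse events through so existing MouseArea-based QML keeps working. The toy below only models that idea; the event tuples and the `synthesized` flag are stand-ins of mine (Qt distinguishes real from synthesized mouse events via `QMouseEvent::source()`), and none of this is real Qt API.

```python
# Toy model of the proposed workaround; not real Qt API.
# Events are (kind, synthesized) tuples, a deliberate simplification.

class EventFilter:
    def __init__(self):
        self.fingers_down = 0

    def filter(self, event):
        """Return True if the event should be dropped."""
        kind, synthesized = event
        if kind == "touch_begin":
            self.fingers_down += 1
        elif kind == "touch_end":
            self.fingers_down = max(0, self.fingers_down - 1)
        # Drop genuine mouse events while any finger is on the screen;
        # synthesized mouse events (touch-to-mouse) pass through so that
        # MouseArea-based QML keeps working under touch.
        if kind == "mouse" and not synthesized and self.fingers_down > 0:
            return True
        return False
```

The key point the sketch makes is that the filter must *not* block the synthesized mouse events, otherwise QtQuick's touch-to-mouse synthesis (which most MouseArea code depends on) would stop working too.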

============= UX UPDATE ===================
This was discussed during today's (15th Dec) team UX review meeting.

The outcome of the meeting was:
- The UX team will start a research project to handle this matter in more detail.
- We all agreed it makes sense to prevent multiple input devices from being active at the same time, i.e. mouse disables touch, touch disables mouse. This is, however, just a quick consideration made during the meeting; the details have to be worked out as part of the research project described in the previous point.

Andrea Bernabei (faenil)
summary: - Touch should take priority over mouse and disable it while touchscreen
- is being used
+ Touchscreen interactions should take priority over mouse and disable it
Andrea Bernabei (faenil)
description: updated
Changed in ubuntu-ux:
status: New → Triaged
importance: Undecided → Medium
assignee: nobody → Andrea Bernabei (faenil)
Revision history for this message
Andrea Bernabei (faenil) wrote : Re: Touchscreen interactions should take priority over mouse and disable it

This was discussed during today's (15th Dec) team UX review meeting.

The outcome of the meeting was:
- The UX team will start a research project to handle this matter in more detail.
- We all agreed it makes sense to prevent multiple input devices from being active at the same time, i.e. mouse disables touch, touch disables mouse. This is, however, just a quick consideration made during the meeting; the details have to be worked out as part of the research project described in the previous point.

description: updated
Changed in canonical-devices-system-image:
status: New → Confirmed
importance: Undecided → Medium
assignee: nobody → Zoltan Balogh (bzoltan)
milestone: none → backlog
Revision history for this message
Lorn Potter (lorn-potter) wrote :

I don't agree that it makes sense to disable one input method if another is available. Make it a user-configurable option to use one or the other, but do not disable using both.

I have a touchscreen desktop with a second screen. I want both mouse and touchscreen available. I use both at the same time. What bothers me is that the virtual keyboard pops up on the non-touch screen if that is in focus.

With a convergent phone, I think there might be more of a case for doing this. But I would still want and need to be able to use the touchscreen at times, even if I have a mouse attached.

Revision history for this message
Andrea Bernabei (faenil) wrote :

Of course, you have to be able to use both devices!

The point here is *temporarily* disabling one, i.e. disable mouse while you're dragging with the touch, for instance.

You can still use touch, then mouse, then touch again.

Do you agree?

Revision history for this message
Lorn Potter (lorn-potter) wrote :

In that case, yes I agree.

If it means I can only use one or the other then no.

Revision history for this message
Andrea Bernabei (faenil) wrote :

That would be quite a bad UX :)

Yes, I definitely meant *temporarily* disabling one input device while the other is being used, more or less like when you enable palm detection in your touchpad settings :)

summary: - Touchscreen interactions should take priority over mouse and disable it
+ Touchscreen interactions should take priority over mouse and temporarily
+ disable it
description: updated
Revision history for this message
Launchpad Janitor (janitor) wrote :

Status changed to 'Confirmed' because the bug affects multiple users.

Changed in qtbase-opensource-src (Ubuntu):
status: New → Confirmed