Activity log for bug #1525979

Date Who What changed Old value New value Message
2015-12-14 16:26:33 Andrea Bernabei bug added bug
2015-12-14 16:28:42 Andrea Bernabei summary "Touch should take priority over mouse and disable it while touchscreen is being used" → "Touchscreen interactions should take priority over mouse and disable it"
2015-12-14 16:37:40 Michał Sawicz bug task added ubuntu-ux
2015-12-14 16:54:29 Andrea Bernabei description

Old value:
It is possible, at the moment (r199, krillin, rc-proposed), to use both touch and mouse at the same time, because of:
- QtQuick's touch-to-mouse event synthesis feature;
- the fact that most QML code out there relies on MouseArea to handle input (touch as well);
- the fact that there is no QML component that handles both touch and mouse events and gives the developer a good API for both;
- the fact that making both touch and mouse usable at the same time easily leads to unexpected and broken UX.

I suggest we make it so that only one input device can be used at any given time by default (exceptional cases to be handled separately). Moreover, I think it would be a good idea to give touch events priority over mouse events, i.e. the mouse stops working when the user touches the screen, but not vice versa. I also think the final decision should take into account the conventions users are already accustomed to.

I played with a touchscreen laptop (Microsoft's Surface) running Windows 10, and here's what I found:
- In the default browser, interacting with the touchscreen stops and hides the mouse, and gives priority to (multi-)touch gestures. The mouse pointer stays still (i.e. it doesn't follow the fingers). The only way I found to take control of the mouse again was to perform a single tap and then wait a short amount of time before moving the mouse.
- In other apps that did not feature special touch handling, interacting with the touchscreen would still disable the mouse, but in this case the mouse pointer followed my finger (I guess this is an "if nothing consumes touch events, fall back to mouse simulation" behaviour).

I believe this bug is a showstopper for the convergent experience. It is currently possible to trigger flickering and broken UX in multiple places in Unity8: basically anything that relies on MouseMove events is broken and causes flickering. A few examples:
- username vertical scrolling in the login manager (just drag the username with your finger and then move the mouse);
- window positioning (same as above);
- indicators horizontal scrolling;
- scrolling in any Flickable/ListView-based view inside applications and platform menus;
- side scrolling in the Dash;
- etc.

New value:
Same as the old value, with the following note appended:

NOTE: after a discussion on IRC with Saviq, we agreed that it would be awesome if MouseArea were able to handle different input devices. I researched this before writing this post, and I didn't see any way MouseArea could do that with the current APIs. That means, imho, a long wait before such a feature is actually implemented in Qt itself. Hence I proposed the solution above as a workaround, while we get all the rest of the pieces working as we expect.
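The touch-to-mouse synthesis mentioned in the description is observable at the QEvent level: since Qt 5.3, QMouseEvent::source() reports whether a mouse event came from a physical device or was synthesized from touch. A minimal sketch of how one could verify this (the InputLogger class name is made up for this example):

    #include <QGuiApplication>
    #include <QMouseEvent>
    #include <QDebug>

    // Illustrative only: log whether each mouse event comes from a physical
    // mouse or from Qt's touch-to-mouse synthesis (QMouseEvent::source(),
    // available since Qt 5.3).
    class InputLogger : public QObject
    {
    protected:
        bool eventFilter(QObject *watched, QEvent *event) override
        {
            if (event->type() == QEvent::MouseMove
                    || event->type() == QEvent::MouseButtonPress) {
                auto *me = static_cast<QMouseEvent *>(event);
                qDebug() << "mouse event synthesized from touch:"
                         << (me->source() != Qt::MouseEventNotSynthesized);
            }
            return QObject::eventFilter(watched, event); // observe, never consume
        }
    };

Installed with qGuiApp->installEventFilter(new InputLogger) while reproducing one of the flickering cases listed above, this should show real and synthesized mouse events interleaving, which is the conflict the report describes.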
2015-12-14 16:55:15 Michał Sawicz bug task added qtbase-opensource-src (Ubuntu)
2015-12-15 09:11:51 Magdalena Mirowicz ubuntu-ux: status New → Triaged
2015-12-15 09:11:54 Magdalena Mirowicz ubuntu-ux: importance Undecided → Medium
2015-12-15 09:12:01 Magdalena Mirowicz ubuntu-ux: assignee Andrea Bernabei (faenil)
2015-12-15 11:21:38 Andrea Bernabei description

Old value:
The description above (16:54:29), including the NOTE.

New value:
Same description, with the following update appended:

============= UX UPDATE ===================
This was discussed during today's (15th Dec) team UX review meeting. The outcome of the meeting was:
- The UX team will start a research project to handle this matter in more detail.
- We all agreed it makes sense to prevent multiple input devices from being active at the same time, i.e. mouse disables touch, touch disables mouse. This is, however, just a quick consideration made during the meeting; the details have to be considered as part of the research project described in the previous point.
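As a rough illustration of the direction agreed above (touch temporarily disables mouse), the mutual exclusion could be prototyped with an application-wide event filter before any proper support lands in Qt itself. This is a hypothetical sketch, not an actual fix attached to this bug; the TouchPriorityFilter name is invented:

    #include <QGuiApplication>
    #include <QMouseEvent>
    #include <QTouchEvent>

    // Hypothetical workaround sketch: while a touch sequence is in progress,
    // drop events coming from a physical mouse so the two devices cannot
    // fight over the same MouseArea. Mouse events synthesized from touch are
    // let through, since QtQuick needs them to drive MouseArea.
    class TouchPriorityFilter : public QObject
    {
        bool m_touchActive = false;
    protected:
        bool eventFilter(QObject *watched, QEvent *event) override
        {
            switch (event->type()) {
            case QEvent::TouchBegin:
                m_touchActive = true;
                break;
            case QEvent::TouchEnd:
            case QEvent::TouchCancel:
                m_touchActive = false;
                break;
            case QEvent::MouseButtonPress:
            case QEvent::MouseButtonRelease:
            case QEvent::MouseMove: {
                auto *me = static_cast<QMouseEvent *>(event);
                if (m_touchActive
                        && me->source() == Qt::MouseEventNotSynthesized)
                    return true; // swallow physical mouse input while touching
                break;
            }
            default:
                break;
            }
            return QObject::eventFilter(watched, event);
        }
    };

    int main(int argc, char *argv[])
    {
        QGuiApplication app(argc, argv);
        app.installEventFilter(new TouchPriorityFilter);
        // ... set up the QML scene as usual ...
        return app.exec();
    }

The reverse direction (mouse disables touch) would need the symmetric check, plus something like the timeout behaviour observed on Windows 10 before handing control back; those details are exactly what the research project is meant to settle.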
2015-12-15 13:05:39 Jean-Baptiste Lallement canonical-devices-system-image: status New → Confirmed
2015-12-15 13:05:41 Jean-Baptiste Lallement canonical-devices-system-image: importance Undecided → Medium
2015-12-15 13:05:48 Jean-Baptiste Lallement canonical-devices-system-image: assignee Zoltan Balogh (bzoltan)
2015-12-15 13:06:19 Jean-Baptiste Lallement canonical-devices-system-image: milestone backlog
2015-12-15 22:00:13 Andrea Bernabei summary "Touchscreen interactions should take priority over mouse and disable it" → "Touchscreen interactions should take priority over mouse and temporarily disable it"
2015-12-15 22:03:55 Andrea Bernabei description

Old value:
The description above (11:21:38), including the NOTE and the UX UPDATE.

New value:
Same description, with the following clarification inserted after the suggestion that only one input device be usable at any given time:

That is, mouse and touch can of course both be plugged in, and both can be used, just not at the *very same* time. For instance, as long as the user is dragging a surface using the touchscreen, they should not be able to click/move/interact using the mouse at the same time. After the finger is released from the touchscreen, the mouse can be used again (and vice versa).
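The "one device at a time, released on finger-up" rule added here can also be expressed per item rather than globally. A hypothetical sketch (ExclusiveInputItem is an invented name, and handling of mouse events synthesized from touch is omitted for brevity):

    #include <QQuickItem>
    #include <QMouseEvent>
    #include <QTouchEvent>

    // Illustrative item-level version of the rule: whichever device starts
    // an interaction owns it until release; input from the other device is
    // ignored in the meantime.
    class ExclusiveInputItem : public QQuickItem
    {
        Q_OBJECT
        enum Owner { None, Mouse, Touch };
        Owner m_owner = None;
    public:
        explicit ExclusiveInputItem(QQuickItem *parent = nullptr)
            : QQuickItem(parent)
        {
            setAcceptedMouseButtons(Qt::AllButtons);
        }
    protected:
        void mousePressEvent(QMouseEvent *e) override
        {
            if (m_owner == Touch) { e->ignore(); return; } // a finger is down
            m_owner = Mouse;
            e->accept();
        }
        void mouseReleaseEvent(QMouseEvent *e) override
        {
            if (m_owner == Mouse)
                m_owner = None; // mouse gives up ownership on release
            e->accept();
        }
        void touchEvent(QTouchEvent *e) override
        {
            if (m_owner == Mouse) { e->ignore(); return; } // mouse owns it
            m_owner = (e->type() == QEvent::TouchEnd
                       || e->type() == QEvent::TouchCancel) ? None : Touch;
            e->accept();
        }
    };

Whether this logic belongs in individual items, in MouseArea itself, or below QtQuick entirely is the open question the NOTE above raises.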
2015-12-16 14:39:57 Launchpad Janitor qtbase-opensource-src (Ubuntu): status New → Confirmed