To elaborate... what you see on a "traditional" stylus-based Wacom tablet is probably what Qemu is implementing. That means:
* Stylus is near (within a few centimetres) but not touching the screen: Mouse cursor visible and follows the pen (but ideally shows something different to an arrow, like a pencil cursor).
* Stylus touches the screen: That's a touch or primary button-down event, with a button-up event on release.
* Stylus moves away from the screen: No cursor is visible.
I think that's what Qemu is trying to emulate, and it's probably a good model for controlling a virtual touch screen with a mouse too. So we should fix libinput and/or Mir to work in this environment; evidently it did until recently.
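The model above can be sketched as a small state machine. This is purely illustrative pseudocode of the behaviour described, not the actual libinput or Qemu implementation; the function and event names are made up for the sketch.

```python
def stylus_events(in_proximity, tip_down, state):
    """Translate one stylus report into cursor/button events.

    in_proximity: pen is within hover range of the screen
    tip_down:     pen tip is touching the screen
    state:        (was_in_proximity, was_tip_down) from the last report
    Returns (events, new_state).
    """
    was_prox, was_tip = state
    events = []
    if in_proximity and not was_prox:
        events.append("cursor-show")   # pen entered hover range: show cursor
    if tip_down and not was_tip:
        events.append("button-down")   # pen touched the screen
    if not tip_down and was_tip:
        events.append("button-up")     # pen lifted off the screen
    if not in_proximity and was_prox:
        events.append("cursor-hide")   # pen left range: no cursor visible
    return events, (in_proximity, tip_down)
```

Walking a pen through hover, touch, release, and leave produces cursor-show, button-down, button-up, cursor-hide in that order, which matches the three bullet points.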