Comment 80 for bug 882274

Paul Sladen (sladen) wrote:

Alex: It's probably useful if you spell out what you've discovered that is broken about Unity and needs looking into. (I'm sorry, I'm not a mind reader; even if you feel that both you and Mark know the item you have in mind, I and others following along likely don't.)

A lot of the time, what Mark is doing is breaking deadlocks (aka "making design decisions") in circumstances where a data-driven solution has not been forthcoming.

Dwayne: yes, separate interaction models are what, e.g., Unity and the rest of the interface already provide. You have different interaction models for different situations. So in Unity you move the mouse leftwards to open the Launcher, but with multi-touch you use a gesture that is a swipe *rightwards*. Similarly, in a PDF reader like Evince you click little buttons for Zoom/Rotate with a pointer, but with multi-touch you just interact with the content area directly. This is why the overlay scrollbars are there: they are a user-interface element that /can/ be used with a pointer, but which only appears in a passive (feedback) mode when you use multi-touch to interact directly with the canvas/content, and it occupies the least amount of real estate in doing so.