2016-01-22 20:58:21 |
Olivier Tilloy |
description |
Filing this bug against webbrowser-app, but it is likely affecting other applications as well.
I’m not aware of any design guidelines regarding the use of haptic feedback when pressing/tapping UI elements on a touch device. In the UITK, it seems that anything that is a button (including anything inheriting from AbstractButton) triggers a haptic response by default. List items are not buttons, so tapping them doesn’t trigger haptic feedback. However, it’s easy to build custom list items that embed an AbstractButton, thus introducing inconsistency.
An example of this inconsistent behaviour is the Settings screen in the browser app: the first two items ("search engines" and "homepage") are custom and haptics-enabled; the remaining items are not.
We need clear design guidelines. Once we have them, we need to go through every UI element of the apps and fix those that don’t comply with the guidelines. |
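A minimal sketch of how the inconsistency can arise, assuming UITK 1.3’s ListItem and AbstractButton components (component names as in Ubuntu.Components; the exact behaviour should be verified against the toolkit version in use):

```qml
import QtQuick 2.4
import Ubuntu.Components 1.3

Column {
    width: units.gu(40)

    // Plain list item: tapping it does not trigger haptic feedback.
    ListItem {
        Label { text: "standard item" }
        onClicked: console.log("tapped, no haptics")
    }

    // Custom "list item" built around an AbstractButton: tapping it
    // goes through the button's default haptic response, making it
    // feel different from the item above.
    ListItem {
        AbstractButton {
            anchors.fill: parent
            onClicked: console.log("tapped, with haptics")
            Label { text: "custom item embedding AbstractButton" }
        }
    }
}
```

Both items look like ordinary list entries, but only the second one vibrates on tap, which is exactly the inconsistency visible in the browser’s Settings screen.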