Works with stylus but not finger: HP Elitebook 2740P Tablet PC

Bug #1456452 reported by whitis
This bug affects 2 people
Affects Status Importance Assigned to Milestone
onboard (Ubuntu)

Bug Description

As I explain below, this bug appears to be a manifestation of Onboard using an incorrect algorithm for detecting button presses and releases, one that only accidentally works on legacy input devices and fails completely on a touchscreen.

HP EliteBook 2740p is a convertible Tablet PC with a screen that can rotate from Laptop mode into tablet mode. In tablet mode, the screen covers the keyboard and an on screen keyboard is a necessity. This machine has 4+ input devices:
   - multitouch touch screen
   - stylus on screen
   - pointing stick
   - trackpad
   - also, a 3D accelerometer, which could theoretically be used to move the cursor.
These are without plugging in an external 3D space navigator, mouse, joystick, gamepad, eyemouse, etc.

I can type with:
   - stylus on the touch screen
   - touchpad (cumbersome and not at all useful in tablet mode)
   - pointing stick (AKA eraser head / TrackPoint(tm) / etc.) (cumbersome and not at all useful in tablet mode)
   - (physical keyboard)
I cannot type using:
  - finger on the multitouch screen

This is not a calibration issue. This is not a rotation issue; even though I have been playing with rotation, the screen and input devices are currently unrotated and I can type in xvkbd or florence on screen keyboards with stylus or finger.

Onboard completely fails with finger input.

 xsetwacom --list
Serial Wacom Tablet WACf00e stylus id: 13 type: STYLUS
Serial Wacom Tablet WACf00e eraser id: 15 type: ERASER
Serial Wacom Tablet WACf00e touch id: 16 type: TOUCH # finger
# Not shown: synaptics touchpad

The following error message appears frequently in the terminal tab from which onboard was started (enough to render the tab unusable):
(onboard:5529): Gdk-CRITICAL **: gdk_device_get_axis_use: assertion 'index_ < device->axes->len' failed
This error appears whether using stylus or finger (but not touchpad or stick) and occurs more than once per key press as you hover.

Note the following potentially relevant behavioral difference between these four pointing devices. With the three that work, the cursor tracks movement even when not clicking/dragging. The one that doesn't work warps the pointer directly to the touched point, generates events there, and hides the cursor, since the cursor cannot track a raised finger.

Default onboard preferences were used.

xev shows ButtonPress, MotionNotify, and ButtonRelease events when touching with a finger, and similar events with the stylus.
Differences: the stylus sends MotionNotify updates while hovering near the screen, the finger does not.
The finger sends MotionNotify updates while held still against the screen, the stylus does not.

Here are ButtonRelease events captured with both finger and stylus:
ButtonRelease event, serial 37, synthetic NO, window 0x5600001,
    root 0x80, subw 0x0, time 9012106, (110,108), root:(175,676),
    state 0x100, button 1, same_screen YES

ButtonRelease event, serial 37, synthetic NO, window 0x5600001,
    root 0x80, subw 0x0, time 9065337, (96,85), root:(161,653),
    state 0x100, button 1, same_screen YES

Another difference is the "state" in the motion notify events:
   Stylus: 0x0, 0x0, ... 0x100, 0x100, 0x100, ... 0x0, 0x0, ....
   Finger: 0x100 events only.
  eraser or touchpad: generate 0x0 events.

This suggests that the bug is that the software looks at the "state" field of MotionNotify events to determine presses and releases, instead of using the correct approach of listening for ButtonPress and ButtonRelease events. State can distinguish a drag from a mere traverse, but it is not a valid source of press/release detection.
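This suspected failure mode can be illustrated with a small sketch. The event streams below are hypothetical, modeled on the xev traces in this report; this is illustrative pseudocode of the suspected bug, not Onboard's actual source.

```python
# Why inferring releases from the "state" field of MotionNotify events
# fails on a touchscreen. Events are (event_type, state) pairs; state bit
# 0x100 means Button1 is held, matching the xev output above.

BUTTON1_MASK = 0x100

# Stylus tap: hover motion before and after, so the motion stream shows
# state transitions 0x0 -> 0x100 -> 0x0.
stylus_tap = [
    ("MotionNotify", 0x000),   # hovering near the glass, button up
    ("ButtonPress",  0x000),
    ("MotionNotify", 0x100),   # tip down
    ("ButtonRelease", 0x100),
    ("MotionNotify", 0x000),   # hovering again
]

# Finger tap: motion is only reported while touching, so every
# MotionNotify arrives with state 0x100 and a 0x100 -> 0x0 transition
# is never visible in the motion stream.
finger_tap = [
    ("ButtonPress",  0x000),
    ("MotionNotify", 0x100),
    ("MotionNotify", 0x100),
    ("ButtonRelease", 0x100),
]

def releases_from_state(events):
    """Buggy approach: detect a release as a 0x100 -> 0x0 transition
    in the MotionNotify state field."""
    releases, prev = 0, 0
    for etype, state in events:
        if etype == "MotionNotify":
            if (prev & BUTTON1_MASK) and not (state & BUTTON1_MASK):
                releases += 1
            prev = state
    return releases

def releases_from_events(events):
    """Correct approach: count ButtonRelease events directly."""
    return sum(1 for etype, _ in events if etype == "ButtonRelease")

print(releases_from_state(stylus_tap))   # 1 - stylus works by accident
print(releases_from_state(finger_tap))   # 0 - finger tap never registers
print(releases_from_events(finger_tap))  # 1 - correct for both devices
```

With the state-based approach, the finger's tap is invisible, which would explain keys never registering; counting ButtonRelease events works identically for both devices.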

The stylus reports its position while hovering near, but not touching, the glass. The finger is only reported while it is actually pressing; thus, there are no MotionNotify events once the finger is withdrawn.

Multitouch trivia: two-finger pan gestures get converted to button events 4, 5, 6, and 7; in other words, they are converted to scroll-wheel motion. The finger motion is not reported as cursor motion. Pinch zoom in/out and rotate don't seem to be meaningfully reported. You apparently have to register for XInput 2.2 touch events or similar to get the position of each finger.
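The buttons-4-through-7 encoding mentioned above is the long-standing X11 core-protocol convention for scroll events. A trivial sketch of decoding it:

```python
# X11 core protocol convention: scroll gestures arrive as presses of
# buttons 4-7 rather than as pointer motion (4/5 vertical, 6/7 horizontal).
SCROLL_BUTTONS = {
    4: "scroll up",
    5: "scroll down",
    6: "scroll left",
    7: "scroll right",
}

def classify_button(button: int) -> str:
    """Map an X11 button number to a human-readable action."""
    return SCROLL_BUTTONS.get(button, f"button {button} click")

print(classify_button(4))  # scroll up
print(classify_button(1))  # button 1 click
```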

Note that in order to simulate this problem without a touchscreen, one could use a second trackpad or Wacom digitizer with a modified driver that sends only touchscreen-compatible events, or write a program that pops up a window in which mouse movements (with the button down only) are forwarded to another window as synthetic mouse events.
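The filtering such a simulation would perform can be sketched in a few lines: take a stylus-style event stream and keep only what a touchscreen would deliver. This is a hypothetical transformation for testing, not an existing driver.

```python
# Turn a stylus-style (event_type, state) stream into a finger-style one
# by dropping hover motion, i.e. MotionNotify events reported with the
# button up (state bit 0x100 clear). Press/release events and drag motion
# pass through unchanged.

BUTTON1_MASK = 0x100

def touchscreenify(events):
    return [(etype, state) for (etype, state) in events
            if etype != "MotionNotify" or (state & BUTTON1_MASK)]

stylus_tap = [
    ("MotionNotify", 0x000),   # hover - a touchscreen would not report this
    ("ButtonPress",  0x000),
    ("MotionNotify", 0x100),   # contact - kept
    ("ButtonRelease", 0x100),
    ("MotionNotify", 0x000),   # hover again - dropped
]

print(touchscreenify(stylus_tap))
# [('ButtonPress', 0), ('MotionNotify', 256), ('ButtonRelease', 256)]
```

Feeding a stream filtered this way to an on-screen keyboard should reproduce the finger behavior described in this report without touchscreen hardware.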

ProblemType: Bug
DistroRelease: Ubuntu 14.04
Package: onboard 1.0.0-0ubuntu4
ProcVersionSignature: Ubuntu 3.16.0-30.40~14.04.1-generic 3.16.7-ckt3
Uname: Linux 3.16.0-30-generic x86_64
ApportVersion: 2.14.1-0ubuntu3.10
Architecture: amd64
CurrentDesktop: Unity
Date: Tue May 19 00:17:18 2015
InstallationDate: Installed on 2015-05-16 (2 days ago)
InstallationMedia: Ubuntu 14.04.2 LTS "Trusty Tahr" - Release amd64 (20150218.1)
SourcePackage: onboard
UpgradeStatus: No upgrade log present (probably fresh install)

Revision history for this message
whitis (whitis) wrote :

The fix was to install this PPA:
as suggested in this bug report:

However, that bug report suggests multitouch should work; specifically, that you should be able to hold the Shift key while typing. This does not work: the second touch may or may not cause Shift to be released, but in either case no characters are typed.
This isn't a big issue for me.

Revision history for this message
marmuta (marmuta) wrote :

Hi whitis, thanks for the detailed bug report. This particular bug has been reported before. Glad you found the PPA, and yes, newer versions of Onboard have a workaround for this issue.

Concerning multi-touch, Onboard needs to receive TouchBegin/TouchUpdate/TouchEnd events for that to work. It cannot be done with devices that only generate ButtonPress/MotionNotify/ButtonRelease events. Wacom touch screens are configured to do the latter by default, behaving basically like plain old mice, but you can make yours generate touch events by turning off gesture support. See also

Something like the following (insert your touch screen's name or the numeric id returned by "xinput"):
xsetwacom --set "Wacom ISDv4 E6 Finger touch" Gesture off
then restart Onboard.

If it doesn't work at first, please reboot once and retry.
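If the long device name is awkward to type, the numeric id works too. A small sketch that picks the TOUCH device id out of `xsetwacom --list` output (the sample listing is copied from earlier in this report; on a live system, feed in the real command output instead):

```python
import re

# Sample output in the format shown earlier in this report.
SAMPLE_LISTING = """\
Serial Wacom Tablet WACf00e stylus   	id: 13	type: STYLUS
Serial Wacom Tablet WACf00e eraser   	id: 15	type: ERASER
Serial Wacom Tablet WACf00e touch    	id: 16	type: TOUCH
"""

def touch_device_id(listing: str):
    """Return the numeric id of the first device of type TOUCH, or None."""
    for line in listing.splitlines():
        m = re.search(r"id:\s*(\d+)\s+type:\s*TOUCH\b", line)
        if m:
            return int(m.group(1))
    return None

print(touch_device_id(SAMPLE_LISTING))  # 16
# Then run: xsetwacom --set 16 Gesture off
```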

Revision history for this message
Launchpad Janitor (janitor) wrote :

Status changed to 'Confirmed' because the bug affects multiple users.

Changed in onboard (Ubuntu):
status: New → Confirmed
Revision history for this message
yonnie (yonnieyonsta) wrote :

Laptop: HP EliteBook 2740p. This issue appears in multiple distributions, both RPM- and DEB-based. A year ago, on 14.04 LTS, there was just a seemingly minor issue with the touchscreen not working right, not sensing fingers, etc. Since then I have tried 16.04 LTS, openSUSE Leap 42.1, and a half dozen other distros in between.

One problem observed is that "Input Devices" shows a joystick when the input device is actually an accelerometer (typically the gizmo used for detecting laptop attitude for screen rotation); by the way, the screen doesn't rotate when the laptop is tilted.

Another serious issue, which renders the laptop unusable, is the touchpad: touching it or using its buttons causes the mouse (all mice) to be unable to select anything outside of the current window, such as the main menu button at the bottom left of the panel. One of its other misbehaviors is selecting lines from covered windows and dragging the content into the currently open window. Turning the Synaptics touchpad off via xinput seems to be quite helpful.

This is now offset by the touchscreen behaving much better with the newer software, at least sensing finger touch; you can scroll with a finger by touching the screen.
