
desktop-packages team mailing list archive

[Bug 1456452] [NEW] Works with stylus but not finger: HP Elitebook 2740P Tablet PC

 

Public bug reported:

As I explain below, this bug appears to be a manifestation of onboard
using an incorrect algorithm for detecting button presses and releases,
one that happens to work on legacy input devices but fails completely
on a touchscreen.

The HP EliteBook 2740p is a convertible Tablet PC whose screen can rotate from laptop mode into tablet mode. In tablet mode, the screen covers the keyboard and an on-screen keyboard is a necessity. This machine has 4+ built-in input devices:
   - multitouch touch screen
   - stylus on screen
   - pointing stick
   - trackpad
   - also, a 3D accelerometer which could theoretically be used to move the cursor.
These are without plugging in an external 3D space navigator, mouse, joystick, gamepad, eyemouse, etc.


I can type with:
   - stylus on the touch screen
   - touchpad (cumbersome and not at all useful in tablet mode)
   - pointing stick (a.k.a. eraser-head / TrackPoint™) (cumbersome and not at all useful in tablet mode)
   - (physical keyboard)
I cannot type using:
  - finger on the multitouch screen

This is not a calibration issue. It is not a rotation issue either;
although I have been experimenting with rotation, the screen and input
devices are currently unrotated, and I can type in the xvkbd or
Florence on-screen keyboards with either stylus or finger.

onboard complet

 xsetwacom --list
Serial Wacom Tablet WACf00e stylus	id: 13	type: STYLUS    
Serial Wacom Tablet WACf00e eraser	id: 15	type: ERASER    
Serial Wacom Tablet WACf00e touch	id: 16	type: TOUCH        # finger
# Not shown: synaptics touchpad


The following error message appears frequently in the terminal tab from which onboard was started (frequently enough to render the tab unusable):
(onboard:5529): Gdk-CRITICAL **: gdk_device_get_axis_use: assertion 'index_ < device->axes->len' failed
This error appears whether using stylus or finger (but not touchpad or stick) and occurs more than once per key press as you hover.

Note the following potentially relevant behavioral differences between
these four pointing devices. With the three that work, the cursor
tracks movement even when not clicking/dragging. The one that doesn't
work warps the pointer directly to the point touched, generates events
at that point, and hides the mouse cursor, since the cursor cannot
track a raised finger.

Default onboard preferences were used.

xev shows ButtonPress, MotionNotify, and ButtonRelease events when touching with a finger, and similarly with the stylus.
The differences: the stylus sends MotionNotify updates while hovering near the screen; the finger doesn't.
The finger sends MotionNotify updates while held still against the screen; the stylus does not.

Here are ButtonRelease events captured with both finger and stylus:
Stylus:
ButtonRelease event, serial 37, synthetic NO, window 0x5600001,
    root 0x80, subw 0x0, time 9012106, (110,108), root:(175,676),
    state 0x100, button 1, same_screen YES

Finger:
ButtonRelease event, serial 37, synthetic NO, window 0x5600001,
    root 0x80, subw 0x0, time 9065337, (96,85), root:(161,653),
    state 0x100, button 1, same_screen YES

Another difference is the "state" field in the MotionNotify events:
   Stylus:  0x0, 0x0, ... 0x100, 0x100, 0x100, ... 0x0, 0x0, ...
   Finger:  0x100 events only.
   Eraser or touchpad: 0x0 events.

This suggests the bug: the software looks at the "state" field of
MotionNotify events to determine presses and releases, instead of using
the correct approach of watching ButtonPress and ButtonRelease events.
State can distinguish a drag from a mere traverse, but it is not a
valid source of press/release detection.
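To make the failure mode concrete, here is a minimal, dependency-free sketch. The event streams are toy tuples modelled on the xev output above (not real GDK structures), contrasting the suspected state-based strategy with event-based detection:

```python
# Toy event streams modelled on the xev output above; names and shapes
# are illustrative, not real X/GDK structures.
# Each event is (type, state); state 0x100 means button 1 is held.
STYLUS = [
    ("MotionNotify",  0x000),  # hovering near the glass
    ("ButtonPress",   0x000),  # tip touches down
    ("MotionNotify",  0x100),
    ("ButtonRelease", 0x100),  # tip lifts
    ("MotionNotify",  0x000),  # hovering again
]
FINGER = [
    ("ButtonPress",   0x000),  # finger appears already "down"
    ("MotionNotify",  0x100),  # held still, still reporting
    ("ButtonRelease", 0x100),  # finger lifts; then silence, no hover
]

def releases_from_state(events):
    """Buggy strategy: infer a release from a 0x100 -> 0x0 state
    transition between successive MotionNotify events."""
    releases, prev = 0, 0x000
    for etype, state in events:
        if etype == "MotionNotify":
            if (prev & 0x100) and not (state & 0x100):
                releases += 1
            prev = state
    return releases

def releases_from_events(events):
    """Correct strategy: count ButtonRelease events directly."""
    return sum(1 for etype, _ in events if etype == "ButtonRelease")
```

The stylus stream yields a release either way, but the finger stream never produces a hover (state 0x0) motion after lift-off, so the state-based strategy misses the release entirely, which matches the observed "stylus works, finger doesn't" behavior.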

The stylus reports its position while hovering near, but not touching,
the glass. The finger is reported only while it is actually pressing;
thus there are no MotionNotify events once the finger is withdrawn.


Multitouch trivia: two-finger pan movements get converted to buttons 4, 5, 6, and 7; in other words, they are converted into scroll-wheel motion. The finger motion is not reported as cursor motion. Pinch zoom in/out and rotate don't seem to be meaningfully reported. You apparently have to register for XInput 2.2 touch events to get the position of each finger.
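For reference, this is the legacy X core-protocol scroll-button convention those pan gestures are being translated into (the standard mapping, not anything device-specific):

```python
# Legacy X core-protocol "buttons" that two-finger pans are translated
# into; real per-finger positions require XInput 2.2 touch events
# (TouchBegin/TouchUpdate/TouchEnd) instead.
SCROLL_BUTTONS = {
    4: "scroll up",
    5: "scroll down",
    6: "scroll left",
    7: "scroll right",
}
```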


Note that to simulate this problem without a touchscreen, one could use a second trackpad or Wacom digitizer with a modified driver that sends only touchscreen-compatible events. Alternatively, write a program that pops up a window in which mouse movements (with button down only) are forwarded to another window as synthetic mouse events.
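A pure-software variant of that simulation idea: generate the touchscreen-style synthetic stream directly (the function and tuple format below are illustrative, not a real driver API) and confirm it never contains the hover motions a state-based detector depends on:

```python
def synth_touch_tap(x, y, n_moves=3):
    """Generate the event sequence a touchscreen tap produces: a press
    at the warp point, button-held motions, a release, and then
    nothing (no hover motions after the finger lifts).
    Purely illustrative tuples, not real X events."""
    events = [("ButtonPress", x, y, 0x000)]
    events += [("MotionNotify", x, y, 0x100) for _ in range(n_moves)]
    events.append(("ButtonRelease", x, y, 0x100))
    return events

tap = synth_touch_tap(96, 85)
# Unlike a stylus stream, no MotionNotify here ever carries state 0x0.
assert all(s == 0x100 for (t, _, _, s) in tap if t == "MotionNotify")
```

Feeding such a stream at an on-screen keyboard (e.g. via a test harness that forwards it as synthetic events) should reproduce the failure on any machine, touchscreen or not.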

ProblemType: Bug
DistroRelease: Ubuntu 14.04
Package: onboard 1.0.0-0ubuntu4
ProcVersionSignature: Ubuntu 3.16.0-30.40~14.04.1-generic 3.16.7-ckt3
Uname: Linux 3.16.0-30-generic x86_64
ApportVersion: 2.14.1-0ubuntu3.10
Architecture: amd64
CurrentDesktop: Unity
Date: Tue May 19 00:17:18 2015
InstallationDate: Installed on 2015-05-16 (2 days ago)
InstallationMedia: Ubuntu 14.04.2 LTS "Trusty Tahr" - Release amd64 (20150218.1)
SourcePackage: onboard
UpgradeStatus: No upgrade log present (probably fresh install)

** Affects: onboard (Ubuntu)
     Importance: Undecided
         Status: New


** Tags: amd64 apport-bug trusty

-- 
You received this bug notification because you are a member of Desktop
Packages, which is subscribed to onboard in Ubuntu.
https://bugs.launchpad.net/bugs/1456452

Title:
  Works with stylus but not finger: HP Elitebook 2740P Tablet PC

Status in onboard package in Ubuntu:
  New

Bug description:
  (Verbatim copy of the report above, omitted.)

To manage notifications about this bug go to:
https://bugs.launchpad.net/ubuntu/+source/onboard/+bug/1456452/+subscriptions

