
touch-packages team mailing list archive

[Bug 1525979] Re: Touchscreen interactions should take priority over mouse and disable it

 

** Changed in: ubuntu-ux
       Status: New => Triaged

** Changed in: ubuntu-ux
   Importance: Undecided => Medium

** Changed in: ubuntu-ux
     Assignee: (unassigned) => Andrea Bernabei (faenil)

-- 
You received this bug notification because you are a member of Ubuntu
Touch seeded packages, which is subscribed to qtbase-opensource-src in
Ubuntu.
https://bugs.launchpad.net/bugs/1525979

Title:
  Touchscreen interactions should take priority over mouse and disable
  it

Status in Canonical System Image:
  New
Status in Ubuntu UX:
  Triaged
Status in qtbase-opensource-src package in Ubuntu:
  New

Bug description:
  It is currently possible (r199, krillin, rc-proposed) to use both
  touch and mouse at the same time.

  This happens because of:
  - QtQuick's touch-to-mouse event synthesis feature;
  - the fact that most QML code relies on MouseArea to handle input (touch included);
  - the fact that no QML component handles both touch and mouse events and gives the developer a good API for both;
  - the fact that making both touch and mouse usable at the same time easily leads to unexpected and broken UX.

  I suggest we make it so that, by default, only one input device can be used at any given time (exceptional cases to be handled separately).
  Moreover, I think it would be a good idea to give touch events priority over mouse events, i.e. the mouse stops working when the user touches the screen, but not vice versa.
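The proposed policy can be sketched as a small event filter: touch events always pass through, while mouse events are dropped whenever a touch interaction is in progress. This is only an illustration of the arbitration logic; the class and event shapes are hypothetical, not a Qt or QML API.

```python
# Hypothetical sketch of the proposed policy: one input device at a time,
# with touch taking priority over mouse. Not a real Qt API.

class InputArbiter:
    """Filters events so that an active touch interaction disables the mouse."""

    def __init__(self):
        self.touch_active = False

    def filter(self, event):
        """Return True if the event should be delivered, False if dropped."""
        if event["device"] == "touch":
            if event["type"] == "begin":
                self.touch_active = True
            elif event["type"] == "end":
                self.touch_active = False
            return True  # touch events always pass through
        # Mouse events are dropped while a finger is on the screen.
        return not self.touch_active


arbiter = InputArbiter()
assert arbiter.filter({"device": "mouse", "type": "move"})      # mouse works
arbiter.filter({"device": "touch", "type": "begin"})            # finger down
assert not arbiter.filter({"device": "mouse", "type": "move"})  # mouse disabled
```

In a real implementation this filtering would have to happen below the QML layer (e.g. in the platform input stack), before touch-to-mouse synthesis kicks in.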

  I also think the final decision should take into account the conventions users are already accustomed to.
  I played with a touchscreen laptop (a Microsoft Surface running Windows 10), and here is what I found:

  - in the default browser: touch interaction stops and hides the
  mouse, and gives priority to (multi-)touch gestures. The mouse
  pointer stays still (i.e. it doesn't follow the fingers). The only
  way I found to regain control of the mouse was to perform a single
  tap and then wait a short time before moving the mouse.

  - in other apps without special touch handling, touch interaction
  would still disable the mouse, but in this case the mouse pointer
  followed my finger (presumably a fallback along the lines of "if
  nothing consumes the touch events, synthesize mouse events").
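The "single tap, then wait" behaviour observed above amounts to a grace period: after any touch activity, the mouse stays disabled until the screen has been quiet for a short while. A minimal sketch, where the class name and the 0.5 s timeout are assumptions for illustration, not measured Windows behaviour:

```python
# Sketch of a grace-period policy: the mouse re-enables only after a
# period of touch inactivity. The 0.5 s value is an assumption.

GRACE_PERIOD = 0.5  # seconds of touch silence before the mouse works again

class TimedArbiter:
    def __init__(self):
        self.last_touch_time = float("-inf")  # no touch seen yet

    def on_touch(self, now):
        """Record the timestamp of any touch event."""
        self.last_touch_time = now

    def mouse_enabled(self, now):
        """The mouse works once the grace period has elapsed."""
        return (now - self.last_touch_time) >= GRACE_PERIOD


arb = TimedArbiter()
assert arb.mouse_enabled(now=0.0)      # no touch yet: mouse works
arb.on_touch(now=1.0)                  # single tap
assert not arb.mouse_enabled(now=1.2)  # too soon: mouse still disabled
assert arb.mouse_enabled(now=1.6)      # waited long enough
```

A timeout like this avoids the flickering described below, because a finger drag immediately followed by a mouse move can never drive the same item through both input paths at once.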

  I believe this bug is a show stopper for the convergent experience.

  It is currently possible to trigger flickering and broken UX in
  multiple places in Unity8. Basically anything that relies on MouseMove
  events is broken and causes flickering.

  A few examples:
  - username vertical scrolling in the login manager (drag the username with your finger, then move the mouse);
  - window positioning (same as above);
  - indicators horizontal scrolling;
  - scrolling in ANY Flickable/ListView-based view inside applications and platform menus;
  - side scrolling in the Dash;
  - and so on.

  NOTE: after a discussion on IRC with Saviq, we agreed it would be
  great if MouseArea could handle different input devices. I researched
  this before writing this post, and I couldn't see any way MouseArea
  could do that with the current APIs. That means, imho, a long wait
  before such a feature actually lands in Qt itself. Hence I proposed
  the solution above as a workaround while we get all the remaining
  pieces working as we expect.

To manage notifications about this bug go to:
https://bugs.launchpad.net/canonical-devices-system-image/+bug/1525979/+subscriptions