touch-packages team mailing list archive

[Bug 1525979] Re: Touchscreen interactions should take priority over mouse and disable it

 

That would be quite a bad UX :)

Yes, I definitely meant *temporarily* disabling one input device while the
other is being used, more or less like what happens when you enable palm
detection in your touchpad settings :)

** Summary changed:

- Touchscreen interactions should take priority over mouse and disable it
+ Touchscreen interactions should take priority over mouse and temporarily disable it

** Description changed:

  It is currently possible (r199, krillin, rc-proposed) to use both
  touch and mouse at the same time.
  
  Because of:
  - QtQuick's touch-to-mouse events synthesis feature;
  - the fact that most QML code out there relies on MouseArea to handle input (touch as well);
  - the fact that there is no QML component that handles both Touch and Mouse events and gives the developer a good API to handle both;
  - the fact that making both touch and mouse usable at the same time easily leads to unexpected and broken UX;
  
  I suggest we make it so that only one input device can be used at any given time by default (exceptional cases to be handled separately).
+ That is, mouse and touch can of course both be plugged in, and both can be used, just not at the *very same* time. For instance, as long as the user is dragging a surface using the touchscreen, they should not be able to click, move, or interact using the mouse at the same time. Once the finger is released from the touchscreen, the mouse can be used again (and vice versa).
+  
  Moreover, I think it would be a good idea to give touch events priority over mouse events, i.e. the mouse stops working when the user touches the screen, but not vice versa.
  
  I also think the final decision should take into account the conventions users are already accustomed to.
  I played with a laptop that features a touchscreen (a Microsoft Surface running Windows 10), and here's what I found:
  
  - in the default browser: interacting via touch stops and hides the
  mouse pointer, and gives priority to (multi-)touch gestures. The
  pointer stays still (i.e. doesn't follow the fingers). The only way I
  found to take control of the mouse again was to perform a single tap
  and then wait a short time before moving the mouse.
  
  - in other apps without special touch handling, interacting via touch
  would still disable the mouse, but in this case the mouse pointer
  followed my finger (I guess the rule is "if nothing consumes touch
  events, fall back to mouse simulation")
  
  I believe this bug is a showstopper for the convergent experience.
  
  It is currently possible to trigger flickering and broken UX in multiple
  places in Unity8. Basically anything that relies on MouseMove events is
  broken and causes flickering.
  
  A few examples:
  - username vertical scrolling in the login manager (just drag the username with your finger and then move the mouse).
  - window positioning (same as above)
  - indicators horizontal scrolling
  - scrolling in ANY Flickable/ListView based views inside applications and platform menus
  - side scrolling in the Dash
  - etc.
  
  NOTE: after a discussion on IRC with Saviq, we agreed that it would be
  awesome if MouseArea were able to handle different input devices. I
  researched this before writing this post, and I didn't see any way
  MouseArea could do that with the current APIs. That means, imho, a
  long wait before such a feature actually lands in Qt itself. Hence I
  proposed the solution above as a workaround while we get the rest of
  the pieces working as we expect.
  
- 
- 
- 
  ============= UX UPDATE ===================
  This was discussed during today's (15th Dec) team UX review meeting.
  
  The outcome of the meeting was:
  - The UX team will start a research project to handle this matter in more detail.
  - We all agreed it makes sense to prevent multiple input devices from being active at the same time, i.e. mouse disables touch and touch disables mouse. This is, however, just a quick take from the meeting; the details are to be worked out as part of the research project described in the previous point.

-- 
You received this bug notification because you are a member of Ubuntu
Touch seeded packages, which is subscribed to qtbase-opensource-src in
Ubuntu.
https://bugs.launchpad.net/bugs/1525979

Title:
  Touchscreen interactions should take priority over mouse and
  temporarily disable it

Status in Canonical System Image:
  Confirmed
Status in Ubuntu UX:
  Triaged
Status in qtbase-opensource-src package in Ubuntu:
  New

Bug description:
  It is currently possible (r199, krillin, rc-proposed) to use both
  touch and mouse at the same time.

  Because of:
  - QtQuick's touch-to-mouse events synthesis feature;
  - the fact that most QML code out there relies on MouseArea to handle input (touch as well);
  - the fact that there is no QML component that handles both Touch and Mouse events and gives the developer a good API to handle both;
  - the fact that making both touch and mouse usable at the same time easily leads to unexpected and broken UX;

  I suggest we make it so that only one input device can be used at any given time by default (exceptional cases to be handled separately).
  That is, mouse and touch can of course both be plugged in, and both can be used, just not at the *very same* time. For instance, as long as the user is dragging a surface using the touchscreen, they should not be able to click, move, or interact using the mouse at the same time. Once the finger is released from the touchscreen, the mouse can be used again (and vice versa).
   
  Moreover, I think it would be a good idea to give touch events priority over mouse events, i.e. the mouse stops working when the user touches the screen, but not vice versa.
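
  The policy described here can be sketched as a tiny arbitration filter. The
  following is a plain C++ sketch of the idea only (all names are hypothetical;
  this is not Unity8 or Qt code): touch always wins, and the mouse is muted
  while a finger is down.

```cpp
#include <cassert>

// Hypothetical arbitration filter implementing the proposal above:
// touch events are always delivered, and mouse events are dropped
// while at least one finger is on the screen.
class InputArbiter {
    int fingersDown = 0;
public:
    // Each method returns true if the event should be delivered.
    bool touchDown() { ++fingersDown; return true; }
    bool touchUp()   { if (fingersDown > 0) --fingersDown; return true; }
    bool mouseAllowed() const { return fingersDown == 0; }
};
```

  In the drag scenario above: `touchDown()` starts the drag and
  `mouseAllowed()` stays false until `touchUp()`, at which point the mouse
  works again.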

  I also think the final decision should take into account the conventions users are already accustomed to.
  I played with a laptop that features a touchscreen (a Microsoft Surface running Windows 10), and here's what I found:

  - in the default browser: interacting via touch stops and hides the
  mouse pointer, and gives priority to (multi-)touch gestures. The
  pointer stays still (i.e. doesn't follow the fingers). The only way I
  found to take control of the mouse again was to perform a single tap
  and then wait a short time before moving the mouse.

  - in other apps without special touch handling, interacting via touch
  would still disable the mouse, but in this case the mouse pointer
  followed my finger (I guess the rule is "if nothing consumes touch
  events, fall back to mouse simulation")
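
  If that guess is right, the fallback order could look roughly like this
  (plain C++, hypothetical names; it only models the dispatch order, nothing
  Windows- or Qt-specific):

```cpp
#include <functional>

struct Point { int x; int y; };

// Hypothetical dispatcher modelling the observed fallback: deliver the
// touch event first; only when no handler consumes it, synthesize a
// mouse event at the same position.
struct Dispatcher {
    std::function<bool(Point)> onTouch; // returns true if consumed
    std::function<void(Point)> onMouse; // receives simulated mouse events

    // Returns true when a mouse event was simulated.
    bool dispatchTouch(Point p) {
        if (onTouch && onTouch(p))
            return false;          // consumed as touch: no simulation
        if (onMouse) {
            onMouse(p);            // unconsumed: fall back to the mouse path
            return true;
        }
        return false;
    }
};
```

  An app with real touch handling consumes the event and the pointer stays
  put; an app without it only ever sees the simulated mouse events, which
  matches the pointer-follows-finger behaviour described above.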

  I believe this bug is a showstopper for the convergent experience.

  It is currently possible to trigger flickering and broken UX in
  multiple places in Unity8. Basically anything that relies on MouseMove
  events is broken and causes flickering.

  A few examples:
  - username vertical scrolling in the login manager (just drag the username with your finger and then move the mouse).
  - window positioning (same as above)
  - indicators horizontal scrolling
  - scrolling in ANY Flickable/ListView based views inside applications and platform menus
  - side scrolling in the Dash
  - etc.

  NOTE: after a discussion on IRC with Saviq, we agreed that it would be
  awesome if MouseArea were able to handle different input devices. I
  researched this before writing this post, and I didn't see any way
  MouseArea could do that with the current APIs. That means, imho, a
  long wait before such a feature actually lands in Qt itself. Hence I
  proposed the solution above as a workaround while we get the rest of
  the pieces working as we expect.
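
  For illustration, here is one purely speculative shape such a device-aware
  component could take (plain C++, all names made up; this is not a proposed
  Qt API):

```cpp
#include <functional>

// Speculative sketch of the kind of component the NOTE wishes existed:
// one input area whose events carry the originating device, so the
// handler can react differently to touch and mouse.
enum class Device { Mouse, Touch };

struct PointerEvent {
    Device device;
    int x;
    int y;
    bool synthesized; // e.g. a "mouse" event synthesized from touch
};

struct PointerArea {
    std::function<void(const PointerEvent&)> onPressed;

    // Deliver an event, filtering out mouse events that were merely
    // synthesized from touch, so each device is seen as itself.
    void deliver(const PointerEvent& ev) {
        if (!onPressed)
            return;
        if (ev.device == Device::Mouse && ev.synthesized)
            return; // drop touch-synthesized mouse events
        onPressed(ev);
    }
};
```

  If I'm not mistaken, QMouseEvent::source() (Qt 5.3+) can already tell
  synthesized mouse events apart, so a component like this could build on
  that rather than new plumbing.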

  ============= UX UPDATE ===================
  This was discussed during today's (15th Dec) team UX review meeting.

  The outcome of the meeting was:
  - The UX team will start a research project to handle this matter in more detail.
  - We all agreed it makes sense to prevent multiple input devices from being active at the same time, i.e. mouse disables touch and touch disables mouse. This is, however, just a quick take from the meeting; the details are to be worked out as part of the research project described in the previous point.

To manage notifications about this bug go to:
https://bugs.launchpad.net/canonical-devices-system-image/+bug/1525979/+subscriptions