
multi-touch-dev team mailing list archive

Re: Unity Gesture UI Guidelines 0.2 now available http://docs.google.com/View?id=dfkkjjcj_1482g457bcc7

 

> The right click menus are sub-optimal for touch interaction.  Touch point
> menus solve several of the problems with right click menus in a finger
> touch environment, including: items being obscured by the user's hand,
> insufficient spacing, difficulty navigating sub menus both in terms of
> precision and the finger covering menu items, touch targets being
> insufficiently visually defined, etc.

A right-click menu triggered by a hold can be opened in another place.
The selection of an item can then be made with a release event.
http://www.autodeskresearch.com/publications/multitouchmm
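
For example, here is a rough Python sketch of that hold-then-release
pattern. The menu object, the handler names and the HOLD_TIMEOUT value
are my own guesses for illustration, not taken from any real toolkit:

    import time

    HOLD_TIMEOUT = 0.5  # seconds before a press counts as a "hold"

    class HoldMenuController:
        def __init__(self, menu):
            self.menu = menu  # hypothetical menu with open()/item_at()/close()
            self.press_time = None
            self.menu_open = False

        def on_press(self, x, y):
            self.press_time = time.monotonic()

        def on_motion(self, x, y):
            # After the hold timeout, open the menu away from the finger
            # so the hand does not obscure it. (A real implementation
            # would use a timer instead of relying on motion events.)
            if (not self.menu_open and self.press_time is not None
                    and time.monotonic() - self.press_time >= HOLD_TIMEOUT):
                self.menu.open(x, y - 100)  # the offset is an arbitrary choice
                self.menu_open = True

        def on_release(self, x, y):
            # Selection happens on the release event, not on a click.
            if self.menu_open:
                item = self.menu.item_at(x, y)
                if item is not None:
                    item.activate()
                self.menu.close()
            self.menu_open = False
            self.press_time = None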

> In terms of layout touch menus really need to be either horizontal or pie
> (we have opted for horizontal).

This is a good repository of menu ideas:
http://www.gillesbailly.fr/menua/?language=fr
Videos of menu interactions convey the ideas faster, but the documents
are also good.


> Ahhh...  ;-)  Pen input is quite different from finger input because pens
> have a much higher level of precision and because the pen obscures less of
> the screen.  But luckily the Wacom pens have a rocker button that is mapped
> to the left and right mouse button inputs.  However we do need to know when
> an input is coming from a pen as opposed to a finger because in some cases
> we will want different behaviors.

The pen on an N-Trig digitizer is a separate device (in terms of
event source: /dev/input/eventX).
Pen movement is also detected before the pen touches the screen; then,
when it does touch the screen, a click event is transmitted.
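
To see which device is which, something like this works with the
python-evdev bindings (assuming they are installed, e.g. pip install
evdev). Checking BTN_TOOL_PEN vs BTN_TOUCH is the usual heuristic, and
running it needs read access to /dev/input/*:

    from evdev import InputDevice, list_devices, ecodes

    for path in list_devices():  # e.g. /dev/input/event3
        dev = InputDevice(path)
        keys = dev.capabilities().get(ecodes.EV_KEY, [])
        if ecodes.BTN_TOOL_PEN in keys:
            # Pens report BTN_TOOL_PEN while hovering, before contact.
            print(path, dev.name, "-> pen")
        elif ecodes.BTN_TOUCH in keys:
            print(path, dev.name, "-> touch / finger")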


> Unfortunately when using finger touch input we do not have a mouseover
> state.  We don't know where a user's fingers are positioned relative to the
> screen until the finger(s) make contact with the screen surface.  For
> example a user may perform a single finger hold-release to bring up a
> touchpoint menu, and then tap the item they wish to select.  We have no idea
> of the position of the user's fingers between these two actions, and therefore
> have no means of expanding menu items underneath the raised location of the
> finger.

YES we can! :D
We don't have the information of where the finger was before touching
the screen, but we do have it afterwards, until it is released. And any
performed action is mapped to a Mouse_Release_Event, not a
Mouse_Click_Event.

To select something from a menu with a conventional mouse/touchpad I can:
(1) MouseRightClick / MouseClick to bring up the menu, quick MouseRelease
before a tempo has passed, then MouseClick&MouseRelease on an item to
select it.
_OR_
(2) MouseRightClick / MouseClick to bring up the menu, (hold), then
MouseRelease on an item to select it.

The 1st behavior is the one most used, but the 2nd is also
possible, and it may be more natural when we don't know the position of
the finger before it touches the screen (our case).
I can no longer find the finite-state machine of mouse interaction (I need
to search more), but I remember it had additional states.
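
Since I can't find the diagram, here is a rough Python sketch of how I
remember the extra states working for the two behaviors above. The
state names and the TEMPO value are my own guesses, not taken from an
existing implementation:

    IDLE, MENU_OPEN_HELD, MENU_OPEN_FREE = "idle", "open-held", "open-free"
    TEMPO = 0.3  # seconds: a quick release keeps the menu open (behavior 1)

    class MenuFSM:
        def __init__(self):
            self.state = IDLE
            self.opened_at = None

        def press(self, now):
            if self.state == IDLE:
                self.state = MENU_OPEN_HELD  # menu appears, button/finger down
                self.opened_at = now

        def release(self, now, over_item=None):
            if self.state == MENU_OPEN_HELD:
                if now - self.opened_at < TEMPO:
                    self.state = MENU_OPEN_FREE  # behavior 1: menu stays open
                else:
                    self.select(over_item)  # behavior 2: release on an item
            elif self.state == MENU_OPEN_FREE:
                self.select(over_item)  # behavior 1: tap-release on an item

        def select(self, item):
            if item is not None:
                print("selected:", item)
            self.state = IDLE
            self.opened_at = None

Note that in both paths the selection happens in release(), which
matches what we actually get from touch.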


> However there are lots of reasons right click menus don't really work in a
> finger touch environment and when we solve all the problems we end up with
> something that looks like the touchpoint menus.  Perhaps in the long term
> future we could look into replacing the right click menus with the
> touchpoint menus??  Although right click menus don't work well with finger
> touch, touchpoint menus do work with mouse interaction.

I was just trying to simplify the work and avoid the need for additional menus.
Right-click menus + finger worked almost fine for me in another OS, so I
just wanted to ask.
Maybe I'm wrong about the usual user behaviors.



