
multi-touch-dev team mailing list archive

Re: Multi-touch with uTouch

 

On 03/30/2012 03:15 AM, Benjamin Longuechaud wrote:
> 
> I'm working on a multi-touch application and I have already defined some
> gestures on Windows with WM_TOUCH. Now, I would like to do the same on Linux.
> I have seen that this was managed directly in the kernel and we had to
> develop our own driver. But now, we can use the uTouch library through GEIS,
> is that right?

There are multiple points of entry to obtain touch input information.

(1) The kernel provides raw touch data through the /dev/input mechanism.
This is difficult for application programmers to use and is definitely
not portable outside of Linux.  We do not recommend this method.

(2) Obtaining multi-touch events through the X11 interface.  This
requires a recent X.org X11 server with XInput 2.2 or later, and means
working directly with the X11 API.  We do not recommend this method.

(3) Using the GEIS API to obtain touch events.  You will still need to
obtain the X11 window, but you likely already have this much
information.  We have plans to provide a native GEIS implementation for
platforms other than Linux at some point in the future.  We recommend
this method.
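
For illustration only, here is a rough (untested) sketch of how a GEIS v2
subscription for an existing application window might look in C.  The
window ID is a placeholder, and the constant and function names should be
checked against the geis/geis.h header that ships with utouch-geis:

/* Rough sketch: set up a GEIS v2 subscription for an existing X11
 * window.  Untested; verify the names below against your installed
 * geis/geis.h.  Link with -lgeis (libgeis-dev). */
#include <geis/geis.h>
#include <stdio.h>

static void subscribe_for_window(Geis geis, GeisInteger window_id)
{
  GeisSubscription sub = geis_subscription_new(geis, "example",
                                               GEIS_SUBSCRIPTION_CONT);
  GeisFilter filter = geis_filter_new(geis, "window filter");

  /* Only report events that occur over the given X11 window. */
  geis_filter_add_term(filter, GEIS_FILTER_REGION,
                       GEIS_REGION_ATTRIBUTE_WINDOWID, GEIS_FILTER_OP_EQ,
                       window_id,
                       NULL);

  /* Ask for the touch gesture class (raw per-finger data). */
  geis_filter_add_term(filter, GEIS_FILTER_CLASS,
                       GEIS_CLASS_ATTRIBUTE_NAME, GEIS_FILTER_OP_EQ,
                       GEIS_GESTURE_TOUCH,
                       NULL);

  geis_subscription_add_filter(sub, filter);
  geis_subscription_activate(sub);
}

int main(void)
{
  Geis geis = geis_new(GEIS_INIT_TRACK_DEVICES,
                       GEIS_INIT_TRACK_GESTURE_CLASSES,
                       NULL);
  if (!geis)
  {
    fprintf(stderr, "failed to initialize GEIS\n");
    return 1;
  }

  /* The window ID would be the XID of your application window,
   * obtained from your toolkit; 0 here is just a placeholder. */
  GeisInteger window_id = 0;
  subscribe_for_window(geis, window_id);

  /* ... run an event loop here (see the second sketch below) ... */

  geis_delete(geis);
  return 0;
}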

> So, can we fetch information about the finger positions, times, etc. with
> uTouch in order to define some other gestures?
> 
> GeisTouch <http://developer.ubuntu.com/api/ubuntu-11.10/cplusplus/utouch-geis/class_geis_touch.html> is the simple touch event structure used by GEIS to implement gestures

Yes.  You will want to subscribe to the GEIS_GESTURE_TOUCH class of events.
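
Continuing the sketch above (same caveat about verifying the names against
geis/geis.h), reading the per-finger data out of those events might look
roughly like this:

/* Continues the subscription sketch above (same includes).  Pulls
 * per-finger positions out of GEIS_GESTURE_TOUCH events. */
static void pump_events(Geis geis)
{
  GeisEvent event;
  GeisStatus status;

  geis_dispatch_events(geis);
  for (status = geis_next_event(geis, &event);
       status == GEIS_STATUS_CONTINUE || status == GEIS_STATUS_SUCCESS;
       status = geis_next_event(geis, &event))
  {
    switch (geis_event_type(event))
    {
      case GEIS_EVENT_GESTURE_BEGIN:
      case GEIS_EVENT_GESTURE_UPDATE:
      case GEIS_EVENT_GESTURE_END:
      {
        /* The touch set carries one GeisTouch per finger in contact. */
        GeisAttr attr = geis_event_attr_by_name(
            event, GEIS_EVENT_ATTRIBUTE_TOUCHSET);
        GeisTouchSet touches = geis_attr_value_to_pointer(attr);
        GeisSize i;

        for (i = 0; i < geis_touchset_touch_count(touches); ++i)
        {
          GeisTouch touch = geis_touchset_touch(touches, i);
          GeisFloat x = geis_attr_value_to_float(
              geis_touch_attr_by_name(touch, GEIS_TOUCH_ATTRIBUTE_X));
          GeisFloat y = geis_attr_value_to_float(
              geis_touch_attr_by_name(touch, GEIS_TOUCH_ATTRIBUTE_Y));
          printf("touch %lu at (%f, %f)\n", (unsigned long)i, x, y);
        }
        break;
      }
      default:
        break;
    }
    geis_event_delete(event);
  }
}

The timing information asked about should be available as attributes on the
gesture frames carried in the same events; the exact attribute names are in
the utouch-geis API documentation linked above.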

> Do you know how Qt manages this?

I believe Qt has had a number of experimental approaches to multi-touch
and gesture events in its widget toolkit in the past.  I understand
its current approach is to provide touch areas in its QML toolkit.

We have a utouch-qml package that integrates uTouch into QML.

-- 
Stephen M. Webb  <stephen.webb@xxxxxxxxxxxxx>
Canonical Ltd.

