
multi-touch-dev team mailing list archive

Some notes from XDS on #xorg-devel

 

Hey all,

There was some interesting chatter and note-taking on #xorg-devel today
around X, MT, and gestures, so I thought I would pass it along:


[1:42am] <ajax> good morning
[1:42am] <ajax> got to the multitouch talk late, sorry about that
[1:42am] kem joined the chat room.
[1:43am] <ajax> let's see if i can't backfill from what i heard since
coming in
[1:44am] <ajax> unlike mpx, you have to glue all the touch points
together as one device, since otherwise you have to do device
create/destroy notifies for every touchpoint, which will wake up every
client and that's lame
[1:46am] <ajax> multitouch and gestures are actually separate things,
there are plenty of use cases that want multiple touches without gestures
[1:46am] <ajax> touch and pointer events are probably separate things;
pointer and touch grabs would ignore each other, if A has a pointer grab
B would still receive touch events
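The grab separation ajax describes above can be pictured as two independent delivery streams. A minimal illustrative sketch (not the X server's actual grab machinery; the client names and `route` helper are made up for this example):

```python
# Sketch of independent pointer vs. touch grabs: a pointer grab held by
# one client does not divert touch events away from another client.

def route(kind, pointer_grab=None, touch_grab=None, focus="B"):
    """Return which client receives an event of the given kind."""
    if kind == "pointer":
        return pointer_grab if pointer_grab else focus
    if kind == "touch":
        return touch_grab if touch_grab else focus
    raise ValueError(kind)

# A holds a pointer grab; B (the focus client) still gets touch events.
print(route("pointer", pointer_grab="A"))
print(route("touch", pointer_grab="A"))
```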
[1:51am] tseliot joined the chat room.
[1:51am] <ajax> event replay makes for many weird issues, events can get
delivered out of order
[1:52am] <ajax> some devices can report events as bitmaps (actually rgb
images); ouch, imagine queueing all that up in the server until the
events replay
[1:53am] <ajax> "everyone wants multitouch, but nobody wants multitouch"
[1:54am] <ajax> experience with app development is that they end up 99%
singletouch with a sprinkling of pinch/zoom or similar
[1:54am] <ajax> mt probably not going to happen for 1.10
[1:58am] <ajax> multitouch workshop earlier in the week was mostly
violent agreement for the short term and then fuzzier after
[1:59am] <ajax> timeline is hard to predict; chase should have more time
for it after maverick, daniels working on it, so, _something_ will happen.
[2:00am] <ajax> ms surface has a ui principle of "no touch left behind",
every touch has visual feedback. otherwise people think the app is
broken and poke the screen harder and harder
[2:03am] gdm left the chat room. (Ping timeout: 265 seconds)
[2:03am] <ajax> only ~5 gestures are intuitive (swipe, pinch/zoom, etc);
after that the UI needs to give feedback to show what's happening
[2:04am] mcepl joined the chat room.
[2:08am] benh left the chat room. (Quit: Leaving)
[2:09am] <ajax> (some technical discussion about how to draw that
feedback, hardware cursors or ...)
[2:14am] <ajax> how do we help?  test!  write an actual app and find out
how it doesn't work.
[2:14am] <ajax> peter's not an app guy or a toolkit guy, so he's kind of
driving blind without that kind of feedback
[2:15am] <ajax> --
[2:23am] <remi|work> ajax, talk still going on?
[2:32am] MacSlow left the chat room. (Ping timeout: 272 seconds)
[2:34am] MacSlow joined the chat room.
[2:37am] <ajax> nope, that's what the -- was
[2:38am] <ajax> coffee break now
[2:42am] ickle joined the chat room.
[2:51am] <whot> mhmmmm. coffeee.....
[2:55am] ickle left the chat room. (Quit: Lost terminal)
[2:58am] zhenyuw left the chat room. (Quit: Leaving)
[3:02am] <mcepl> yes! coffee is good!
[3:02am] <remi|work> ajax, arf  I had questions/suggestions
[3:02am] <remi|work> never mind
[3:04am] <ajax> remi|work: well, ask 'em anyway, whot will be back on
irc eventually
[3:06am] <remi|work> well it was more of a grand plan which I wanted to
discuss with others at xds or elsewhere
[3:07am] <remi|work> I didn't manage to get a lot of folks at my fosdem
talk 2 years ago
[3:10am] <whot> remi|work: back, shoot
[3:11am] <remi|work> whot, hey  basically, I wanted to discuss MT and
other stuff
[3:12am] <remi|work> with what the ubuntu dude did a couple of weeks
ago, and what we did at InSitu around Metisse
[3:12am] <remi|work> the conclusion I'd come to was that there was a
real need for an "input compositor" type infrastructure
[3:12am] <whot> right. fwiw, chase is on now, "Gestures and X"
[3:13am] » remi|work will then wait for ajax's notes before rambling on
[3:14am] <whot> remi|work: it's a big discussion
[3:14am] <ajax> --- gestures and x ---
[3:15am] mmc joined the chat room.
[3:16am] <ajax> "gestures" here means intuitive gestures only (again, as
above, pinch zoom swipe rotate etc), not "magic" gestures like draw a
star to close an app
[3:17am] <ajax> general consensus that gestures need to be client-side
for the most part, X just demuxing events
[3:17am] <ajax> which isn't how it works now, but, getting there
[3:18am] <ajax> mtdev: multitouch tracking translation library.  eats
raw multitouch data (tracked or not), emits tracked data and filtered
coordinates
[3:19am] <whot> http://bitmath.org/code/mtdev/
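The "tracking translation" mtdev performs can be sketched roughly as follows. This is an illustrative Python sketch, not mtdev's real API (the actual library is C and reads raw evdev events; see the URL above): untracked touch frames come in, and each contact gets a stable tracking id by matching it to the nearest point in the previous frame.

```python
# Illustrative sketch of mtdev-style tracking translation: raw
# (untracked) touch frames in, frames keyed by stable tracking id out.

def track(frames):
    """Assign stable ids to touch points across successive frames."""
    prev = {}      # id -> (x, y) from the previous frame
    next_id = 0
    out = []
    for frame in frames:
        cur = {}
        unmatched = dict(prev)
        for (x, y) in frame:
            # match each raw point to the nearest unclaimed previous point
            best = min(unmatched,
                       key=lambda i: (unmatched[i][0] - x) ** 2 +
                                     (unmatched[i][1] - y) ** 2,
                       default=None)
            if best is not None:
                cur[best] = (x, y)
                del unmatched[best]
            else:                       # new contact: allocate a fresh id
                cur[next_id] = (x, y)
                next_id += 1
        out.append(cur)
        prev = cur
    return out

# Two fingers moving right; each keeps its id from frame to frame.
frames = [[(10, 10), (50, 50)], [(12, 10), (52, 50)]]
print(track(frames))
```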
[3:20am] Havis left the chat room. (Quit: Ex-Chat)
[3:21am] <ajax> utouch-grail: gesture recognition and instantiation
library.  eats tracked touches, emits gestures if they're recognized or
passes through if not
[3:21am] benh joined the chat room.
[3:21am] mmc left the chat room. (Ping timeout: 255 seconds)
[3:22am] <ajax> gestures only emitted if the client has expressed
interest in _that_ gesture (if you don't care about rotates, you won't
get them)
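The recognize-or-pass-through model described for utouch-grail might look like this in miniature. An illustrative sketch only, not grail's real API (the `classify`/`deliver` helpers and the distance threshold are invented for the example): tracked touches go in, a gesture event comes out only if the client subscribed to that gesture, and otherwise the touches pass through unchanged.

```python
# Sketch of grail-style delivery: emit a gesture only when the client
# has subscribed to it; otherwise pass the raw touches through.

def classify(start, end):
    """Crude two-touch classifier: 'pinch' when the distance between
    the two fingers changes noticeably, else no gesture."""
    def dist(frame):
        (x0, y0), (x1, y1) = frame
        return ((x0 - x1) ** 2 + (y0 - y1) ** 2) ** 0.5
    if abs(dist(end) - dist(start)) > 10:
        return "pinch"
    return None

def deliver(start, end, subscriptions):
    gesture = classify(start, end)
    if gesture in subscriptions:
        return ("gesture", gesture)
    return ("touches", (start, end))   # unrecognized/unsubscribed: pass through

# Fingers moving apart: a pinch only for clients that asked for one.
print(deliver([(40, 50), (60, 50)], [(10, 50), (90, 50)], {"pinch"}))
print(deliver([(40, 50), (60, 50)], [(10, 50), (90, 50)], set()))
```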
[3:24am] <ajax> utouch-geis (pronounced like "geist" without the t):
gesture engine interface support. input is platform-specific gesture
events, output is common gesture event interface.
[3:24am] <ajax> geis is api for apps to register for gestures
[3:24am] <ajax> (architecture diagram)
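The geis role described above is essentially an adapter layer: platform-specific gesture events in, one common event shape out, so apps register for gestures without caring which backend produced them. A hypothetical sketch, not the real libgeis API (the backend names and event field layouts here are made up):

```python
# Sketch of geis-style adaptation: translate backend-specific gesture
# events into a single common event interface for applications.

def from_x11(ev):
    # hypothetical X11-side event: a dict with "name" and "touches"
    return {"gesture": ev["name"], "fingers": ev["touches"]}

def from_other(ev):
    # hypothetical other backend delivering a (name, fingers) tuple
    return {"gesture": ev[0], "fingers": ev[1]}

BACKENDS = {"x11": from_x11, "other": from_other}

def common_event(backend, ev):
    """Translate a backend-specific event into the common interface."""
    return BACKENDS[backend](ev)

print(common_event("x11", {"type": "GestureNotify",
                           "name": "pinch", "touches": 2}))
print(common_event("other", ("swipe", 3)))
```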
[3:25am] GNUtoo|laptop joined the chat room.
[3:25am] <remi|work> could someone in the room point out that Metisse
has infrastructure for that? it already supports gestures (done with the
mouse) which you can "cancel" if you've messed up, in which case metisse
replays the mouse events for the underlying windows
[3:25am] » whot waits for ajax asciiart skills
[3:25am] ssp joined the chat room.
[3:26am] <remi|work> not that I'm pushing for Metisse directly, the code
is a mess and I wouldn't force it on anyone other than me
[3:28am] <ajax> xcb-gesture is private implementation-detail protocol to
get this out of xserver up to geis and unity
[3:29am] benh left the chat room. (Quit: Leaving)
[3:29am] <ajax> another architecture diagram, this one moves stuff out
to client side
[3:29am] <whot> remi|work: discussion on what is mettise ensues
[3:30am] <ajax> motivation is about pluggable gesture recognizers and
keeping complexity out of the server
[3:31am] <remi|work> whot, may I assume you've described what it does?
[3:31am] <whot> ajax did, roughly, yes
[3:31am] <whot> metisse is not something easily explained in 30 sec
[3:31am] <remi|work> haha, right
[3:33am] <remi|work> well the 2 core features are window compositing and
per-window input redirection
[3:33am] <remi|work> the rest is just fluff
[3:39am] <ajax> unity has a use model where 3 and 4 finger gestures are
window and environment controls, so want to make those things that you
"unlock"; how do you design that well?  not really clear.
[3:39am] egbert joined the chat room.
[3:40am] <ajax> whot points out that direct (touchscreen) and indirect
(touchpad) multitouch are pretty different usage models, not clear that
you want to conflate them.
[3:40am] haitao left the chat room. (Quit: Leaving.)
[3:41am] Kayden joined the chat room.
[3:42am] <ajax> list of hardware with interface style and number of touches
[3:42am] <ajax> big categories are touchscreen, trackpad, and apple
magic mouse.
[3:44am] <ajax> (demo)
[3:49am] VertexSymphony left the chat room. (Remote host closed the
connection)
[3:51am] shining^ joined the chat room.
[3:52am] <ajax> question: have you seen any blob devices?  (shrug, no)
[3:55am] shining left the chat room. (Ping timeout: 276 seconds)
[3:56am] <ajax> usual licensing discussion
[3:56am] <ajax> ---


