
Re: [Ayatana] What to do about right-clicking on the indicator-applet?



hey there ;)

On 2010-06-17, Martin Owens <doctormo@xxxxxxxxx> wrote:
> Hey David,
>
> On Thu, 2010-06-17 at 07:59 -0700, David Hamm wrote:
>> there's also swipe (down), shake, multi-finger, and eye tracking. 0_0
>> but these require fine tuning. (insert (mighty) mouse)
>
> You mean perhaps, in the future, if we look confused at an icon, the
> computer will pick up our expression, track which item we're looking
> at, and offer a tooltip. If our expression turns to ambivalence, then
> perhaps a set of actions?
>
> haha, the future is here!
>
> Martin,

The last time we discussed Augmented Reality HIDs, I remember
Jan-Christoph pointed us to facial expressions as an advanced gesture
for scrolling and navigating in Opera, already implemented.

Somebody also pointed me to a (now broken) link to a Git branch of the
Compiz head-tracking plugin..
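
For the curious, something in that direction is easy to prototype
today. Below is a minimal, untested sketch of webcam head tracking in
Python, assuming the opencv-python package and its bundled Haar
cascade; the printed "head offset" is only a stand-in for whatever the
Compiz plugin would actually do with the head position:

  import cv2

  # rough webcam head-tracking sketch, assuming opencv-python (cv2) and
  # its bundled frontal-face Haar cascade
  cascade = cv2.CascadeClassifier(
      cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
  cap = cv2.VideoCapture(0)  # default webcam

  while True:
      ok, frame = cap.read()
      if not ok:
          break
      gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
      if len(faces) > 0:
          x, y, w, h = faces[0]
          # head offset from the frame centre, normalised to [-1, 1]
          cx = (x + w / 2.0) / frame.shape[1] * 2 - 1
          cy = (y + h / 2.0) / frame.shape[0] * 2 - 1
          print("head offset: %+.2f %+.2f" % (cx, cy))
      cv2.imshow("head tracking", frame)
      if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
          break

  cap.release()
  cv2.destroyAllWindows()

Once you have that offset, mapping it onto a viewport shift or a
hover/tooltip trigger is the easy part; the hard part, as always, is
filtering the jitter so the desktop doesn't feel haunted.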

The future is here? We are aware of a German/French combat helicopter
whose weapon system is aimed with the eyes alone.. that tracking
software is nearly a decade old, and there is probably Free code out
there, like for so many other AR projects..

With all these codelets for machine perception out there, I sometimes
ask myself how little high-level code it would take to make Festival
speak with attitude, like KITT or Optimus Prime.
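
Probably very little, as a back-of-the-envelope sketch. The following
is a toy "attitude" wrapper around Festival, assuming the festival
binary is on PATH and accepts Scheme on stdin via --pipe;
Duration_Stretch is a standard Festival prosody parameter, while the
persona table and the say() helper are invented here purely for
illustration:

  import random
  import subprocess

  # made-up persona table: (speech-rate stretch, canned prefixes)
  PERSONAS = {
      "kitt":    (0.9, ["As you wish, Michael.", "Scanning ahead."]),
      "optimus": (1.3, ["Autobots,", "Freedom is the right of all sentient beings."]),
  }

  def say(text, persona="kitt"):
      stretch, prefixes = PERSONAS[persona]
      line = "%s %s" % (random.choice(prefixes), text)
      # build a small Scheme snippet and hand it to festival's pipe mode
      scheme = (
          "(Parameter.set 'Duration_Stretch %.2f)\n" % stretch
          + '(SayText "%s")\n' % line.replace('"', "")
      )
      subprocess.run(["festival", "--pipe"], input=scheme, text=True, check=True)

  if __name__ == "__main__":
      say("An application indicator has been right-clicked.", persona="optimus")

The attitude is obviously all in the phrase templates and prosody
tweaks; the speech engine itself doesn't need to change at all.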

The computer's behaviour already contains enough entropy at some points
to be interpreted as deliberate..