
Re: [Ayatana] simple NLP in dash



On Mon, Oct 17, 2011 at 16:13, Christian Rupp <christian@xxxxxxxx> wrote:
Am 17.10.2011 15:09, schrieb frederik.nnaji@xxxxxxxxx:
i'd like to type (and one day speak) natural language to achieve simple things, such as adding an appointment to my calendar or showing all pictures i copied, viewed, created or edited last Tuesday.

for this to be possible, the dash would have to start making a difference between precise statements (known) and fuzzy expressions (guessed).
A precise statement, i.e. a known, would e.g. be "rm -R /home/fred/oldphotos" or "update-manager -d" or something of the kind, while a fuzzy statement would be any input that doesn't match the known reference tables.

this way it is possible to employ natural instructions as a user, such as "add an appointment at 7am tomorrow" or "schedule a task for 5pm" or "wake up at 6am".

wouldn't that be something?
this would also solve the problem of whether or not to interpret regular shell commands in the dash, there would simply be knowns and guesses, end of story.
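To make the known/guessed split above concrete, here is a minimal sketch of how such a dispatcher might look. Everything in it is an assumption for illustration: the command table, the intent names, and the patterns are all hypothetical, and a real implementation would consult the shell's PATH and a proper NLP layer instead of a hard-coded list.

```python
import re

# Hypothetical reference table of "knowns": commands the dash recognizes exactly.
KNOWN_COMMANDS = {"rm", "update-manager", "ls", "nautilus"}

# Hypothetical patterns for "guessed" natural-language input.
INTENT_PATTERNS = [
    (re.compile(r"add an appointment at (\S+) (\S+)"), "calendar.add"),
    (re.compile(r"schedule a task for (\S+)"), "task.schedule"),
    (re.compile(r"wake up at (\S+)"), "alarm.set"),
]

def classify(user_input):
    """Return ('known', input) for a recognized command, or
    ('guess', intent, args) for fuzzy natural-language input."""
    first_word = user_input.split()[0]
    if first_word in KNOWN_COMMANDS:
        # Matches the reference table: treat verbatim as a command.
        return ("known", user_input)
    for pattern, intent in INTENT_PATTERNS:
        match = pattern.search(user_input.lower())
        if match:
            # Fuzzy input that maps onto a recognized intent.
            return ("guess", intent, match.groups())
    # Anything else falls through to ordinary dash search.
    return ("guess", "search.freeform", (user_input,))

print(classify("update-manager -d"))
print(classify("add an appointment at 7am tomorrow"))
print(classify("wake up at 6am"))
```

The point of the sketch is just the "end of story" logic from the paragraph above: one table lookup decides known vs. guess, and only guesses ever reach the NLP layer.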


_______________________________________________
Mailing list: https://launchpad.net/~ayatana
Post to     : ayatana@xxxxxxxxxxxxxxxxxxx
Unsubscribe : https://launchpad.net/~ayatana
More help   : https://help.launchpad.net/ListHelp
The idea that you can give natural instructions sounds great (hard to achieve?) and I really would like to have something like that... voice control (even harder?)

I personally know that voice control has been available for a lot of functions on ordinary mobile phones (CPU < 500MHz) for at least 5 years now. 
There's nothing new about this and there's nothing hard about including audio in the input loop next to keyboard, mouse and touchscreen.
Now it's all about allowing a small set of natural language instructions in the dash.

For this i would suggest the creation of a human communication project for Ubuntu. That way, especially a11y folks would have a much friendlier way to interact with their computer than by guessing, using Orca, or learning mnemonics by heart.

With this implemented, casual developer words like "dialogue" or "alert" would finally become truly meaningful.


But I still don't like the idea behind merging the Dash and commands. I think keeping [Alt]+[F2] is a better choice. Anything else would confuse the "normal" user.

that's a valid point. One could make the Alt+F2 Dash look and behave more like a Gnome-Terminal on steroids, while the "ordinary user" Dash does all the NLP stuff and looks less geeky at the same time.
Including simple NLP would also help move away from "search results" with only 5% relevance towards "truly relevant suggestions" with over 50% relevance to the semantic human input query.
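As a rough illustration of what "relevance to the query" could mean here, the sketch below ranks candidate suggestions by token overlap (Jaccard similarity) with the user's input. The suggestion strings and the scoring choice are my own assumptions; a real dash would draw candidates from Zeitgeist or the lenses and would use something smarter than word overlap.

```python
def relevance(query, candidate):
    """Jaccard token overlap between a query and a candidate suggestion,
    as a crude stand-in for semantic relevance (0.0 to 1.0)."""
    q = set(query.lower().split())
    c = set(candidate.lower().split())
    if not q or not c:
        return 0.0
    return len(q & c) / len(q | c)

# Hypothetical candidate suggestions the dash might have on offer.
suggestions = [
    "photos edited last tuesday",
    "system settings",
    "pictures copied last tuesday",
]

query = "show pictures i copied last tuesday"
ranked = sorted(suggestions, key=lambda s: relevance(query, s), reverse=True)
print(ranked[0])  # the closest match to the semantic query comes first
```

Even this naive scorer already pushes the near-miss suggestions above unrelated ones, which is the gap between "5% relevance" results and "truly relevant" ones described above.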

 
Let's hope that for 12.04 a calendar is included and that Thunderbird looks better; I'd also like to see "online accounts" working.

a calendar is already included, it's just that it comes with the much-criticized and ancient Evolution software suite.
Perhaps a calendar is so trivial to implement that developers decide not to create one!?
I can only think of implementation difficulties when it comes to the UI design of such a calendar. But then again, there already is GNOME Shell's clock menu, or indicator-datetime in Ubuntu; these are pretty much all that's required, imo.


On Mon, Oct 17, 2011 at 21:48, balint777@xxxxxxxxx <balint777@xxxxxxxxx> wrote:
I guess you're talking about this one https://launchpad.net/wintermute 

i read Gibson more than a decade ago, i confess; after a few seconds of searching i quickly remembered why "Wintermute" rang a bell :D
And yes, i love Science Fiction a lot, but i'm not suggesting any of that here. i'm only suggesting we include a very low-hanging fruit in Unity, which has so obviously been made to take us closer to a semantic OS.