opencog-dev team mailing list archive
Message #00266
Re: [OpenCog] Re: filing bugs & RFEs against OpenCog
On Tue, Jul 22, 2008 at 5:24 AM, Linas Vepstas <linasvepstas@xxxxxxxxx> wrote:
> Do you have attention allocation described anywhere?
>
> I recently tried to describe it to someone, and did a poor job.
>
> The page http://opencog.org/wiki/Attention_allocation
>
> is utterly opaque to the non-initiate: "Currently Attention allocation
> is implemented for keeping track of the importance of atoms. The
> overall design of OpenCog calls for keeping track of MindAgent
> importance as well. MindAgents confer attention to the atoms they use,
> and are then rewarded in importance funds when they achieve system
> goals."
>
> What the heck is an atom? what's importance? why would I need
> to assign importance to atoms? what's a mindagent? why would
> this stuff ever be useful, and where/how would it ever be applied?
Well, because I was writing in the context of OpenCog, I was assuming
the reader would be familiar with terms like atoms and MindAgents. But
you make a fair point that it's not the best description. There are a
few blog posts on Brainwave that describe the mechanics of the system,
but I haven't spent much time explaining the reasoning behind it. In
part, this is because I'm under the impression that Ben's promised
OpenCog Prime documents will inevitably cover some of that side of
things... but it'd still be worth my extending the description somewhat.
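For the record, the mechanics that wiki paragraph gestures at could be sketched roughly like this. This is a toy illustration only, not the actual OpenCog API: the names `Atom`, `MindAgent`, `sti`, and the numbers are all stand-ins.

```python
# Toy sketch of importance flowing between MindAgents and atoms.
# All names and values are illustrative, not OpenCog's real implementation.

class Atom:
    def __init__(self, name, sti=0.0):
        self.name = name
        self.sti = sti  # short-term importance of this atom

class MindAgent:
    def __init__(self, name, funds=100.0):
        self.name = name
        self.funds = funds  # importance funds the agent can spend

    def stimulate(self, atom, amount):
        """Confer some of this agent's funds to an atom it uses."""
        spend = min(amount, self.funds)
        self.funds -= spend
        atom.sti += spend

    def reward(self, amount):
        """Called by the system when the agent helps achieve a goal."""
        self.funds += amount

# An agent uses an atom, raising that atom's importance...
cat = Atom("cat")
reasoner = MindAgent("reasoner")
reasoner.stimulate(cat, 10.0)
# ...and is later rewarded in funds when its work pays off.
reasoner.reward(15.0)
```

The point of the closed economy is that an agent that never achieves system goals eventually runs out of funds to spend, so the atoms it favours stop gaining importance.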
> I attempted to summarize attention allocation as follows, although
> my summary is undoubtedly slanted/inaccurate. I said:
>
> "In reasoning and deduction, especially when using deduction
> where logical connectives (if->then clauses) are weighted by
> probabilities, one is faced with a combinatorial explosion of
> possible deductive chains to explore. The goal of attention
> allocation is to limit and guide these choices, to cut off directions
> that are unpromising, and stay focused on the topic."
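That pruning idea can be made concrete with a small sketch: a best-first search over deduction chains that keeps only the most promising partial chains and discards the rest. Scoring a chain by the product of its link probabilities, and the `beam_width` cut-off, are assumptions for illustration, not how OpenCog actually does it.

```python
import heapq

# Toy attention-guided search: expand only the highest-scoring partial
# deduction chains, cutting off the unpromising ones (illustrative only).

def search(start, links, goal, beam_width=2, max_steps=100):
    """links: dict mapping a node to [(next_node, probability), ...]."""
    # Frontier entries are (-score, chain); heapq pops the highest score.
    frontier = [(-1.0, [start])]
    for _ in range(max_steps):
        if not frontier:
            return None
        neg_score, chain = heapq.heappop(frontier)
        if chain[-1] == goal:
            return chain, -neg_score
        for nxt, p in links.get(chain[-1], []):
            # New score = old score * link probability.
            heapq.heappush(frontier, (neg_score * p, chain + [nxt]))
        # Attention allocation in miniature: keep only the top few chains.
        frontier = heapq.nsmallest(beam_width, frontier)
        heapq.heapify(frontier)
    return None
```

With a beam width of 2, chains whose probability falls outside the top two are simply never expanded, which is the combinatorial-explosion cut-off in its crudest form.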
>
> This is perhaps too narrow, and demonstrates my own
> misunderstanding ... but I'd like to see a "plain english"
> description of this sort, with at least a few plain-english
> examples of the technology actually applied.
No, that is a pretty good summary. The other thing attention
allocation is useful for is guiding forgetting and deciding where
atoms are stored (in memory, on the hard drive, or distributed).
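A minimal sketch of that forgetting policy, assuming the simplest possible rule (evict the lowest-importance atoms when the store is over capacity; the store layout and threshold are hypothetical, not OpenCog's):

```python
# Toy forgetting policy: when the in-memory store exceeds capacity,
# evict the atoms with the lowest importance first. Illustrative only.

def forget(atoms, capacity):
    """atoms: dict of name -> importance. Keep the `capacity` most important."""
    if len(atoms) <= capacity:
        return atoms
    keep = sorted(atoms.items(), key=lambda kv: kv[1], reverse=True)[:capacity]
    return dict(keep)

store = {"cat": 9.0, "dog": 7.5, "weather-last-tuesday": 0.2, "phone": 3.1}
store = forget(store, capacity=3)
# "weather-last-tuesday" has the lowest importance and gets dropped
```

A real system would presumably demote atoms to disk or a remote store rather than deleting them outright, but the ranking-by-importance step is the same.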
I'll put something similar up on the wiki page so it's a little less
opaque. I'll also add creating an example to my task list.
J