
opencog-dev team mailing list archive

Re: [OpenCog] Re: filing bugs & RFEs against OpenCog

 

Hi,

The OpenCog Prime docs should be up within a week (I've finished first
drafts of all of them, and David Hart is helping to remove some of the
wiki-syntax errors and put them on the site...)

However, they don't exactly provide a novice-friendly treatment of AA
(attention allocation) or anything else ... they provide an in-depth,
integrated treatment of the whole OCP design...

Regarding AA, Joel and I are still working out the quirks.  Once it works
well, we will write up
a paper on it.

The ideas underlying AA are described in a paper I wrote in 2006 entitled
"Virtual Easter Egg
Hunting" or some such, which is on the AGIRI website on the page giving the
proceedings of
the AGI-06 workshop...

ben

On Mon, Jul 21, 2008 at 5:25 PM, Joel Pitt <joel.pitt@xxxxxxxxx> wrote:

> On Tue, Jul 22, 2008 at 5:24 AM, Linas Vepstas <linasvepstas@xxxxxxxxx>
> wrote:
> > Do you have attention allocation described anywhere?
> >
> > I recently tried to describe it to someone, and did a poor job.
> >
> > The page http://opencog.org/wiki/Attention_allocation
> >
> > is utterly opaque to the non-initiate: "Currently Attention allocation
> > is implemented for keeping track of the importance of atoms. The
> > overall design of OpenCog calls for keeping track of MindAgent
> > importance as well. MindAgents confer attention to the atoms they use,
> > and are then rewarded in importance funds when they achieve system
> > goals."
> >
> > What the heck is an atom? what's importance? why would I need
> > to assign importance to atoms? what's a mindagent? why would
> > this stuff ever be useful, and where/how would it ever be applied?
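
To make the wiki text quoted above a little more concrete, here is a minimal
toy sketch (in Python) of the "importance funds" idea it describes.  The names
used here (Atom, MindAgent, importance, stimulate, reward) are invented for
the example and are not the actual OpenCog code; the point is just the
economic flow: agents spend importance on the atoms they use, and earn it back
when they help achieve a system goal.

    class Atom:
        """A node or link in the hypergraph, carrying an importance value."""
        def __init__(self, name, importance=0.0):
            self.name = name
            self.importance = importance

    class MindAgent:
        """A process that acts on atoms and holds a wallet of importance funds."""
        def __init__(self, name, funds=100.0):
            self.name = name
            self.funds = funds

        def stimulate(self, atom, amount):
            # Confer attention: move importance from the agent's wallet
            # to an atom it is using.
            spend = min(amount, self.funds)
            self.funds -= spend
            atom.importance += spend

        def reward(self, amount):
            # When the agent helps achieve a system goal, the system
            # pays importance funds back into its wallet.
            self.funds += amount

    # Usage: an agent stimulates two atoms it is working with,
    # then gets rewarded for contributing to a goal.
    cat = Atom("cat")
    on_mat = Atom("on(cat, mat)")
    agent = MindAgent("DeductionAgent")
    agent.stimulate(cat, 5.0)
    agent.stimulate(on_mat, 3.0)
    agent.reward(10.0)
    print(cat.importance, on_mat.importance, agent.funds)  # 5.0 3.0 102.0
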
>
> Well, because I was writing in the context of OpenCog, I was assuming
> the reader would be familiar with terms like atoms and MindAgents. But
> you have a point that it's not the best description. There are a few
> blog posts on Brainwave that describe the mechanics of the system, but
> I haven't spent much time explaining the reasoning behind it. In part,
> this is because I'm under the impression that Ben's promised OpenCog
> Prime documents will inevitably cover some of that side of things...
> but it'd still be worth extending it somewhat.
>
> > I attempted to summarize attention allocation as follows, although
> > my summary is undoubtedly slanted/inaccurate. I said:
> >
> >  "In reasoning and deduction, especially when using deduction
> > where logical connectives (if->then clauses) are weighted by
> > probabilities, one is faced with a combinatorial explosion of
> > possible deductive chains to explore.  The goal of attention
> > allocation is to limit and guide these choices, to cut off directions
> > that are unpromising, and stay focused on the topic."
> >
> > This is perhaps too narrow, and demonstrates my own
> > misunderstanding ... but I'd like to see a "plain english"
> > description of this sort, with at least a few plain-english
> > examples of the technology actually applied.
>
> No, that is a pretty good summary. The other thing attention
> allocation is useful for is guiding forgetting and the storage of
> atoms (in memory, on disk, or distributed).
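
As a rough illustration of those two uses (keeping inference focused on the
important atoms, and deciding what to forget or push to slower storage), here
is a small Python sketch along the same lines, with atoms represented as plain
dicts.  The thresholds, field names, and decay rate are all made up for the
example; they are not the real OpenCog parameters.

    FOCUS_THRESHOLD = 10.0   # atoms above this are "in the attentional focus"
    FORGET_THRESHOLD = 1.0   # atoms below this are candidates for forgetting

    def attentional_focus(atoms):
        """Hand the reasoner only the currently-important atoms; this is
        what prunes the combinatorial explosion of possible deduction chains."""
        return [a for a in atoms if a["importance"] >= FOCUS_THRESHOLD]

    def decay_and_forget(atoms, decay=0.9):
        """Importance decays each cycle; atoms that fall below the lower
        threshold are dropped (or, in a fuller system, pushed out of RAM
        to disk or a remote store)."""
        kept = []
        for a in atoms:
            a["importance"] *= decay
            if a["importance"] >= FORGET_THRESHOLD:
                kept.append(a)
        return kept

    # Usage
    atoms = [{"name": "cat", "importance": 50.0},
             {"name": "on(cat, mat)", "importance": 12.0},
             {"name": "unused-fact", "importance": 1.05}]
    print([a["name"] for a in attentional_focus(atoms)])  # ['cat', 'on(cat, mat)']
    atoms = decay_and_forget(atoms)
    print([a["name"] for a in atoms])                     # ['cat', 'on(cat, mat)']
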
>
> I'll put something similar up on the wiki page, so it's a little less
> obtuse. I'll also add the creation of an example to my tasklist.
>
> J
>
> _______________________________________________
> Mailing list: https://launchpad.net/~opencog-dev
> Post to     : opencog-dev@xxxxxxxxxxxxxxxxxxx
> Unsubscribe : https://launchpad.net/~opencog-dev
> More help   : https://help.launchpad.net/ListHelp
>



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
ben@xxxxxxxxxxxx

"Nothing will ever be attempted if all possible objections must be first
overcome " - Dr Samuel Johnson
