Re: Document on Hopfield network

 

Joel --

Some responses...

1)
Multiple correlation is well known; see

http://en.wikipedia.org/wiki/Multiple_correlation

and if one uses ranks instead of values in that math one gets multiple
Spearman correlation.  When the time comes I (or you, or someone)
can devise an appropriate crude incrementally-updatable heuristic
that approximates this.
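
To pin the idea down, here's a rough sketch (in Python/numpy, with made-up
names -- nothing to do with the existing code) of what "multiple Spearman
correlation" amounts to: rank-transform every variable, then apply the usual
multiple-correlation formula R^2 = c' Rxx^{-1} c, where c holds the
correlations between the target and each predictor and Rxx is the predictors'
correlation matrix:

    # Rough sketch only -- illustrative, not OpenCog code.
    import numpy as np

    def ranks(v):
        """Ranks 1..n of the entries of v (ties broken by position)."""
        order = np.argsort(v)
        r = np.empty(len(v))
        r[order] = np.arange(1, len(v) + 1)
        return r

    def multiple_spearman(y, xs):
        """Multiple Spearman correlation of y against the predictors in xs
        (xs is a sequence of vectors, one predictor per entry)."""
        ry = ranks(y)
        rx = np.array([ranks(x) for x in xs])
        # Pearson correlation computed on ranks == Spearman correlation.
        c = np.array([np.corrcoef(ry, r)[0, 1] for r in rx])
        Rxx = np.atleast_2d(np.corrcoef(rx))
        r_squared = float(np.dot(c, np.linalg.solve(Rxx, c)))
        return np.sqrt(max(r_squared, 0.0))

So multiple_spearman(sti_history_a, [sti_history_b, sti_history_c]) would give
a 3-way rank-based association measure; the crude incremental heuristic would
only need to approximate something like this.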

2)
Definitely, we want to support AsymmetricHebbianLinks, and that
is in the OpenCogPrime design ...

3)
In the Hopfield network world, standard associative-memory Hopfield
nets have symmetric weights.  The assumption of symmetric weights
is required for the standard theorem that the network will converge to a
fixed-point attractor.

However, there has been loads of research on Hopfield nets with
asymmetric weights.  These are needed to store attractors that are NOT
fixed points, but that are, rather, periodic, chaotic or otherwise complex.
These have not often been used for pragmatic associative-memory
purposes due to their greater complexity.
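
To make the distinction concrete, here's a toy sketch (purely illustrative,
not anything from our code): with a symmetric weight matrix, asynchronous
updates of a binary Hopfield-style net settle into a fixed point, while a
simple asymmetric excite/inhibit loop never settles and just keeps cycling:

    # Toy illustration only.
    import numpy as np

    def sweep(W, s):
        """One asynchronous sweep: update each +/-1 unit in turn."""
        s = s.copy()
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
        return s

    W_sym  = np.array([[0, 1], [1, 0]])    # symmetric: mutual excitation
    W_asym = np.array([[0, -1], [1, 0]])   # asymmetric: excite/inhibit loop

    for name, W in (("symmetric", W_sym), ("asymmetric", W_asym)):
        s = np.array([1, -1])
        trace = [tuple(int(v) for v in s)]
        for _ in range(6):
            s = sweep(W, s)
            trace.append(tuple(int(v) for v in s))
        print(name, trace)

Running this, the symmetric case locks into (-1, -1) after one sweep, while
the asymmetric case keeps oscillating between (1, 1) and (-1, -1) -- a
periodic attractor rather than a fixed point.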

-- Ben


On Thu, Aug 14, 2008 at 6:52 PM, Joel Pitt <joel.pitt@xxxxxxxxx> wrote:

> Makes sense to me - will add it to the things to try ;)
>
> Thinking ahead, are there any 3-way or n-way correlation methods? I'm
> sure there must be, but nothing immediately comes to mind or turns up on Google.
> These would be useful for carrying over to HebbianLinks of > 2 arity.
> But I'm getting ahead of myself...
>
> I also had another thought... I'm not entirely sure, but my
> understanding now is that Hopfield networks don't have symmetric
> weight matrices, i.e. c_ij does not necessarily equal c_ji. This is
> different from my initial understanding of a fully connected network
> with a single link between each pair of nodes, rather than two
> directed links, one in each direction (although in hindsight this
> seems somewhat obvious from the math).
>
> In this case, it would be worth including two AsymmetricHebbianLinks
> to replace every SymmetricHebbianLink (Inverse links are already
> asymmetric, but positive correlation is represented by
> SymmetricHebbianLinks) - otherwise our implementation is at somewhat
> of a disadvantage... it can't hold as much connection information!
>
> J
>
> On Fri, Aug 15, 2008 at 3:07 AM, Ben Goertzel <ben@xxxxxxxxxxxx> wrote:
> >
> > Joel,
> >
> > It's probably a small point, but it occurred to me that we might possibly
> > do better to use a more robust measure of correlation to calculate what
> > you call
> >
> > Correlation(i,j)
> >
> > I started thinking about how to make a quick and dirty correlation
> > measure that would approximate the Spearman correlation, i.e.
> >
> > http://en.wikipedia.org/wiki/Spearman%27s_rank_correlation_coefficient
> >
> > Spearman correlation is a rank-based correlation measure that is
> > typically very robust with respect to noise.
> >
> > Recall that the formula for the Spearman correlation of two sets
> > {x1,...,xn} and {y1,...,yn} is
> >
> > p = 1 - 6 ( Sum_i d_i^2 ) / ( n (n^2 - 1) )
> >
> > where d_i is the difference between the ranks of x_i and y_i
> >
> > Let us use
> >
> > p_t
> >
> > as a shorthand for the Spearman correlation of
> >
> > {x_t,...,x_(t-n+1)} and {y_t,...,y_(t-n+1)}
> >
> > In the application to attention allocation, x_t and y_t may be, for
> > example, the STI values of two different Atoms at cycle t.
> >
> > It seems to me that a crude incrementally-updatable approximation to
> > this is the simple iteration
> >
> > p_(t+1) = 1 - ( (1-p_t) * n /(n+1) + d_(t+1)^2 )
> >
> > or if we choose to rewrite it in terms of r_t = 1-p_t, just
> >
> > r_(t+1) = r_t * n /(n+1) + d_(t+1)^2
> >
> > unless I made an algebra error, which is quite possible as I'm figuring
> > this out as I type this email, too quickly ;-)
> >
> > Here the parameter n is obviously a rescaling of the current decay
> > factor...
> >
> > Basically this is just the same thing you're already doing ... but with a
> > squared difference of ranks in there...
> >
> > Now the AtomTable already keeps importance levels indexed in bins, for
> > purposes of efficient search by importance.
> >
> > Therefore, it seems we can get an approximate rank comparison just by
> > comparing the ranks of the *bins* that the importance levels are kept in
> > (though some tweak to the AtomTable API might be needed).  I.e., we can
> > approximate
> >
> > d_i ~= (rank of x_i's bin) - (rank of y_i's bin)
> >
> > and figure that differences of STI value within a bin are probably not
> > statistically meaningful anyway
> >
> > An advantage here is that in cases where we have a detailed historical
> > record of the STI levels of two Atoms, we could just use the real
> > Spearman correlation, and it would be on the same scale as the estimated
> > Spearman correlation above.
> >
> > An obvious variant would be just to do
> >
> > r_(t+1) = r_t * n /(n+1) + d_(t+1)
> >
> > which is no longer an approximate correlation coefficient, but is also
> > not a stupid measure.
> >
> > Anyway, I'm not sure whether this will make any real difference; it just
> > occurred to me that using ranks might provide a more robust way to
> > measure correlation, as it often does in cases of noisy data....
> >
> > -- Ben
> >
> > p.s. here is some algebra deriving the above approximate iteration
> >
> > I note that we can write
> >
> > p_t = 1 - 6D_t/(n(n^2-1))
> >
> > (by appropriately defining D_t) and, rearranging,
> >
> > D_t = (1-p_t) n(n^2-1) / 6
> >
> > Thus, we can do a crude and highly approximate incremental updating of
> > the Spearman correlation via
> >
> > p_(t+1) = 1 - 6 ( D_t * n/(n+1) + d_(t+1)^2 ) / ( n (n^2 - 1) )
> >
> > it would seem... and cancelling out stuff gives the simple iteration
> > above...
> >
> >
> > On Thu, Aug 14, 2008 at 12:26 AM, Joel Pitt <joel.pitt@xxxxxxxxx> wrote:
> >>
> >> On Thu, Aug 14, 2008 at 4:07 PM, Ben Goertzel <ben@xxxxxxxxxxxx> wrote:
> >> > I wonder if this wouldn't be a good time to turn the
> >> > economic-AA/Hopfield
> >> > net stuff over to some enthusiastic volunteer?
> >> >
> >> > The foundational work is done and what's left is to implement
> >> > improvements
> >> > and make it work better ... this is not really a bad stage for someone
> >> > else
> >> > to step in and help out ... esp. someone with experience playing with
> >> > neural nets, as this stuff is sort of neural-nettish even though the
> >> > specific
> >> > math is different...
> >>
> >> Yup, I'd be happy to mentor someone who wanted to get up to speed on
> >> it; there's a wishlist bug listed here related to improving
> >> performance, and a number of other bugs on things that'd be nice to
> >> have:
> >>
> >> https://bugs.launchpad.net/opencog/+bug/250357
> >>
> >> So any enthusiastic people are welcome to comment on bugs to register
> >> interest, reply to the list or email me directly.
> >>
> >> As extra incentive, if the work leads to a published paper then Ben
> >> and I are happy to share the authorship with any significant
> >> contributors. ;-)
> >>
> >> J
> >
> >
> >
> > --
> > Ben Goertzel, PhD
> > CEO, Novamente LLC and Biomind LLC
> > Director of Research, SIAI
> > ben@xxxxxxxxxxxx
> >
> > "Nothing will ever be attempted if all possible objections must be first
> > overcome " - Dr Samuel Johnson
> >
> >
> >
>
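
For what it's worth, here is a minimal sketch (hypothetical names, not the
actual AtomTable API) of the crude incrementally-updated approximation from
the postscript above, taking d at each cycle to be the difference between the
two Atoms' importance-bin ranks:

    # Sketch only -- just the iteration from the quoted postscript.
    class IncrementalSpearman:
        """Crude, incrementally updated approximation to the Spearman
        correlation of two Atoms' STI histories, using importance-bin
        ranks in place of exact STI ranks."""

        def __init__(self, n):
            self.n = float(n)  # effective window length, n > 1 (a rescaled decay factor)
            self.D = 0.0       # decayed running sum of squared rank differences

        def update(self, bin_rank_i, bin_rank_j):
            """Fold in one cycle's bin-rank difference between the two Atoms."""
            d = bin_rank_i - bin_rank_j
            self.D = self.D * self.n / (self.n + 1.0) + d * d
            return self.correlation()

        def correlation(self):
            n = self.n
            return 1.0 - 6.0 * self.D / (n * (n * n - 1.0))

When a full STI history is available, the exact Spearman correlation computed
from it lands on the same scale, so the two estimates can be mixed as
suggested above.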



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
ben@xxxxxxxxxxxx

"Nothing will ever be attempted if all possible objections must be first
overcome " - Dr Samuel Johnson
