
Re: The role of * in UFL

 

2008/4/7, Anders Logg <logg@xxxxxxxxx>:
> On Mon, Apr 07, 2008 at 01:49:08PM +0200, Martin Sandve Alnæs wrote:
>  > 2008/4/7, Pearu Peterson <pearu@xxxxxxxxx>:
>  > > Martin Sandve Alnæs wrote:
>  > >
>  > >  > That's one of the trivial examples in my previous email. A*B and A*v
>  > >  > work out fine, but v*A and u*v would be invalid products in
>  > >  > Matlab/linear algebra notation. If you write (with tensors) "u v", it
>  > >  > (often, usually, always?) means outer(u, v). Translating from linear
>  > >  > algebra notation to UFL tensor notation, "u v" is invalid, "u^T v" is
>  > >  > dot(u,v) and "u v^T" is outer(u, v).
>  > >  >
>  > >  > "Paper notation" contains a lot of special cases that doesn't
>  > >  > generalize, and would lead to a tangled mess of code. That's why we
>  > >  > must make our own precise definitions, that are always well defined
>  > >  > and translates well to code. And since we're dealing with tensor
>  > >  > algebra, we must use tensor algebra as our basis and not linear
>  > >  > algebra where they differ.
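
As a concrete numeric check of the translation above, here is a small sketch
with NumPy standing in for the tensor notation (np.dot and np.outer are
stand-ins chosen for illustration, not the UFL operators themselves):

    import numpy as np

    u = np.array([1.0, 2.0])
    v = np.array([3.0, 4.0])

    print(np.dot(u, v))    # u^T v  -> scalar: 11.0
    print(np.outer(u, v))  # u v^T  -> rank-2 tensor (2x2 matrix)
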
>  > >
>  > >
>  > > FYI, in sympycore we have the same problem with matrices
>  > >  and there we have resolved it as follows (for details see
>  > >  http://code.google.com/p/sympycore/wiki/MatrixSupportIdeas):
>  > >
>  > >  0) currently vectors are represented as 1-column matrices
>  > >  1) by default, `A * B` is matrix product.
>  > >  2) matrices have attributes .A, .T, etc. (`.T` would be `^T` in your
>  > >  case) that return "array", "transpose", etc. *views* of matrices,
>  > >  and these views define what kind of multiplication will be used.
>  > >
>  > >  For example, `A.A * B` would be equivalent to elementwise multiplication
>  > >  of matrices, and `A.T * B` would be the dot product of matrices.
>  > >  Outer products could be modelled as `A.O * B`, for instance.
>  > >
>  > >  Regards,
>  > >
>  > > Pearu
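
The view idea described above can be sketched in plain Python; the Matrix and
_View classes below are hypothetical stand-ins for illustration and do not
reflect sympycore's actual classes or API:

    import numpy as np

    class Matrix:
        """Toy matrix whose views select the meaning of `*`."""
        def __init__(self, data):
            self.data = np.asarray(data, dtype=float)

        @property
        def A(self):   # elementwise ("array") view
            return _View(self.data, lambda a, b: a * b)

        @property
        def T(self):   # transposed view: A.T * B is the dot product
            return _View(self.data, lambda a, b: a.T @ b)

        @property
        def O(self):   # outer-product view
            return _View(self.data, lambda a, b: np.multiply.outer(a, b))

        def __mul__(self, other):  # default: matrix product
            return Matrix(self.data @ other.data)

    class _View:
        def __init__(self, data, op):
            self.data, self.op = data, op

        def __mul__(self, other):
            return Matrix(self.op(self.data, other.data))

With this toy class, Matrix(A) * Matrix(B) is the matrix product, while
Matrix(A).A * Matrix(B) multiplies elementwise and Matrix(A).O * Matrix(B)
gives the outer product.
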
>  >
>  > I've already implemented A.T the same way. But I think, e.g., outer(A,
>  > B) is clearer than A.O*B, and "A.O" by itself doesn't make sense to
>  > me. Also, I've done inv(A) instead of A.I.
>  >
>  > A.A*B we would currently have to write with manual loops, but I'm not
>  > sure we need it.
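
For comparison, a minimal sketch of the free-function spelling preferred here,
with NumPy as a stand-in (the calls simply mirror the outer, inv and .T
operators under discussion, not any particular UFL release):

    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    B = np.array([[0.0, 1.0], [1.0, 0.0]])

    C = np.multiply.outer(A, B)   # outer(A, B) instead of A.O*B
    Ainv = np.linalg.inv(A)       # inv(A) instead of A.I
    At = A.T                      # the transpose is still spelled A.T
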
>
>
> My suggestion would be to first have some very generic operators like
>  dot() and inner() which work for all kinds of tensors (when the
>  dimensions match). Then we add the * operator for a few linear algebra
>  operations that we want to support:
>
>   A*B      rank(A) == rank(B) == 2
>   A*x      rank(A) == 2 and rank(x) == 1
>   x.T*y    rank(x) == rank(y) == 1 and transposed(x) and not transposed(y)
>   x.T*A    rank(x) == 1 and rank(A) == 2 and transposed(x)
>
>  Are there any others?
>
>  For outer products, one would need to write outer(x, y), which is very
>  natural.
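
A plain-Python outline of these rank-based rules might look as follows; this
is only an illustration of the dispatch, not UFL's implementation, and the
a_transposed flag is a hypothetical stand-in for x.T:

    import numpy as np

    def star(a, b, a_transposed=False):
        """Interpret a*b for the linear-algebra cases listed above."""
        ra, rb = np.ndim(a), np.ndim(b)
        if ra == 2 and rb == 2:
            return a @ b                 # A*B: matrix-matrix product
        if ra == 2 and rb == 1:
            return a @ b                 # A*x: matrix-vector product
        if ra == 1 and rb == 1 and a_transposed:
            return np.dot(a, b)          # x.T*y: scalar product
        if ra == 1 and rb == 2 and a_transposed:
            return a @ b                 # x.T*A: vector-matrix product
        raise ValueError("no '*' rule for ranks (%d, %d)" % (ra, rb))
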

I don't agree with the x.T stuff. There is no distinction between
columns and rows in tensor algebra, so the transpose has no meaning
for a rank-1 tensor. This has to generalize to the case where x is a
rank-1 expression, not only an explicitly constructed vector. If you
have a rank-4 tensor and pick a rank-1 sub-tensor from it, is it a
column or a row? Columns and rows don't generalize, and special cases
are evil.
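
A quick NumPy illustration of this point: a rank-1 sub-tensor picked out of a
rank-4 tensor is just a 1-D array with no row/column orientation, so
transposing it changes nothing:

    import numpy as np

    C = np.arange(16, dtype=float).reshape(2, 2, 2, 2)  # a rank-4 tensor
    x = C[0, 1, :, 0]         # rank-1 sub-tensor: neither a row nor a column
    assert x.shape == (2,)
    assert (x.T == x).all()   # transposing a 1-D array is a no-op
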

--
Martin

