
dolfin team mailing list archive

Re: [Bug 681415] Re: Problem using sub-spaces

 

On 29 November 2010 10:46, Anders Logg <logg@xxxxxxxxx> wrote:

> On Mon, Nov 29, 2010 at 09:15:18AM -0000, Mikael Mortensen wrote:
> > I don't know if it's safe, but for now it works, and for NS it can
> > give you a speed-up of more than an order of magnitude. It does not
> > work in parallel, of course.
> >
> > Anyway, I just wanted you to know that people are using the subfunction
> > functionality for more than setting boundary conditions. :)
>
> Would it work just as well (as fast) to assemble into smaller blocks
> and then add them together?
>
>
Not quite sure what you mean. The main reason it's much faster componentwise
is that the velocity components are uncoupled, so the sparsity pattern on
each row is (for 3D flows) 3 times smaller than when assembling the whole
thing. If there were a way of telling FFC that the velocity components in
the VectorFunctionSpace are uncoupled, then you might obtain the same thing,
I guess, and you would get rid of a large number of zeros in the sparse
matrix structure.
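
For concreteness, something like the sketch below is what I have in mind.
The mesh size and the Laplacian form are just placeholders for illustration,
not what the actual NS solver assembles:

  from dolfin import *

  mesh = UnitCube(8, 8, 8)

  # Coupled approach: one VectorFunctionSpace for the whole velocity.
  # The operator is block diagonal, but it is stored with the full
  # vector-valued sparsity pattern.
  V = VectorFunctionSpace(mesh, "CG", 1)
  u, v = TrialFunction(V), TestFunction(V)
  A = assemble(inner(grad(u), grad(v))*dx)

  # Componentwise approach: one scalar space, reused for each of the
  # three velocity components. Each matrix row then has roughly 3 times
  # fewer nonzeros, and the same matrix can often be reused for all
  # components.
  Q = FunctionSpace(mesh, "CG", 1)
  p, q = TrialFunction(Q), TestFunction(Q)
  A_comp = assemble(inner(grad(p), grad(q))*dx)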

Mikael



> --
> Anders
>
> --
> Problem using sub-spaces
> https://bugs.launchpad.net/bugs/681415
>
> Status in DOLFIN: Confirmed
>
> Bug description:
> Hi,
>
> I have trouble using MixedFunctionSpace. Basically I want to project a
> function to a subspace of a MixedFunctionSpace:
>
>   from dolfin import *
>   mesh = UnitInterval(10)
>   U = FunctionSpace(mesh, "CG", 1)
>   V = FunctionSpace(mesh, "CG", 1)
>   W = U * V
>   f1 = Function(W.sub(0))
>   f1.vector()[:] = 1.0
>   # f2 = project(f1, V)       # This works!
>   f2 = project(f1, W.sub(1))  # This doesn't!
>
>
> The output of that script is attached at the end of this question. The
> confusing thing is that the projection works fine if one projects to V
> instead of W.sub(1), although they are mathematically the same space.
>
> I hope I haven't missed a similar question already reported.
>
> Best wishes,
>
> Simon
>
>
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
> [0]PETSC ERROR: or try
> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory
> corruption errors
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and
> run
> [0]PETSC ERROR: to get more information on the crash.
> [0]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [0]PETSC ERROR: Signal received!
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 3, Fri Jun  4 15:34:52
> CDT 2010
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Unknown Name on a linux-gnu named doodson by sf1409 Thu Nov
> 25 13:17:44 2010
> [0]PETSC ERROR: Libraries linked from
> /build/buildd/petsc-3.1.dfsg/linux-gnu-c-opt/lib
> [0]PETSC ERROR: Configure run at Fri Sep 10 04:57:14 2010
> [0]PETSC ERROR: Configure options --with-shared --with-debugging=0
> --useThreads 0 --with-clanguage=C++ --with-c-support
> --with-fortran-interfaces=1 --with-mpi-dir=/usr/lib/openmpi
> --with-mpi-shared=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack
> --with-umfpack=1 --with-umfpack-include=/usr/include/suitesparse
> --with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]"
> --with-spooles=1 --with-spooles-include=/usr/include/spooles
> --with-spooles-lib=/usr/lib/libspooles.so --with-hypre=1
> --with-hypre-dir=/usr --with-scotch=1
> --with-scotch-include=/usr/include/scotch
> --with-scotch-lib=/usr/lib/libscotch.so --with-hdf5=1 --with-hdf5-dir=/usr
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: User provided function() line 0 in unknown directory
> unknown file
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 59.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------