dolfin team mailing list archive

[Bug 681415] Re: Problem using sub-spaces

 

** Changed in: dolfin
       Status: Confirmed => In Progress

-- 
You received this bug notification because you are a member of DOLFIN
Team, which is subscribed to DOLFIN.
https://bugs.launchpad.net/bugs/681415

Title:
  Problem using sub-spaces

Status in DOLFIN:
  In Progress

Bug description:
  Hi,

  I have trouble using MixedFunctionSpace. Basically I want to project a
  function to a subspace of a MixedFunctionSpace:

     from dolfin import *
     mesh = UnitInterval(10)
     U = FunctionSpace(mesh, "CG", 1)
     V = FunctionSpace(mesh, "CG", 1)
     W = U * V
     f1 = Function(W.sub(0))
     f1.vector()[:] = 1.0
     #f2 = project(f1, V)        # This works!
     f2 = project(f1, W.sub(1))  # This doesn't!

  The output of that script is attached at the end of this question. The confusing thing is that the projection works fine if one projects to V instead of W.sub(1), even though they are mathematically the same space.

  I hope I haven't missed a similar question that was already reported.

  Best wishes,

  Simon

  
  [0]PETSC ERROR: ------------------------------------------------------------------------
  [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
  [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
  [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
  [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
  [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run 
  [0]PETSC ERROR: to get more information on the crash.
  [0]PETSC ERROR: --------------------- Error Message ------------------------------------
  [0]PETSC ERROR: Signal received!
  [0]PETSC ERROR: ------------------------------------------------------------------------
  [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 3, Fri Jun  4 15:34:52 CDT 2010
  [0]PETSC ERROR: See docs/changes/index.html for recent updates.
  [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
  [0]PETSC ERROR: See docs/index.html for manual pages.
  [0]PETSC ERROR: ------------------------------------------------------------------------
  [0]PETSC ERROR: Unknown Name on a linux-gnu named doodson by sf1409 Thu Nov 25 13:17:44 2010
  [0]PETSC ERROR: Libraries linked from /build/buildd/petsc-3.1.dfsg/linux-gnu-c-opt/lib
  [0]PETSC ERROR: Configure run at Fri Sep 10 04:57:14 2010
  [0]PETSC ERROR: Configure options --with-shared --with-debugging=0 --useThreads 0 --with-clanguage=C++ --with-c-support --with-fortran-interfaces=1 --with-mpi-dir=/usr/lib/openmpi --with-mpi-shared=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-umfpack=1 --with-umfpack-include=/usr/include/suitesparse --with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]" --with-spooles=1 --with-spooles-include=/usr/include/spooles --with-spooles-lib=/usr/lib/libspooles.so --with-hypre=1 --with-hypre-dir=/usr --with-scotch=1 --with-scotch-include=/usr/include/scotch --with-scotch-lib=/usr/lib/libscotch.so --with-hdf5=1 --with-hdf5-dir=/usr
  [0]PETSC ERROR: ------------------------------------------------------------------------
  [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
  --------------------------------------------------------------------------
  MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
  with errorcode 59.

  NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
  You may or may not see output from other processes, depending on
  exactly when Open MPI kills them.
  --------------------------------------------------------------------------
