Re: Buildbot

 

On jaunty-amd64, demo/pde/simple/cpp hangs with this error:

fenics-slave@byggmester:cpp$ mpirun -n 2 ./demo
Process 0: Debug: creating local mesh data [at
dolfin/mesh/MeshPartitioning.cpp:62 in partition()]
Process 0: Number of global vertices: 1089
Process 0: Number of global cells: 2048
Process 0: Debug: check [at dolfin/mesh/LocalMeshData.cpp:158 in
broadcast_mesh_data()]
Process 0: Debug: check [at dolfin/mesh/LocalMeshData.cpp:172 in
broadcast_mesh_data()]
Process 0: Sending 545 vertices to process 0, range is (0, 545)
Process 0: Sending 544 vertices to process 1, range is (545, 1089)
Process 1: Debug: creating local mesh data [at
dolfin/mesh/MeshPartitioning.cpp:62 in partition()]
Process 1: Debug: check [at dolfin/mesh/LocalMeshData.cpp:228 in
receive_mesh_data()]
Process 1: Debug: check [at dolfin/mesh/LocalMeshData.cpp:240 in
receive_mesh_data()]
Process 1: Received 544 vertex coordinates
Process 1: Debug: check [at dolfin/mesh/LocalMeshData.cpp:248 in
receive_mesh_data()]
Process 0: Received 545 vertex coordinates
Process 0: Debug: check [at dolfin/mesh/LocalMeshData.cpp:191 in
broadcast_mesh_data()]
Process 0: Received 545 vertex indices
Process 0: Debug: check [at dolfin/mesh/LocalMeshData.cpp:205 in
broadcast_mesh_data()]
Process 0: Sending 1024 cells to process 0, range is (0, 1024)
Process 0: Sending 1024 cells to process 1, range is (1024, 2048)
Process 1: Received 544 vertex indices
Process 1: Debug: check [at dolfin/mesh/LocalMeshData.cpp:256 in
receive_mesh_data()]
Process 1: Received 1024 cell vertices
Process 1: Debug: created local mesh data [at
dolfin/mesh/MeshPartitioning.cpp:64 in partition()]
Process 0: Received 1024 cell vertices
Process 0: Debug: created local mesh data [at
dolfin/mesh/MeshPartitioning.cpp:64 in partition()]
Process 1: Partitioned mesh, edge cut is 39.
Process 0: Partitioned mesh, edge cut is 39.
Process 1: Building parallel dof map
Process 0: Building parallel dof map
Process 0: Finished building parallel dof map
Process 0: Building parallel dof map
Process 1: Finished building parallel dof map
Process 1: Building parallel dof map
Process 0: Finished building parallel dof map
Process 0: Solving linear variational problem
Process 1: Finished building parallel dof map
Process 1: Solving linear variational problem
  Process 0: Matrix of size 1089 x 1089 has 3819 nonzero entries.
  Process 0: Diagonal: 3589 (93.9775%), off-diagonal: 53 (1.3878%),
non-local: 177 (4.63472%)
  Process 1: Matrix of size 1089 x 1089 has 3810 nonzero entries.
  Process 1: Diagonal: 3588 (94.1732%), off-diagonal: 54 (1.41732%),
non-local: 168 (4.40945%)
  Process 1: Creating parallel PETSc Krylov solver (for LU factorization).
  Process 0: Creating parallel PETSc Krylov solver (for LU factorization).
  Process 1: Creating parallel PETSc Krylov solver (for LU factorization).
  Process 0: Creating parallel PETSc Krylov solver (for LU factorization).
  Process 1: Solving linear system of size 1089 x 1089 (PETSc LU solver, mumps).
  Process 0: Solving linear system of size 1089 x 1089 (PETSc LU solver, mumps).
[1]PETSC ERROR: --------------------- Error Message
------------------------------------
[1]PETSC ERROR: Nonconforming object sizes!
[1]PETSC ERROR: Sum of local lengths 565 does not equal global length
564, my local length 282
  likely a call to VecSetSizes() or MatSetSizes() is wrong.
See http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#PetscSplitOwnership!
[1]PETSC ERROR:
------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Release Version 3.0.0, Patch 8, Fri Aug 21
14:02:12 CDT 2009
[1]PETSC ERROR: See docs/changes/index.html for recent updates.
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[1]PETSC ERROR: See docs/index.html for manual pages.
[1]PETSC ERROR:
------------------------------------------------------------------------
[1]PETSC ERROR: Unknown Name on a linux-gnu named byggmester by
fenics-slave Thu Sep 17 12:35:15 2009
[1]PETSC ERROR: Libraries linked from
/usr/local/src/petsc-3.0.0-p8/linux-gnu-cxx-debug/lib
[1]PETSC ERROR: Configure run at Wed Sep 16 09:03:03 2009
[1]PETSC ERROR: Configure options --with-clanguage=cxx --with-shared=1
--with-x=0 --with-x11=0 --with-umfpack=1
--with-umfpack-include=/usr/include/suitesparse
--with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]"
--with-superlu=1 --with-superlu-include=/usr/include/superlu
--with-superlu-lib=/usr/lib/libsuperlu.so --with-parmetis=1
--with-parmetis-dir=/usr/include/parmetis
--with-hypre-include=/usr/include
--with-hypre-lib=/usr/lib/libHYPRE.so --download-scalapack=1
--download-blacs=1 --download-mumps=1
[1]PETSC ERROR:
------------------------------------------------------------------------
[1]PETSC ERROR: PetscSplitOwnership() line 94 in src/sys/utils/psplit.c
[1]PETSC ERROR: PetscMapSetUp() line 136 in src/vec/vec/impls/mpi/pmap.c
[1]PETSC ERROR: VecCreate_MPI_Private() line 182 in
src/vec/vec/impls/mpi/pbvec.c
[1]PETSC ERROR: VecCreate_MPI() line 232 in src/vec/vec/impls/mpi/pbvec.c
[1]PETSC ERROR: VecSetType() line 54 in src/vec/vec/interface/vecreg.c
[1]PETSC ERROR: VecCreateMPI() line 42 in src/vec/vec/impls/mpi/vmpicr.c
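
For what it's worth, the PETSc message itself gives a hint: rank 1 reports
a local length of 282, so rank 0 is presumably claiming 283 (565 - 282),
while the global length passed in is 564, i.e. the two local ranges seem to
double-count one entry. The same PetscSplitOwnership()/VecCreateMPI()
failure is easy to trigger by hand; here is a minimal sketch (hypothetical
sizes taken from the numbers above, not DOLFIN code):

#include <petscvec.h>

int main(int argc, char** argv)
{
  PetscInitialize(&argc, &argv, (char*)0, (char*)0);

  PetscMPIInt rank;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* Hypothetical split mirroring the error message: the local sizes
   * sum to 283 + 282 = 565, but the global size passed in is 564, so
   * PetscSplitOwnership() fails with "Nonconforming object sizes". */
  PetscInt nlocal  = (rank == 0) ? 283 : 282;
  PetscInt nglobal = 564;

  Vec x;
  VecCreateMPI(PETSC_COMM_WORLD, nlocal, nglobal, &x);

  VecDestroy(x);   /* PETSc 3.0.x signature */
  PetscFinalize();
  return 0;
}

So somewhere the local and global sizes handed to PETSc disagree between
the two processes, as the message suggests.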

On hardy-i386, demo/pde/poisson/cpp also hangs. Unfortunately, I can't
check the output at the moment.

We should really do something about the demos hanging when they run
in parallel. Any ideas?
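
My guess as to why these runs hang rather than simply failing is that one
rank hits the error and drops out of the solve while the other rank is
still sitting inside a collective MPI call, which then never completes.
A minimal illustration of that failure mode (not DOLFIN code, just the
MPI behaviour I suspect):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char** argv)
{
  int rank;
  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  if (rank == 1)
  {
    /* Pretend this rank hit the PETSc error and bailed out of the solve */
    fprintf(stderr, "rank 1: error, skipping the collective call\n");
  }
  else
  {
    /* The other rank is still inside the collective operation and
     * blocks here forever, so mpirun (and the buildbot run) never returns */
    MPI_Barrier(MPI_COMM_WORLD);
  }

  MPI_Finalize();
  return 0;
}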

Johannes

