
dolfin team mailing list archive

Re: pyDOLFIN and MPI

 

On Thursday 14 August 2008 08:52:21 Garth N. Wells wrote:
> Anders Logg wrote:
> > On Wed, Aug 13, 2008 at 08:03:39PM +0100, Garth N. Wells wrote:
> >> I'm experiencing a puzzling problem with pyDOLFIN and MPI again.
> >>
> >> When I do
> >>
> >>      python file.py
> >>
> >> where file.py is just
> >>
> >>      from dolfin import *
> >>
> >>      object = Function("/tmp/fileKFnQpl.xml")
> >>      plot(object)
> >>      interactive()
> >>
> >> I see a plot as expected, and get
> >>
> >>      Plot active, press 'q' to continue.
> >>
> >> After pressing 'q', I get
> >>
> >>      *** An error occurred in MPI_Attr_get
> >>      *** after MPI was finalized
> >>      *** MPI_ERRORS_ARE_FATAL (goodbye)
> >>      [gnw20pc:2277] Abort before MPI_INIT completed successfully; not
> >> able to guarantee that all other processes were killed!
> >>      *** An error occurred in MPI_Comm_rank
> >>      *** after MPI was finalized
> >>      *** MPI_ERRORS_ARE_FATAL (goodbye)
> >>      *** An error occurred in MPI_Type_free
> >>      *** after MPI was finalized
> >>      *** MPI_ERRORS_ARE_FATAL (goodbye)
> >>      Segmentation fault
> >>
> >> Somehow, Python appears to be calling MPI_Finalize before DOLFIN gets a
> >> chance to finalise things correctly. Any ideas/experience on how Python
> >> interacts with MPI? I've commented out MPI_Finalize() in DOLFIN to be
> >> sure that DOLFIN is not calling it.
> >>
> >> Garth
> >
> > Would it help if we just call MPI_Finalized to check before
> > finalizing? We can add a wrapper for it just like for MPI_Initialized.
>
> I've added some annoying debug output to SubSystemsManager. Can you tell
> me what you see when running
>
>      from dolfin import *
>      x = PETScVector(10)

Hello Garth,

My output is:

>>> from dolfin import *
>>> x = PETScVector(10)
MPI status in initPETSc() 0
MPI status in initMPI() 1

It looks like I do not have the problem.
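
For context, I read those status values as nothing more than the result of an MPI_Initialized query made just before PETSc is initialised. A rough sketch of that kind of check (the function name below is a placeholder, not the actual SubSystemsManager code):

     #include <mpi.h>
     #include <petsc.h>
     #include <cstdio>

     // Placeholder for the check that presumably sits behind the
     // "MPI status in initPETSc()" output: query MPI before touching PETSc.
     void init_petsc_sketch(int argc, char* argv[])
     {
       // Ask MPI whether someone has already called MPI_Init
       int mpi_initialised = 0;
       MPI_Initialized(&mpi_initialised);
       std::printf("MPI status in initPETSc() %d\n", mpi_initialised);

       // If MPI is not yet initialised, PETSc initialises it here and
       // will then also expect to finalise it itself.
       PetscInitialize(&argc, &argv, 0, 0);
     }

Run from C++ this prints 0, as Garth says below; the puzzle is why the same query returns 1 when the module is loaded from Python.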

I use the petsc and openmpi packages from Ubuntu Hardy.
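
Regarding Anders' suggestion of checking MPI_Finalized before finalising: a minimal sketch of such a guard might look like this (again, the names are placeholders rather than the actual DOLFIN code):

     #include <mpi.h>

     // Placeholder for a finalisation guard; it would mirror the existing
     // MPI_Initialized wrapper that Anders mentions.
     void finalize_mpi_sketch()
     {
       int mpi_initialised = 0;
       int mpi_finalised = 0;
       MPI_Initialized(&mpi_initialised);
       MPI_Finalized(&mpi_finalised);

       // Only finalise if MPI was actually started and nobody else
       // (e.g. Python or PETSc) has already shut it down.
       if (mpi_initialised && !mpi_finalised)
         MPI_Finalize();
     }

That would at least avoid the "after MPI was finalized" errors, even if it does not explain who is calling MPI_Finalize first.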

Johan

> When I run it, the first line of output is
>
>      MPI status in initPETSc() 1
>
> which indicates that MPI_Initialized is reporting that MPI has been
> initialised before PETSc is initialised, but we haven't initialised it.
> The same code in C++ gives
>
>      MPI status in initPETSc() 0
>
> which is the expected result.
>
> Garth



