
dolfin team mailing list archive

Re: Parallel assembly


Hi all,

Related to this discussion is the MSc thesis work of Nicklas Jansson at
KTH on DOLFIN parallelization. He has now started working on this based
on the updated DOLFIN TODO list. He has tried to send an email to this
list (dolfin-dev@xxxxxxxxxx), but it appears to be stuck in a filter
awaiting moderator approval. Could a moderator help out so that we can
get past this and better coordinate the parallelization efforts?

Thanks!

/Johan


>
>
> Anders Logg wrote:
>> On Sat, Dec 01, 2007 at 05:02:13PM +0000, Garth N. Wells wrote:
>>> Looks like you forgot to add MPIManager to the repository.
>>>
>>> Do we want a class MPIManager, or should we let PETSc take care of
>>> this? If we create an MPI object ourselves, it will probably clash
>>> with PETSc.
>>
>> We need it if we sometimes want to use MPI without PETSc (which is not
>> unlikely even if PETSc is the default).
>>
>> MPIManager works like PETScManager and takes care of the global
>> initialization at startup:
>>
>>   MPIManager::init();
>>
>> and also calls finalize() automatically when the program exits. It
>> talks to MPI to see if it has already been initialized (by PETSc,
>> itself or someone else) and does nothing if that is the case.
>>
>
> OK.
>
> Garth
> _______________________________________________
> DOLFIN-dev mailing list
> DOLFIN-dev@xxxxxxxxxx
> http://www.fenics.org/mailman/listinfo/dolfin-dev
>
