
dolfin team mailing list archive

Re: Parallel assembly

 

>
>
> Johan Hoffman wrote:
>>> Johan Hoffman wrote:
>>>> Hi all,
>>>>
>>>> Connected to this discussion is also the MSc thesis work on DOLFIN
>>>> parallelization by Nicklas Jansson at KTH. He has now started working
>>>> on this based on the updated DOLFIN TODO list. He has tried to send an
>>>> email to this list (dolfin-dev@xxxxxxxxxx) but it appears that it is
>>>> stuck in a filter awaiting moderator approval.
>>> If he joins the list, he'll be able to make posts.
>>
>> Ok.
>>
>>>> Maybe someone (a moderator) could
>>>> help out so that we can get past this, to better coordinate
>>>> parallelization efforts?
>>>>
>>> One point on the TODO list: we discussed the mesh partitioning some
>>> time ago, and decided against ParMETIS or METIS because they do not
>>> use a GPL-compatible license. Magnus has implemented a nice
>>> partitioning interface which uses SCOTCH, which does have a
>>> GPL-compatible license.
>>
>> Ok. Does the switch to an LGPL license for DOLFIN make any difference,
>> or is it still a conflict?
>>
>
> There is still a conflict. The METIS license basically says that it can
> be used for non-profit purposes only, and permission is required to
> re-distribute it.

Ok, then there is a problem.

>> About SCOTCH: the argument was that it lacked parallel partitioning and
>> a few other nice features of ParMETIS. But it seems that SCOTCH v5.0 is
>> moving towards a parallel implementation as well?
>>
>
> It does have it now. That said, I can't see us using or needing parallel
> partitioning in the short- to medium-term future.

Ok. Maybe we'll manage with Scotch for now then.

As for parallel assembly, we will need this in the coming months, so we
will push the fully parallel approach within Nicklas' MSc project,
including parallel redistribution for adaptively refined meshes (which
ParMETIS seems to support nicely).
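What makes the fully parallel approach natural is that finite element assembly is additive: each process assembles the element contributions of its own cells, and entries for degrees of freedom shared between processes are summed. The following is only a serial sketch of that idea on a made-up 1D mesh (unit-load element vectors, round-robin cell ownership); it is not DOLFIN code, and real parallel assembly would replace the explicit summation loop with MPI communication of shared entries.

```cpp
#include <cstddef>
#include <vector>

// Assemble the load vector for num_cells linear cells on a line, but only
// over the cells owned by `rank` (round-robin ownership among `size` ranks).
// Each cell c contributes 0.5 to its left dof c and right dof c + 1
// (the element load vector for a unit source with h = 1).
std::vector<double> assemble(int num_cells, int rank, int size)
{
  std::vector<double> b(num_cells + 1, 0.0);
  for (int c = rank; c < num_cells; c += size)
  {
    b[c]     += 0.5;  // contribution to left dof of cell c
    b[c + 1] += 0.5;  // contribution to right dof of cell c
  }
  return b;
}

// Simulate the parallel algorithm serially: every "process" assembles its
// local part, then the parts are summed entry-by-entry. The summation loop
// stands in for the MPI communication that adds up shared dof entries.
std::vector<double> assemble_parallel(int num_cells, int nprocs)
{
  std::vector<double> b(num_cells + 1, 0.0);
  for (int rank = 0; rank < nprocs; ++rank)
  {
    std::vector<double> local = assemble(num_cells, rank, nprocs);
    for (std::size_t i = 0; i < b.size(); ++i)
      b[i] += local[i];
  }
  return b;
}
```

Because the contributions are purely additive, the result is independent of how the cells are distributed, which is why the partitioner (SCOTCH or ParMETIS) is free to optimize for load balance and communication volume.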

/Johan

> Garth
>
>> Also, it appears that SCOTCH is designed to mimic METIS to allow for a
>> common implementation of the two; in particular, v5.0 seems to provide
>> such a compatibility library, so maybe the differences are very mild. If
>> SCOTCH does not have what we need for parallel partitioning today, we
>> could use ParMETIS until it does, or at least allow for this option,
>> since the implementation should not change much (if at all).
>>
>> /Johan
>>
>>> Garth
>>>
>>>> Thanks!
>>>>
>>>> /Johan
>>>>
>>>>
>>>>> Anders Logg wrote:
>>>>>> On Sat, Dec 01, 2007 at 05:02:13PM +0000, Garth N. Wells wrote:
>>>>>>> Looks like you forgot to add MPIManager to the repository.
>>>>>>>
>>>>>>> Do we want a class MPIManager, or should we let PETSc take care of
>>>>>>> this? If we create an MPI object ourselves, it will probably clash
>>>>>>> with PETSc.
>>>>>> We need it if we sometimes want to use MPI without PETSc (which is
>>>>>> not
>>>>>> unlikely even if PETSc is the default).
>>>>>>
>>>>>> MPIManager works like PETScManager and takes care of the global
>>>>>> initialization at startup:
>>>>>>
>>>>>>   MPIManager::init();
>>>>>>
>>>>>> and also calls finalize() automatically when the program exits. It
>>>>>> talks to MPI to see if it has already been initialized (by PETSc,
>>>>>> itself or someone else) and does nothing if that is the case.
>>>>>>
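The init-once/finalize-at-exit pattern described above can be sketched without a working MPI installation; here a stand-in `backend` namespace plays the role of MPI_Initialized()/MPI_Init()/MPI_Finalize(), and the MPIManager name follows the message (the actual DOLFIN implementation may differ):

```cpp
#include <cstdlib>

// Stand-in for the MPI library: `initialized` plays the role of
// MPI_Initialized(), init()/finalize() the roles of MPI_Init()/MPI_Finalize().
namespace backend
{
  bool initialized = false;
  void init()     { initialized = true; }
  void finalize() { initialized = false; }
}

class MPIManager
{
public:
  // Initialize the backend only if nobody else (e.g. PETSc) has already
  // done so, and register finalize() to run automatically at program exit.
  static void init()
  {
    if (backend::initialized)
      return;  // already initialized by PETSc, ourselves, or someone else
    backend::init();
    we_initialized = true;
    std::atexit(MPIManager::finalize);
  }

  // Finalize only if we were the ones who initialized; calling this twice
  // (manually and again via atexit) is harmless.
  static void finalize()
  {
    if (we_initialized && backend::initialized)
      backend::finalize();
  }

private:
  static bool we_initialized;
};

bool MPIManager::we_initialized = false;
```

The key property is that `init()` is idempotent and defers to whoever got there first, which is what avoids the clash with PETSc's own MPI initialization.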
>>>>> OK.
>>>>>
>>>>> Garth
>>>>> _______________________________________________
>>>>> DOLFIN-dev mailing list
>>>>> DOLFIN-dev@xxxxxxxxxx
>>>>> http://www.fenics.org/mailman/listinfo/dolfin-dev
>>>>>
>>>>
>>>
>>
>>
>



