Re: PyDOLFIN, SWIG and overloaded constructor

On Mon, Oct 27, 2008 at 11:51:04AM +0200, Evan Lezar wrote:
> 
> 
> On Sun, Oct 26, 2008 at 7:21 PM, Matthew Knepley <knepley@xxxxxxxxx> wrote:
> 
>     On Sun, Oct 26, 2008 at 8:13 AM, Evan Lezar <evanlezar@xxxxxxxxx> wrote:
>     >
>     >
>     > On Sun, Oct 26, 2008 at 2:41 PM, Garth N. Wells <gnw20@xxxxxxxxx> wrote:
>     >>
>     >> Whether or not something is being run in parallel should be
>     >> determined by the number of processes,
>     >>
>     >>  uint dolfin::MPI::num_processes()
>     >>
>     >> Garth
>     >
>     > Yes, I am aware of that, but I want to have a bit more control over the
>     > parallelisation.  What I want is a number of processes each solving an
>     > independent eigenvalue system (actually I am doing a sweep in frequency
>     > for an electromagnetic problem and each frequency point has its own
>     > eigensystem).
> 
>     There is already an elegant mechanism for controlling this, namely the
>     communicator.  Every PETSc object has a communicator, which determines
>     how many processes participate.  Why make up another, less powerful
>     mechanism?
> 
>       Matt
> 
> 
> Fair enough.  I have to admit that I am new to PETSc and MPI programming in
> general, but from what I gather from the existing PETScMatrix code, the only
> two communicators used are PETSC_COMM_WORLD and PETSC_COMM_SELF.  The
> selection between these is based on the value of MPI::num_processes(), so I
> have no control over the communicator used, and in my opinion some work is
> needed in this regard.
>  
> Evan
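
[Sketch, not existing DOLFIN code: a minimal standalone illustration of the
communicator mechanism discussed above, written against the plain PETSc C
API.  The communicator handed to MatCreate() is what decides which processes
share the matrix; the sizes here are arbitrary and error checking is omitted
for brevity.]

#include <petscmat.h>

int main(int argc, char** argv)
{
  PetscInitialize(&argc, &argv, NULL, NULL);

  /* A matrix distributed over all processes in PETSC_COMM_WORLD */
  Mat A_global;
  MatCreate(PETSC_COMM_WORLD, &A_global);
  MatSetSizes(A_global, PETSC_DECIDE, PETSC_DECIDE, 100, 100);
  MatSetUp(A_global);

  /* A matrix private to each process: every rank owns all 100 rows itself,
     which is what an independent per-frequency eigensystem would use */
  Mat A_local;
  MatCreate(PETSC_COMM_SELF, &A_local);
  MatSetSizes(A_local, PETSC_DECIDE, PETSC_DECIDE, 100, 100);
  MatSetUp(A_local);

  MatDestroy(&A_local);
  MatDestroy(&A_global);
  PetscFinalize();
  return 0;
}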

Yes, we will provide some access to the communicators when we wrap up
the parallel implementation.

-- 
Anders


> 
>     > I did send an earlier mail discussing this.
>     >
>     > Any help with SWIG would be much appreciated.
>     >
>     > Evan
>     >
>     >>
>     >> Evan Lezar wrote:
>     >>>
>     >>> Hi
>     >>>
>     >>> I have added a constructor to PETScMatrix to allow me to control
>     >>> whether the matrix is to be created in parallel or not.  The
>     >>> declarations are given below
>     >>>
>     >>> /// Create empty matrix
>     >>> explicit PETScMatrix(Type type=default_matrix);
>     >>>
>     >>> /// Create an empty matrix explicitly setting the parallel flag
>     >>> explicit PETScMatrix(bool is_parallel, Type type=default_matrix);
>     >>>
>     >>> The idea is that I use
>     >>>
>     >>> S = PETScMatrix(False)
>     >>>
>     >>> if I want to create a matrix that is local.  The problem is that SWIG
>     >>> can't differentiate between Type and bool (and uint, for the
>     >>> PETScMatrix(uint M, uint N, Type type=default_matrix); constructor).
>     >>> I have seen that it is possible to rename PETScMatrix(bool) to
>     >>> PETScMatrix_bool(bool), for example, so that the Python call would be
>     >>>
>     >>> S = PETScMatrix_bool(False),
>     >>>
>     >>> but I can't seem to get this working.  Any suggestions?
>     >>>
>     >>> Thanks
>     >>> Evan
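
[Sketch, untested against the DOLFIN wrappers: SWIG's documented way of
resolving this kind of overload ambiguity is to %rename the offending
constructor, which exposes it to the target language as a separate factory
function.  The header path and qualified names below are assumptions about
the interface file layout, and the exact signature matching may need
adjusting because of the default Type argument.]

// In the SWIG interface file, before the PETScMatrix header is %include'd:
%rename(PETScMatrix_bool) dolfin::PETScMatrix::PETScMatrix(bool, dolfin::PETScMatrix::Type);

%include "dolfin/la/PETScMatrix.h"

With that in place the bool overload is reachable from Python as a
module-level function, i.e. S = PETScMatrix_bool(False).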
>     >>>
>     >>>
>     >>>
>     >>> ------------------------------------------------------------------------
>     >>>
>     >>> _______________________________________________
>     >>> DOLFIN-dev mailing list
>     >>> DOLFIN-dev@xxxxxxxxxx
>     >>> http://www.fenics.org/mailman/listinfo/dolfin-dev
>     >>
>     >
>     >
>     > _______________________________________________
>     > DOLFIN-dev mailing list
>     > DOLFIN-dev@xxxxxxxxxx
>     > http://www.fenics.org/mailman/listinfo/dolfin-dev
>     >
>     >
> 

> _______________________________________________
> DOLFIN-dev mailing list
> DOLFIN-dev@xxxxxxxxxx
> http://www.fenics.org/mailman/listinfo/dolfin-dev
