Subject: Re: Slow PetscInitialize?
To: "Garth N. Wells" <gnw20@xxxxxxxxx>
From: Anders Logg <logg@xxxxxxxxx>
Date: Sun, 14 Oct 2012 20:43:14 -0500
Cc: DOLFIN Mailing List <dolfin@xxxxxxxxxxxxxxxxxxx>
In-reply-to: <CAA4C66MQ1=w1+xZsZqJjqKKfEYvOkbeiH-5ub2aA0FK3PPDung@mail.gmail.com>
User-agent: Mutt/1.5.21 (2010-09-15)
On Sun, Oct 14, 2012 at 06:33:55PM +0100, Garth N. Wells wrote:
> > for external_pkg in umfpack hypre mumps scalapack blacs ptscotch
> > scotch metis parmetis; do
> > CONFOPTS="${CONFOPTS} --download-${external_pkg}=1"
> > done
> >
> > Any of these that should be avoided? Any that should be added for a
> > better PETSc build?
> >
>
> You also have MPI, to which PETSc will make calls. I haven't used the
> Ubuntu OpenMPI package for a long time because I kept finding bugs in
> it (and it doesn't support threads), so now I just build my own MPI
> and don't bother with the MPI package.
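For concreteness, that loop combined with a locally built MPI expands to a
configure line roughly like the following (the --with-mpi-dir path is only an
example, not a path from this setup):

  # Rough expansion of the loop above; the MPI prefix is just an example
  # of pointing PETSc at a locally built MPI instead of the Ubuntu package.
  ./configure --with-mpi-dir=$HOME/local/openmpi \
      --download-umfpack=1 --download-hypre=1 --download-mumps=1 \
      --download-scalapack=1 --download-blacs=1 --download-ptscotch=1 \
      --download-scotch=1 --download-metis=1 --download-parmetis=1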
I've been trying various options today. I had trouble getting MPICH to
compile on my machine, and building OpenMPI myself gives the same result as
the Ubuntu OpenMPI package.
I'm not 100% sure that the Ubuntu OpenMPI isn't being picked up anyway. I
removed it first, but had to install it again since the Boost package
depends on it. Do you use the Ubuntu package for Boost, or do you install
that locally too?
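For what it's worth, something like the following should show which MPI is
actually being picked up (the library path just assumes the standard
PETSC_DIR/PETSC_ARCH layout; adjust as needed):

  which mpicc        # which compiler wrapper comes first on PATH
  mpicc -show        # MPICH: print the underlying compile/link line
  mpicc --showme     # OpenMPI: same idea
  ldd $PETSC_DIR/$PETSC_ARCH/lib/libpetsc.so | grep -i mpi   # which libmpi PETSc links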
--
Anders