dolfin team mailing list archive
Message #09829
Re: [DOLFIN-DEV] Some Iterative Linear Solver Doubts.
> And using -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg is
> failing for solving with A?
Isn't that the same as using, in DOLFIN:
____________________________________________________
LinearSolver solver(gmres,amg);
____________________________________________________
> This doesn't make sense to me, I regularly
> use it for this problem. Is the memory blowup occurring during PC
> assembly?
I think so; it blows up right after this output:
____________________________________________________
Solving linear system of size 566769 x 566769 (Krylov solver).
____________________________________________________
If we want to pass command-line options, we have to use:
____________________________________________________
dolfin::SubSystemsManager::initPETSc(argc, argv);
____________________________________________________
But doesn't a
____________________________________________________
KrylovSolver solver(gmres, sor);  // for instance
____________________________________________________
override the options passed through (argc, argv)?
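To make the question concrete, here is a minimal sketch of the pattern under discussion, using the DOLFIN and PETSc names as they appear in this thread (a sketch only, not a tested program; the exact headers and enum names depend on the DOLFIN version):

```cpp
#include <dolfin.h>

using namespace dolfin;

int main(int argc, char* argv[])
{
  // Forward command-line options (e.g. -ksp_type gmres -pc_type hypre
  // -pc_hypre_type boomeramg) to the PETSc options database. This must
  // happen before any solver object is created, otherwise DOLFIN
  // initializes PETSc itself without the options.
  SubSystemsManager::initPETSc(argc, argv);

  // The open question in the thread: does constructing the solver with
  // explicit solver/preconditioner types override the options database
  // entries set above?
  KrylovSolver solver(gmres, sor);

  // ... assemble A and b, then: solver.solve(A, x, b);

  return 0;
}
```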
> Are you sure the matrix is correct?
I hope so... it works with other preconditioners (SOR is the fastest so far),
and I reduced the problem to a simple test case.
> Is it parallel or
> serial?
serial...
> How much storage does the matrix alone take?
About 1.5 GB.
> Does it work
> correctly for smaller problem sizes?
It worked fine for Poisson problems with even bigger systems.
> Have you tried with ML (-pc_type
> ml -mat_no_inode)? Multigrid should work very well for this problem.
I don't have it... I will rebuild PETSc with ML.
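For the record, ML can typically be pulled in when configuring PETSc (a config fragment only; flag names should be checked against the PETSc version in use):

```shell
# Configure PETSc so it downloads and builds the ML package:
./configure --download-ml

# Then at run time, as Jed suggested:
#   -pc_type ml -mat_no_inode
```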
>
> Jed
Thanks.
--
Nuno David Lopes
e-mail:ndl@xxxxxxxxxxxxxx (FCUL/CMAF)
nlopes@xxxxxxxxxxxxxxx (ISEL)
http://ptmat.ptmat.fc.ul.pt/%7Endl/
Thu Sep 25 10:35:19 WEST 2008