
dolfin team mailing list archive

Re: Applying Dirichlet conditions by removing degrees of freedom (was [Fwd: Re: [HG DOLFIN] merge])

 

On Mon, Aug 18, 2008 at 02:12:11PM +0200, Jed Brown wrote:
> On Mon 2008-08-18 12:25, Kent-Andre Mardal wrote:
> > On ma., 2008-08-18 at 11:38 +0200, Jed Brown wrote:
> > > On Sun 2008-08-17 20:21, kent-and@xxxxxxxxx wrote:
> > > > I have never had any trouble with the additional diagonal matrix due to
> > > > Dirichlet conditions, but I have mostly used multigrid. In K.-A. Mardal
> > > > and R. Winther, Uniform Preconditioners for the Time Dependent Stokes
> > > > Problem, Numerische Mathematik 98(2):305-327, 2004, we tested multigrid
> > > > methods for Stokes-type problems with Taylor-Hood, P2-P0, Mini and
> > > > Crouzeix-Raviart elements (the CR experiments did not end up in the final
> > > > paper, but the numbers can be found in my thesis). The condition number
> > > > was below 20 for all methods, independent of mesh size. I have done
> > > > similar experiments with AMG.
> > > 
> > > Interesting paper, thanks for the reference.
> > > 
> > > I think we might be addressing a somewhat different problem since a
> > > significant part of A (at least half in the parameter range considered)
> > > comes from the mass matrix so zeroing rows will cause a much smaller
> > > perturbation to the condition number.  What is your experience as the
> > > time step goes to infinity?  In that case your preconditioner becomes a
> > > special case of the standard P_d = diag(A, -S) where S is replaced by
> > > the identity (or mass matrix) which is spectrally equivalent.  In my
> > > experience, this gives a very weak preconditioner, although it is cheap
> > > to apply.  With the full P_d above, P_d^{-1} A has three distinct
> > > eigenvalues: 1 and (1 +/- sqrt(5))/2.  However, if
> > > the Schur complement is only approximated, these become clusters and I
> > > have found convergence to be erratic.  I have had much better luck with
> > > preconditioners arising from the block LDU decomposition
> > > 
> > >   [ A   B_1'] = [     I       0] [A  0] [I  A^{-1} B_1']
> > >   [B_2    0 ]   [B_2 A^{-1}   I] [0  S] [0       I     ]
> > > 
> > > or from block triangular preconditioners also involving the Schur
> > > complement (although these seem less robust than the LDU version for the
> > > nonlinear problem and more sensitive to incomplete solves with S).  The
> > > condition numbers I was quoting are for the Schur complement, not for
> > > the preconditioned Jacobian (I normally make the preconditioner strong
> > > enough to fully converge in less than 10 iterations).  Note that the
> > > numbers I gave were influenced by the fact that the divergence matrix
> > > B=B_2=B_1 was actually of spectral order (49 in that case, applied by
> > > FFT) although the preconditioner for A was based on Q1 finite elements.
> > > While these are spectrally equivalent, it makes a difference in terms of
> > > constants.  Also, note that the mixed space was Q_p - Q_{p-2}, on which
> > > the inf-sup constant degrades like 1/sqrt(p), so this influences the
> > > conditioning of the matrix.
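[As a concrete sketch of how a factorization like this is applied as a preconditioner (illustration only, not code from this thread; hat_Ainv, hat_Sinv and the matrix names are placeholders for approximate solvers and assembled blocks):

def apply_ldu_preconditioner(r_u, r_p, B1T, B2, hat_Ainv, hat_Sinv):
    """Return (z_u, z_p) ~= P^{-1}(r_u, r_p) for P = L D U as written above.

    hat_Ainv(b) -- approximate solve with A (e.g. one multigrid cycle)
    hat_Sinv(b) -- approximate solve with S = -B2 A^{-1} B1' (e.g. a
                   pressure mass-matrix solve, up to sign convention)
    """
    x = hat_Ainv(r_u)            # A^{-1} r_u, reused in the last step
    y = hat_Sinv(r_p - B2 @ x)   # L^{-1} elimination, then the S block of D^{-1}
    z_u = x - hat_Ainv(B1T @ y)  # U^{-1}: velocity correction from the pressure
    return z_u, y

Each application costs two approximate solves with A and one with S.]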
> > > 
> > > I am very interested in other experiences with this problem.  I'm fairly
> > > new to the field and implemented something which seemed reasonably
> > > general.  The following is a picture of the scaling for my Stokes
> > > solver, using the LDU form above as a preconditioner for FGMRES on the
> > > outer problem, versus a Poisson problem.  The preconditioner for the
> > > uniformly positive definite operators was ML with PETSc default
> > > parameters.  The discretization was Chebyshev collocation in 3D using
> > > the mixed space Q_p - Q_{p-2}, applied matrix-free via FFT, with
> > > preconditioning matrices (A and M) assembled using Q1 elements.  All
> > > experiments were run using one core of my laptop.
> > > 
> > >   http://59A2.org/files/cheb-scaling.png
> > > 
> > > Note that the slope is almost exactly 1 for the Stokes case, better than
> > > for the Poisson problem.  I believe this is due to S being better
> > > approximated by M as the grid is refined, but it is also influenced by p
> > > and p-2 both being good sizes for the DCT.
> > > 
> > > The people I have talked to who tried these preconditioners in a code
> > > which enforces boundary conditions by zeroing rows of the matrix have
> > > not been happy with the convergence.  This may be an anomaly, but I have
> > > attributed it to the condition number of the Schur complement being
> > > higher.
> > > 
> > > Jed
> > 
> > In the paper I mentioned above, the mass matrix is added to the Stokes
> > problem for the sake of presentation, i.e. it is easy to go from Darcy to
> > Stokes-type flow with only one parameter. Removing the mass matrix does
> > not change the condition number much; in fact, the condition number will
> > then be smaller. You can find my experiments on a pure Stokes problem in
> > K.-A. Mardal, J. Sundnes, H. P. Langtangen, and A. Tveito, Block
> > Preconditioning and Systems of PDEs, in: Advanced Topics in Computational
> > Partial Differential Equations: Numerical Methods and Diffpack
> > Programming, ed. by H. P. Langtangen and A. Tveito, Lecture Notes in
> > Computational Science and Engineering, Springer-Verlag, 2003, pp. 199-236.
> > I can send the chapter to you if you don't have access to the book.
> 
> I found a postscript.  I have also tried this preconditioner (diag(A,M)
> where A is replaced by a multigrid cycle) and it converges in a constant
> number of iterations, but it takes many more than the LDU
> preconditioner.  I found that the runtime was twice as long as the LDU
> version and less robust to nonlinearities (in rheology).  This may be a
> peculiarity related to the discretization I'm using.
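[In code, the diag(A, M) preconditioner amounts to two independent block solves; a minimal sketch with placeholder names amg_cycle and mass_solve for the approximate solvers:

def apply_block_diag_preconditioner(r_u, r_p, amg_cycle, mass_solve):
    """Return (z_u, z_p) ~= diag(A, M)^{-1}(r_u, r_p)."""
    z_u = amg_cycle(r_u)    # multigrid cycle replacing the solve with A
    z_p = mass_solve(r_p)   # pressure mass matrix in place of the Schur complement
    return z_u, z_p]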
> 
> > However, the analysis of preconditioners for the Stokes problem is well
> > known, e.g. Rusten and Winther, A Preconditioned Iterative Method for
> > Saddle Point Problems, 1992. Elman, Wathen and Silvester also have
> > several papers on this, although I am not sure whether any of these
> > papers have numerical experiments with multigrid methods.
> 
> The Benzi, Golub and Liesen 2005 review is quite comprehensive.
> Here is a recent one:
> @article{larin2008cse,
>   title={{A comparative study of efficient iterative solvers for
>   generalized Stokes equations}},
>   author={Larin, M. and Reusken, A.},
>   journal={Numerical Linear Algebra with Applications},
>   volume={15},
>   number={1},
>   pages={13},
>   year={2008},
>   publisher={John Wiley \& Sons, Ltd}
> }
> 
> Note that for the largest problem size discussed, my solver converges in
> about half the time of their best result, although my computer is
> somewhat faster, so this comparison is questionable.  In any case, the
> high order operator doesn't seem to dramatically slow convergence (in my
> tests, 1.5 to 2 times the number of iterations are required compared to
> the straight Q1 case).
> 
> > My guess is that the increase in the condition number in your case
> > comes from two things: the deterioration of the inf-sup condition
> > and the low-order preconditioner. I guess you are aware of the works
> > that address low-order preconditioners for spectral elements? I don't
> > remember the details now, but I know that at least the following papers
> > address the issue.
> > 
> > C. Canuto and A. Quarteroni (1985)
> > M. O. Deville and E. H. Mund (1985)
> > S. D. Kim and S. V. Parter (1997)
> > J. S. Shen, F. Wang and J. Xu (2000)
> 
> Also this one where spectral equivalence is proved:
> 
> @article{kim2007pbp,
>   title={{Piecewise bilinear preconditioning of high-order finite element methods}},
>   author={Kim, S.D.},
>   journal={Electronic Transactions on Numerical Analysis},
>   volume={26},
>   pages={228--242},
>   year={2007}
> }
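[As a rough 1D illustration of this low-order preconditioning idea (my own sketch, not taken from any of the cited papers): precondition the Chebyshev collocation Laplacian with the lumped-mass P1 stiffness matrix assembled on the same nodes and inspect the spectrum of the preconditioned operator.

import numpy as np

def cheb(N):
    """Chebyshev differentiation matrix and nodes (Trefethen, Spectral Methods in MATLAB)."""
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))
    return D, x

N = 32
D, x = cheb(N)
L = -(D @ D)[1:N, 1:N]          # collocation -u'' with homogeneous Dirichlet BCs

# P1 stiffness matrix and lumped mass on the interior Chebyshev nodes
h = -np.diff(x)                  # element sizes (x is decreasing)
K = np.zeros((N - 1, N - 1))
M = np.zeros(N - 1)
for i in range(N - 1):
    hl, hr = h[i], h[i + 1]
    K[i, i] = 1.0 / hl + 1.0 / hr
    if i > 0:
        K[i, i - 1] = -1.0 / hl
    if i < N - 2:
        K[i, i + 1] = -1.0 / hr
    M[i] = 0.5 * (hl + hr)

P = K / M[:, None]               # lumped-mass-scaled low-order operator
lam = np.real(np.linalg.eigvals(np.linalg.solve(P, L)))
print(N, lam.min(), lam.max())

The extreme eigenvalues should remain bounded as N grows (classically they cluster in roughly [1, pi^2/4] for this lumped/finite-difference variant), which is the kind of spectral equivalence these papers establish for the low-order preconditioner.]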
> 
> The condition number of the preconditioned Jacobian is very good (I
> normally make it <1.1 by shifting enough work into the inner iteration).
> I was referring to the condition number of the Schur complement, for which
> I have seen no explicit reference in the papers you have cited.
> 
> Maybe I'm making this out to be more than it is, but it is certainly the
> case that the spectrum of the (unpreconditioned) operator can change
> significantly when boundary conditions are imposed by zeroing rows.
> While this has an effect on the Schur complement, it is difficult to
> quantify and probably always sensitive to the specifics of the
> discretization.  The safe thing is to remove the degrees of freedom from
> the global system, and this is quite easy to do.  I was just suggesting
> that it would be worth discussing, and now seemed like a good time.
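[For reference, eliminating the constrained degrees of freedom from an already assembled system can be done along the following lines (a generic SciPy sketch, not DOLFIN's implementation; the function name and arguments are made up for illustration):

import numpy as np
import scipy.sparse.linalg as spla

def remove_dirichlet_dofs(A, b, bc_dofs, bc_vals):
    """Solve A x = b with x[bc_dofs] = bc_vals by eliminating those dofs."""
    bc_dofs = np.asarray(bc_dofs)
    bc_vals = np.asarray(bc_vals, dtype=float)
    free = np.setdiff1d(np.arange(A.shape[0]), bc_dofs)
    x = np.zeros(A.shape[0])
    x[bc_dofs] = bc_vals
    A = A.tocsr()
    # move the known boundary values to the right-hand side ...
    rhs = b[free] - A[free][:, bc_dofs] @ bc_vals
    # ... and solve the reduced system on the free dofs only
    # (which stays symmetric if A is symmetric)
    x[free] = spla.spsolve(A[free][:, free], rhs)
    return x]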
> 
> Jed
> 
> PS: It would be cool if the Stokes demo included a preconditioning
> matrix so that iterative linear algebra would work.  A block diagonal or
> block triangular form would be easy to do.
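[Such a demo could, for example, assemble a block-diagonal preconditioning matrix (velocity Laplacian plus pressure mass matrix) and hand both operators to a Krylov solver. A sketch against a later DOLFIN Python interface than the one current in this thread; mesh, elements, boundary conditions and solver choices are only illustrative:

from dolfin import *

# Taylor-Hood discretization of Stokes
mesh = UnitCubeMesh(16, 16, 16)
V = VectorFunctionSpace(mesh, "CG", 2)
Q = FunctionSpace(mesh, "CG", 1)
W = V * Q

(u, p) = TrialFunctions(W)
(v, q) = TestFunctions(W)
f = Constant((0.0, 0.0, -1.0))

# Stokes system and right-hand side
a = inner(grad(u), grad(v))*dx + div(v)*p*dx + div(u)*q*dx
L = inner(f, v)*dx

# Preconditioning form: velocity Laplacian plus pressure mass matrix
b = inner(grad(u), grad(v))*dx + p*q*dx

# No-slip velocity everywhere (pressure is then determined only up to a constant)
bc = DirichletBC(W.sub(0), Constant((0.0, 0.0, 0.0)), DomainBoundary())

A, rhs = assemble_system(a, L, bc)
P, _ = assemble_system(b, L, bc)

# MINRES with an algebraic multigrid preconditioner backend
solver = KrylovSolver("minres", "amg")
solver.set_operators(A, P)

w = Function(W)
solver.solve(w.vector(), rhs)]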

Can you submit a patch? Sounds like a good demo.

-- 
Anders


