Re: Interpolation in parallel
On Wed, Aug 11, 2010 at 11:38:12AM +0200, Ola Skavhaug wrote:
> On Mon, Aug 9, 2010 at 4:19 PM, Anders Logg <logg@xxxxxxxxx> wrote:
> > On Mon, Aug 09, 2010 at 03:02:18PM +0100, Garth N. Wells wrote:
> >> On Mon, 2010-08-09 at 15:54 +0200, Ola Skavhaug wrote:
> >> > On Mon, Aug 9, 2010 at 3:39 PM, Garth N. Wells <gnw20@xxxxxxxxx> wrote:
> >> > > On Mon, 2010-08-09 at 15:32 +0200, Ola Skavhaug wrote:
> >> > >> Hi,
> >> > >>
> >> > >> when running DOLFIN in parallel, data reduction would be a nice thing
> >> > >> to have. Typically, it would be good to have something like this:
> >> > >>
> >> > >> from dolfin import *
> >> > >>
> >> > >> mesh = UnitSquare(2000, 2000)
> >> > >> reduced_mesh = UnitSquare(200, 200)
> >> > >>
> >> > >> V = FunctionSpace(mesh, "CG", 1)
> >> > >> reduced_V = FunctionSpace(reduced_mesh, "CG", 1)
> >> > >>
> >> > >> u = Function(V)
> >> > >>
> >> > >> ...
> >> > >>
> >> > >> reduced_u = interpolate(u, reduced_V)
> >> > >> f << reduced_u
> >> > >>
> >> > >> However, the problem is that the automatic partitioning constructs
> >> > >> the two meshes independently, so the local pieces on each node do
> >> > >> not overlap. I really don't want to allow extrapolation in this
> >> > >> case. I would be happy to contribute a fix; does anybody see how
> >> > >> to attack it?
> >> > >>
> >> > >
> >> > > We just need to have the program correctly handle the case in which a
> >> > > point is not found in the domain. It shouldn't be hard to fix (it may
> >> > > just be a case of removing an error message and stressing in the code
> >> > > comments that points must reside on the same process).
> >> > >
> >> > > Start by looking in the Function::interpolate functions.
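
A minimal sketch of the deferred handling Garth describes above, assuming a
hypothetical eval_local callback (not actual DOLFIN API) that returns the
value of u at a locally owned point, or None for an off-process point
instead of raising an error:

def evaluate_with_deferral(points, eval_local):
    # eval_local(p) is assumed to return the value of u at p if this
    # process owns p, or None if p lies outside the local partition.
    values = {}
    missing = []
    for i, p in enumerate(points):
        v = eval_local(p)
        if v is None:
            missing.append((i, p))  # defer; another process owns p
        else:
            values[i] = v
    return values, missing

The missing list is what would later be exchanged between processes.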
> >> >
> >> > Isn't the issue here that since the local meshes don't overlap, we
> >> > need to fetch the values from one or more neighboring processes? So,
> >> > during interpolation in parallel, all processes send a list of
> >> > coordinates for the non-local values they need, and then all
> >> > processes send and receive these values. Or is this too simple for
> >> > more advanced elements?
> >> >
> >>
> >> The first step of finding the cell which contains a point doesn't depend
> >> on the element. We don't have this in place for points which reside on
> >> other processes. The next step would be for the value of the function to
> >> be evaluated and communicated.
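
As an illustration of that element-independent first step, here is a
linear-scan point location for a 2D simplicial mesh using barycentric
coordinates. This is a sketch only; a real implementation would use a
spatial search structure rather than a scan over all cells:

import numpy as np

def cell_containing(point, vertices, cells, tol=1e-12):
    # vertices: (n, 2) array of coordinates
    # cells: (m, 3) array of vertex indices (triangles)
    for c, (i, j, k) in enumerate(cells):
        a, b, d = vertices[i], vertices[j], vertices[k]
        # Barycentric coordinates: point = a + s*(b - a) + t*(d - a)
        T = np.column_stack((b - a, d - a))
        try:
            s, t = np.linalg.solve(T, point - a)
        except np.linalg.LinAlgError:
            continue  # degenerate cell
        if s >= -tol and t >= -tol and s + t <= 1.0 + tol:
            return c
    return None  # point does not reside on this process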
> >
> > I suggest that when a point is found which is not inside the local
> > domain and MPI::num_processes > 1, the point is added to a list of
> > missing points. All the missing points are then distributed to all
> > processes, each process goes through the list to see which points it
> > can handle, and the results are communicated back. I think this is
> > the same as what Ola suggests.
>
> Yes, this is more or less what I have in mind. Should I add a
> structure in Data to hold the points?
In Data? That doesn't sound like an optimal solution. Data is supposed
to be a small data structure for local data. We probably need a new
data structure for that, or maybe just some std::vector<>s as part of
the algorithm that does the interpolation.
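
A rough sketch of the exchange described above, written with mpi4py for
brevity (the actual DOLFIN code would be C++, with std::vector as suggested).
The eval_local callback is the same hypothetical helper as before, returning
None for points outside the local partition:

from mpi4py import MPI

def exchange_missing(missing, eval_local, comm=MPI.COMM_WORLD):
    # missing: list of (index, point) pairs this process could not
    # evaluate locally. Returns {index: value} for the points that
    # some other process could resolve.

    # 1. Every process broadcasts its unresolved points to everyone.
    all_missing = comm.allgather(missing)

    # 2. Evaluate, on behalf of each other process, the points we own.
    replies = []
    for owner, requests in enumerate(all_missing):
        if owner == comm.rank:
            replies.append([])
            continue
        found = ((i, eval_local(p)) for i, p in requests)
        replies.append([(i, v) for i, v in found if v is not None])

    # 3. Route each reply back to the process that asked for it.
    answers = comm.alltoall(replies)

    resolved = {}
    for reply in answers:
        resolved.update(reply)
    return resolved

The allgather in step 1 makes the worst case explicit: if no point is local
to any process, every point is broadcast to every process, which is the
communication cost discussed below.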
--
Anders
> > The problem with this is that it can potentially lead to quite a lot
> > of communication. In the worst case, all points would be missing for
> > all processes and then all points would have to be passed around
> > between all processes. But I don't see any other way since it's not
> > possible for a process to know where a missing point may be located.
>
> Yes, this might be pretty bad. Also, if the two meshes don't overlap
> at all, every point in each mesh is outside the other local mesh,
> giving lots of values to send around. But let's not worry too much
> about this yet :)
>
> Ola