On 12/7/06, Garth N. Wells <g.n.wells@xxxxxxxxxx> wrote:
Martin Sandve Alnæs wrote:
Yes, but that's a separate issue.
We're probably just talking past each other here.
After looking at your code again, I think the misunderstanding is this
(I don't know PETSc, so I might be wrong here):
You call VecCreateMPI(..., n_local, n_global, &vec), which presumably
distributes the global vector entries in contiguous chunks across
processes. If you didn't renumber the dofs, I now see your "problem".
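For concreteness, this is roughly the usage I have in mind (just an
untested sketch using the current PETSc signatures, which may differ
from the API you are on; n_local = 100 is only an example):

  // Untested sketch: each rank gets a contiguous block of global indices.
  #include <petscvec.h>

  int main(int argc, char **argv)
  {
    PetscInitialize(&argc, &argv, NULL, NULL);

    PetscInt n_local  = 100;           // example local size on this rank
    PetscInt n_global = PETSC_DECIDE;  // let PETSc sum the local sizes

    Vec vec;
    VecCreateMPI(PETSC_COMM_WORLD, n_local, n_global, &vec);

    // Ownership is a contiguous range [low, high) of global entries.
    PetscInt low, high;
    VecGetOwnershipRange(vec, &low, &high);
    PetscPrintf(PETSC_COMM_SELF, "this rank owns global entries [%d, %d)\n",
                (int)low, (int)high);

    VecDestroy(&vec);
    PetscFinalize();
    return 0;
  }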
I've used an Epetra_Map(..., array_of_local_entries, n_local), which
distributes the global vector entries with a completely general
local-to-global mapping. Epetra then stores the local entries in a
contiguous array, so there is an implicit renumbering here. But it is
not a renumbering of the global dofs; it is a separate mapping between
local and global vector indices. With this approach, there is no
connection between the numbering of the global dofs and the
communication volume.
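The Epetra side looks roughly like this (again just an untested
sketch; the interleaved global IDs are only an example to show that
the owned indices need not be contiguous):

  // Untested sketch: each rank owns an arbitrary, non-contiguous set of
  // global indices given explicitly to the map.
  #include <mpi.h>
  #include <Epetra_MpiComm.h>
  #include <Epetra_Map.h>
  #include <Epetra_Vector.h>

  int main(int argc, char **argv)
  {
    MPI_Init(&argc, &argv);
    {
      Epetra_MpiComm comm(MPI_COMM_WORLD);

      const int rank   = comm.MyPID();
      const int nprocs = comm.NumProc();

      // Example: 4 local entries per rank, with interleaved
      // (non-contiguous) global indices, e.g. from a mesh partitioner.
      const int n_local = 4;
      int my_global_ids[n_local];
      for (int i = 0; i < n_local; ++i)
        my_global_ids[i] = rank + i*nprocs;

      // -1 lets Epetra compute the global size; index base is 0.
      Epetra_Map map(-1, n_local, my_global_ids, 0, comm);

      // The vector stores its local entries contiguously; LID()/GID()
      // give the implicit local<->global renumbering.
      Epetra_Vector vec(map);
      vec[map.LID(rank)] = 1.0;  // entry with global index 'rank' is local
    }
    MPI_Finalize();
    return 0;
  }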
martin