Garth N. Wells wrote the following on 11/07-2008:
Ola Skavhaug wrote:
Garth N. Wells wrote the following on 10/07-2008:
When I have MPI installed and configure DOLFIN with either PETSc or
Trilinos, I get a bunch of errors when plot() is called (see below).
A plot appears as expected, followed by the error messages. Looks
like a Viper issue (?).
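(For reference, a minimal script along the lines below is enough to trigger the plot() call; this is only a sketch, the mesh and setup are illustrative and not taken from the actual report.)

    # Minimal sketch of the kind of script that triggers the reported behaviour.
    # Illustrative only: the mesh/problem from the original report is not shown.
    from dolfin import *

    mesh = UnitSquare(16, 16)   # any small built-in mesh will do
    plot(mesh)                  # hands the data to Viper/VTK; the errors appear around here
    interactive()               # keeps the plot window open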
As far as I know, there shouldn't be any MPI calls in viper/VTK. I have
absolutely no idea what causes this error.
Did you compile VTK with MPI? Could you report the output of this command on
the shared VTK libraries:
objdump -R /usr/lib/libvtk*.so | grep MPI
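(If it helps, a small Python sketch like the one below runs the same check over all the VTK shared libraries; it assumes objdump is on the PATH and the libraries live under /usr/lib.)

    # Sketch: scan the VTK shared libraries for dynamic relocations that mention MPI.
    # Assumes objdump is available and the libraries are installed under /usr/lib.
    import glob
    import subprocess

    for lib in sorted(glob.glob("/usr/lib/libvtk*.so")):
        out = subprocess.check_output(["objdump", "-R", lib]).decode("utf-8", "replace")
        hits = [line for line in out.splitlines() if "MPI" in line]
        if hits:
            print(lib)
            for line in hits:
                print("    " + line)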
I didn't build VTK myself; I'm using the package. From objdump I get:
objdump -R /usr/lib/libvtk*.so | grep MPI
000fc960 R_386_JUMP_SLOT   _ZN24vtkDistributedDataFilter15MPIRedistributeEP10vtkDataSetS1_
Garth
OK, so that answers the VTK question. As you can see, there is nothing MPI-related
going on in Viper, and MPI_Attr_get does not live in VTK. I have a similar problem
at the moment when I try to combine DOLFIN with the PETSc bindings and the PyCC
bindings to a serial build of HYPRE. The problem is most likely caused by dummy MPI
calls in HYPRE being mixed with the real MPI calls in DOLFIN/PETSc.
I don't know what causes the problem you describe.
Can you give a more detailed description of your system build and of exactly
what fails?
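For instance, one rough check (just a sketch, assuming a Linux/glibc system) is to ask from inside the Python process, after importing the modules, whether an MPI_Initialized symbol resolves at all and what it reports. It will not tell you which library provided the symbol, but it shows whether MPI has actually been set up:

    # Sketch: after importing dolfin (and any PyCC/HYPRE bindings), see whether the
    # process can resolve MPI_Initialized and whether MPI has been initialised.
    # Assumes Linux/glibc, where CDLL(None) exposes the symbols already loaded.
    import ctypes

    proc = ctypes.CDLL(None)      # symbol table of the running process
    flag = ctypes.c_int(0)
    try:
        proc.MPI_Initialized(ctypes.byref(flag))
        print("MPI_Initialized flag: %d" % flag.value)
    except AttributeError:
        print("no MPI symbols visible in this process")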