yade-users team mailing list archive — Message #18216
[Question #675256]: running a .so file wrapped by Cython and parallelized with MPI in Yade
New question #675256 on Yade:
https://answers.launchpad.net/yade/+question/675256
Hello
I am trying to couple my 3-D LBM (Lattice Boltzmann Method) code, written in C++ and parallelized with MPI, with Yade. I used Cython to wrap the C++ code. At the moment I can import both the parallel and the serial (without the MPI header) versions into plain Python and run them. I can also import the serial version into Yade, but as soon as the parallel version is reached I am not able to run it in Yade.
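For reference, the way I drive the wrapped module from the Yade script (ex1.py) is roughly the following; the module name lbm_mpi and the functions initialize()/step()/finalize() are simplified placeholders, not the real interface:

    # sketch of the Yade/Python side driving the Cython-wrapped C++/MPI code
    import lbm_mpi            # the .so produced by Cython from the C++ code

    lbm_mpi.initialize()      # internally calls MPI_Init / MPI_Init_thread
    for i in range(100):
        lbm_mpi.step()        # one LBM time step, with halo exchange over MPI
    lbm_mpi.finalize()        # internally calls MPI_Finalize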
The error I am getting is:
/////////////////////////////////////////////////////////////////////////////////////////////
Welcome to Yade 2018.02b
TCP python prompt on localhost:9000, auth cookie `dakyus'
XMLRPC info provider on http://localhost:21000
Running script ex1.py
--------------------------------------------------------------------------
It looks like opal_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during opal_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
opal_error_register failed
--> Returned value -2 instead of OPAL_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):
ompi_mpi_init: opal_init_util failed
--> Returned "(null)" (-2) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[ata-B:6940] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
/////////////////////////////////////////////////////////////////////////////////////////////
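For what it is worth, one workaround I have seen mentioned for similar Open MPI initialization failures inside an embedding application is to load the MPI library with global symbols before importing the wrapped module, along these lines (untested on my side, and the library name/path is only an example):

    import ctypes
    # load libmpi with RTLD_GLOBAL so that Open MPI's plugin components
    # can resolve its symbols when the wrapped module initializes MPI
    ctypes.CDLL("libmpi.so", mode=ctypes.RTLD_GLOBAL)
    import lbm_mpi            # the Cython-wrapped parallel module

I am not sure whether this applies to Yade, though.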
Do you have any recommendations or suggestions?
Thanks in advance for your help.
--
You received this question notification because your team yade-users is
an answer contact for Yade.