yade-users team mailing list archive

Re: [Question #680609]: mpi4py MPI terminates Yade

 

Question #680609 on Yade changed:
https://answers.launchpad.net/yade/+question/680609

    Status: Open => Answered

Deepak proposed the following answer:
Hello,

What happens when you try this:

from mpi4py import MPI
?
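A minimal sketch of that check, run outside Yade in a plain Python session (assumes only that mpi4py is installed against the system Open MPI; the helper name `mpi4py_status` is hypothetical). Note that a hard abort inside MPI_Init, as in the log below, kills the process before Python can catch anything, so a clean exception here already rules out the worst case:

```python
# Sanity check for mpi4py independent of Yade (a sketch, not Yade's own code).
import importlib.util

def mpi4py_status():
    """Return a short diagnostic string without crashing the interpreter."""
    if importlib.util.find_spec("mpi4py") is None:
        return "mpi4py not installed"
    try:
        # Importing MPI triggers MPI_Init; this is where opal_init can fail.
        from mpi4py import MPI
        return "MPI initialized, rank %d of %d" % (
            MPI.COMM_WORLD.Get_rank(), MPI.COMM_WORLD.Get_size())
    except Exception as exc:
        return "mpi4py import failed: %r" % (exc,)

print(mpi4py_status())
```

If the import also fails outside Yade, the problem is in the mpi4py/Open MPI installation rather than in Yade itself.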

On Fri, May 3, 2019 at 5:03 PM Jiadun Liu <
question680609@xxxxxxxxxxxxxxxxxxxxx> wrote:

> Question #680609 on Yade changed:
> https://answers.launchpad.net/yade/+question/680609
>
> Description changed to:
> Hi All,
> I met the following problem.
>
> jiadun@jiadun-HP-ZBook-15:~$ yade
> Welcome to Yade 2019.01a
> TCP python prompt on localhost:9000, auth cookie `ksyasu'
> XMLRPC info provider on http://localhost:21000
> [[ ^L clears screen, ^U kills line. F12 controller, F11 3d view (use h-key
> for showing help), F10 both, F9 generator, F8 plot. ]]
>
> Yade [1]: from mpi4py import MPI
> --------------------------------------------------------------------------
> It looks like opal_init failed for some reason; your parallel process is
> likely to abort.  There are many reasons that a parallel process can
> fail during opal_init; some of which are due to configuration or
> environment problems.  This failure appears to be an internal failure;
> here's some additional information (which may only be relevant to an
> Open MPI developer):
>
>   opal_error_register failed
>   --> Returned value -2 instead of OPAL_SUCCESS
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> It looks like MPI_INIT failed for some reason; your parallel process is
> likely to abort.  There are many reasons that a parallel process can
> fail during MPI_INIT; some of which are due to configuration or environment
> problems.  This failure appears to be an internal failure; here's some
> additional information (which may only be relevant to an Open MPI
> developer):
>
>   ompi_mpi_init: opal_init_util failed
>   --> Returned "(null)" (-2) instead of "Success" (0)
> --------------------------------------------------------------------------
> *** An error occurred in MPI_Init_thread
> *** on a NULL communicator
> *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
> ***    and potentially your MPI job)
> [jiadun-HP-ZBook-15:3988] Local abort before MPI_INIT completed
> successfully; not able to aggregate error messages, and not able to
> guarantee that all other processes were killed!
> jiadun@jiadun-HP-ZBook-15:~$
>
> Best regards,
> Jiadun
>
> --
> You received this question notification because your team yade-users is
> an answer contact for Yade.
>
> _______________________________________________
> Mailing list: https://launchpad.net/~yade-users
> Post to     : yade-users@xxxxxxxxxxxxxxxxxxx
> Unsubscribe : https://launchpad.net/~yade-users
> More help   : https://help.launchpad.net/ListHelp
>
