
yade-mpi team mailing list archive

Fwd: mpi4py on froggy


Hi, can anyone help with this question?
Deepak describes a problem compiling Yade on "Froggy" below. William has
probably already worked around this, since he uses Froggy routinely. What
do you think, William?
Note that we will also have access to Luke for the Hackathon. I don't
currently know whether the OS (I mean, the details of library versions) is
exactly the same on Froggy and Luke. I thus suggest trying Luke as soon as
your account is created on Perseus (as directed by Pierre-Antoine in other

Please note that this mailing list yade-mpi@xxxxxxxxxxxxxxxxxxx is public
(everyone can read the archive at least; not everyone can write to it). I
think this is good, since the public archive may help someone understand
the progression and take over in the future. However, if any of you are not
comfortable with it, we can switch to more confidential communication at
any point.
If the yade-mpi list is agreed on, I invite Jean and Thomas to join
yade-mpi on Launchpad (https://launchpad.net/~yade-mpi).



---------- Forwarded message ---------
From: Deepak Kunhappan <deepak.kunhappan@xxxxxxxxxxxxxxx>
Date: Tue, 30 Oct 2018 at 16:41
Subject: mpi4py on froggy
To: Bruno Chareyre <bruno.chareyre@xxxxxxxxxxxxxxx>


So on Froggy, the following GCC compilers are available in the default
module list (from /applis/site/env.bash). (Individual 'projects' have
their own lists of modules in addition to these!)

GCC compilers -> 4.4.6, [4.8.1], [4.8.2], [6.4.0]

MPI libs ->   [OpenMPI 1.6.4 with gcc-4.4.6], [OpenMPI 1.8.2 with
gcc-4.4.6]  (There are others with intel compilers)

Python -> [python 2.7.5, 2.7.6 --> has mpi4py with openmpi_1.6.4 and gcc

[* >>> import mpi4py; mpi4py.get_config() ] --> this is like 'which
mpicc' for the MPI that mpi4py is using.


Now, in the list of modules for my project
(flexible_fibers_in_turbulent_flow):

We have OpenMPI-1.7.3 with GCC 4.8, but no mpi4py. (I used this to
compile YADE and YALES2.)

I imported the existing mpi4py with the module OpenMPI-1.7.3_gcc-4.8 and
ran a simple Python script; I get the following error:

Traceback (most recent call last):
   File "mptest.py", line 1, in <module>
     from mpi4py import MPI
ImportError: libnuma.so.1: cannot open shared object file: No such file
or directory
(the same traceback is printed once per MPI rank)
Primary job  terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
mpiexec detected that one or more processes exited with non-zero status,
thus causing
the job to be terminated. The first process to do so was:

   Process name: [[44286,1],1]
   Exit code:    1
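The ImportError means the dynamic loader cannot find libnuma at run time, even if mpi4py itself was built against the right OpenMPI. A quick way to check is something like the sketch below (paths are illustrative, not Froggy-specific):

```shell
# 1. Ask the dynamic loader whether libnuma is visible anywhere;
#    ldconfig -p lists every library in the loader's cache.
ldconfig -p | grep libnuma || echo "libnuma not visible to the loader"

# 2. Or inspect the failing extension directly for unresolved deps:
#    ldd /path/to/mpi4py/MPI.so | grep "not found"

# 3. If libnuma lives under a non-standard prefix (e.g. a module tree),
#    point the loader at it before launching the job:
#    export LD_LIBRARY_PATH=/path/to/numactl/lib:$LD_LIBRARY_PATH
```

If step 1 finds nothing, the library is simply not installed system-wide and LD_LIBRARY_PATH (or asking the admins) is the way out.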

The output of mpi4py.get_config() points to the mpicc of
openmpi_v1.7.3_gcc_4.8 (it had detected this MPI version!), so the error
is probably caused by something else (maybe this OpenMPI was compiled with
specific flags for optimum performance with YALES2).


To get yade-mpi working, I think we need OpenMPI compiled with GCC-4.8
(and mpi4py built against the same?) in the default module list.
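An alternative that avoids waiting for a new default module would be to build mpi4py ourselves against whichever OpenMPI module we load. A hedged sketch (the module names are hypothetical; check `module avail` on Froggy for the real ones):

```shell
# Hypothetical module names -- substitute the ones Froggy actually provides.
module load gcc/4.8 openmpi/1.7.3

# mpi4py's build honours the MPICC environment variable, so it links
# against the mpicc now on PATH; --user installs without root rights.
env MPICC="$(which mpicc)" pip install --user mpi4py

# Sanity check: get_config() should report the mpicc we just used.
python -c "import mpi4py; print(mpi4py.get_config())"
```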

(I'm also attempting to recompile Yade and a few of the libraries with
gcc-4.4.)

Thanks and regards,