
yade-mpi team mailing list archive

Re: mpi interactivity

 

Hi,

I dug into this problem this morning, and it seems that the second option is
IPython Parallel, as it avoids hard-drive writes/reads. IPython Parallel
allows communication with workers (aka engines) from a basic single-threaded
IPython session (aka communicator). The communicator can ask the workers to
execute a Python script or a function, or even run commands directly within
the engines. The engines can load mpi4py and fully use MPI. The idea would be
to ask the engines to load yade and then execute the user script (as done in
the current yade-mpi).
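
For illustration, here is a minimal, untested sketch of what this could look
like. It assumes a cluster started with an MPI-enabled engine profile (e.g.
"ipcluster start -n 4 --engines=MPI"), and "userscript.py" stands in for the
user's script:

import ipyparallel as ipp

rc = ipp.Client()  # connect to the running cluster (the "communicator")
view = rc[:]       # a DirectView on all engines (the "workers")

# Each engine can load mpi4py and take part in collective MPI calls.
view.execute("from mpi4py import MPI; rank = MPI.COMM_WORLD.Get_rank()")
print(view["rank"])  # e.g. [0, 1, 2, 3]

# Ask every engine to load yade, then run the user script on them.
view.execute("import yade")
view.run("userscript.py")  # executes the file on each engine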

François

On Thu, 13 Jun 2019 at 11:14, Bruno Chareyre <bruno.chareyre@xxxxxxxxxxxxxxx>
wrote:

> Hi,
> After speaking with François and turning the problem over in various ways,
> I feel that the idea of full interactivity combined with MPI execution is
> chimeric.
>
> The only realistic solution I have come up with would use the hard drive to
> communicate the scene to an MPI system call. If called outside an MPI
> context (i.e. interactively), O.mpirun() would do something like this:
>
> O.save("tmp.yade")
> import subprocess
> subprocess.run(["mpiexec", "-n", "N", "yade", "--mpi", "-niter=1000", "userscript.py"])  # N = number of processes
> O.load("tmp.yade")
>
> The "--mpi" option would trigger:
> O.load("tmp.yade") #if master
> O.mpirun() #the real, current, mpi behavior
> O.save("tmp.yade") #by master, after a mergeScene()
>
> It probably requires that the userscript encapsulate its function
> definitions; that's a small constraint:
>
> def main():
>     def function():
>         ...
>
> if __name__ == "__main__":
>     main()
>
>
> Let me know if you see other alternatives.
>
> Bruno
>
>
>
>
> --
> _______________
> Bruno Chareyre
> Associate Professor
> ENSE³ - Grenoble INP
> Lab. 3SR
> BP 53
> 38041 Grenoble cedex 9
> Tel: +33 4 56 52 86 21
> ________________
>
> Email too brief?
> Here's why: email charter
> <https://marcuselliott.co.uk/wp-content/uploads/2017/04/emailCharter.jpg>
>
